What We Think: Investment, the Growth of the Digital Sector, and our Rights

As part of our work on the USAID-funded Scoping Digital Markets Investments Towards Bolstering Individual Rights project, we have been undertaking a literature review. Its purpose is to draw on existing literature in the field of digital rights and to identify tensions between investment, growth, and human rights. Across the 27 documents we reviewed, three key technologies appeared frequently and raised concern:

  1. Social media networks
  2. Algorithmic systems
  3. Biometric-based technologies

So what does this mean for our rights? Our research shows that the right to privacy and the right to freedom of expression are the most commonly cited violated rights across the technologies identified in the review. It is important to note, however, that all rights are intertwined and interconnected: when one right is heavily impacted, others are inevitably affected as well.

Social media: As the use of social media grows, so do the threats to human rights. Everyone should have the right to freedom of expression, but this is not always the case online, where censorship is rife, surveillance is commonplace, and information is blocked from reaching the end user.

Algorithmic systems: While algorithmic systems have the potential to foster innovation and economic development, they also pose a threat to human rights in areas such as privacy and data protection, freedom of thought, freedom of expression, the right to equal treatment, and economic and social rights.

Biometrics: Evidence shows that states and business enterprises are developing systems that rely heavily on the collection and use of biometric data, such as DNA, facial geometry, voice, retina or iris patterns, and fingerprints, much of which is stored in large databases. These databases are a cause for huge concern: there is a risk of the data being stolen or used for purposes for which it was not originally intended, jeopardising the fundamental right to privacy.

So what is next? As investment in technology is bound to grow, how do we ensure that investors respect our human rights?


Q&A Session with Emrys Schoemaker from Caribou Digital

Emrys Schoemaker, Caribou Digital’s Research Director, is leading the implementation, alongside Expectation State, of the USAID-funded INVEST Scoping Digital Markets Investments Towards Bolstering Individual Rights project. The project focuses on identifying the key tensions between investment in digital technology and protecting human rights, and on recommending interventions that can support investment decisions. We wanted to get Emrys’ opinions on the outcomes of our work so far on the impact that digital technology has had on human rights.

  • What do you see as the biggest tension with regard to investment, growth in digital technology, and human rights?

The biggest takeaway is that there is no single biggest tension. Technology is global, but its impact is local: Facebook, for example, is the same for everyone, but the impact it has is shaped at the local level. We therefore need to consider how we approach this:

  1. We develop a specific understanding of technology–rights tensions, grounded in an understanding of technology situated in specific contexts and cultures
  2. We develop a dynamic approach that is able to respond to the local context

  • Are there any recent innovations in technology that are particularly concerning to you?

There are three which are of main concern:

  1. Surveillance technologies – facial recognition technologies, for example, make it very difficult to maintain privacy, which is a prerequisite for healthy human rights. We believe technologies always amplify the power of those who control them, so these technologies are often exploited by the powerful, amplifying their control over individuals.
  2. Rise of the gig platform – These platforms, which intermediate between employers and employees, or service providers and service users, are eroding the rights and protections to which individuals are entitled, such as sick and holiday pay. They are also acting as a layer between individuals and institutions, eroding the basic social contract that forms the basis of society.
  3. Artificial intelligence – These ‘black box’ technologies often allow decisions to be made on criteria that we do not understand. The data collected can also be used for purposes for which it was not originally intended, jeopardising the fundamental right to privacy.

The golden thread running through these technologies is the tension between the value they offer and their erosion of individual agency – our ability to make decisions, which is the basis of individual freedoms.

  • Are there any key lessons from research and interviews that could inform how investors should approach digital technology?

Investors should recognise the importance of government legal frameworks that set requirements for things such as due diligence. Frameworks such as ESG and the SDGs point towards compliance, while others, such as EU financial compliance and disclosure rules, help in understanding the impact of investors.

  • How do you see our work fitting within the issue?

The work we are doing is part of shaping the ecosystem. It is important to understand that the state is a very big player, and that our work can help direct capital towards respecting human welfare. Tech companies need to think about human rights in the context of their business models, identify the key areas and red flags that raise human rights risks, and identify actions to address them.

If you are interested in hearing more about this project and our work, please contact Abigail Henderson at abigail.henderson@expectationstate.com