Algorithms are created to maximise certain interests – and they’re not ours, says Chris Jones from Statewatch

December 3, 2020

As part of an international campaign to lift the lid on data privacy violations, The Privacy Collective is asking some of Europe’s leading experts why online privacy matters. 

Chris Jones is the Executive Director of Statewatch, a non-profit organisation that monitors state and civil liberties in Europe. Here, he discusses how personal information is being exploited for profit by big technology companies, why state surveillance operates in a similar way to online advertising systems, and steps people can take to reclaim their own data.


Why does online privacy matter?  

When we’re browsing the internet on our phones or computers, we’re not doing it unseen. Many of the apps and websites we visit online are full of tracking technologies that collect data about us. That’s then being used to build profiles to try and direct our interests. These models are used by very powerful companies that try to manipulate behaviour for their own purposes.

Can you tell me a bit about the work you’re doing at Statewatch, particularly around your specialisms of privacy, data protection and security technologies? 

We focus a lot on information gathering undertaken by the police or border control agencies. This year, we’ve done a lot of work on various systems for the surveillance and profiling of travellers entering the EU for holidays or for business. There are new technologies that will be used to perform risk analysis on travellers based on various factors such as nationality, age, gender and profession, in combination with other information, to determine whether they constitute an illegal immigration risk, or some sort of security risk. 

It’s a different world of data gathering in some respects to online advertising systems but it works in the same way, in that you take various data points about an individual, use that to build a profile on them, and then determine something about their personality, their interests or what they represent. 
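The pattern described above – take scattered data points, merge them into a profile, then derive a judgement from it – can be sketched in a few lines. This is a purely hypothetical illustration of the mechanism; the fields, rules and weights below are invented and do not reflect any real system’s logic.

```python
# Hypothetical sketch of profile-based risk scoring: merge data points
# from several sources into one profile, then score it against rules.
# All field names, rules and weights here are invented for illustration.

def build_profile(data_points):
    """Merge data points from multiple sources into one profile dict."""
    profile = {}
    for source, fields in data_points.items():
        profile.update(fields)
    return profile

def risk_score(profile, rules):
    """Sum the weight of every rule the profile matches."""
    return sum(weight for (field, value), weight in rules.items()
               if profile.get(field) == value)

# Invented example data from two hypothetical sources
data = {"booking": {"nationality": "X", "one_way_ticket": True},
        "visa_form": {"profession": "student"}}
rules = {("one_way_ticket", True): 2, ("profession", "student"): 1}

profile = build_profile(data)
print(risk_score(profile, rules))  # 3 under these invented rules
```

The point of the sketch is how mundane the mechanism is: whoever writes the rules and weights decides what a person "represents", whether the application is advertising or border control.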

How aware do you think the British public are about how their data is being used – at a commercial level? 

I think people are aware of it. If you’ve just searched for something – holidays, or shoes, for example – and an advert then pops up on your phone, that kind of thing is quite clear. But I’m not sure people are aware of the extent to which there is a market for the information about what they do on the internet and the number of companies that are involved. If you go onto any major website and actually look at the list of companies that have placed cookies on that site, it’s enormous. 
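The cookie list mentioned above can be made concrete with a toy check: given the cookie domains a page sets, count how many belong to third parties rather than the site itself. The domains below are invented samples, not a real crawl.

```python
# Toy illustration: count distinct third-party domains among the cookie
# domains a page sets. The sample cookie domains are invented.
from urllib.parse import urlparse

def third_party_domains(page_url, cookie_domains):
    """Return cookie domains that don't belong to the page's own site."""
    site = urlparse(page_url).hostname
    return {d.lstrip(".") for d in cookie_domains
            if not site.endswith(d.lstrip("."))}

# Hypothetical cookie domains from a news site's consent list
cookies = [".example-news.com", ".adtech-one.example", ".tracker-two.example",
           ".metrics-three.example", ".adtech-one.example"]

trackers = third_party_domains("https://www.example-news.com/article", cookies)
print(len(trackers))  # 3 distinct third-party domains in this toy sample
```

On a real major website, running the same check over its consent list would typically turn up dozens or hundreds of entries rather than three.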

The amount of data collected about individuals is quite impressive. Arguably the people who work in the industry would say, ‘well if we can track more people’s interactions, we can understand you even better and optimise your user experience’. That might sound great, except what that really means is they have an even more intimate idea of who you are, what your interests are, and can use that to try to manipulate you into buying more stuff you don’t want or need.


You shared an interesting piece on Twitter a couple of weeks ago – would you mind if we talked a bit about what “surveillance capitalism” is, and what could be considered “algorithms of oppression”?

Essentially, capitalism as an economic system relies on finding new raw material to extract and turn into commodities to sell for further investment. What Shoshana Zuboff argues in her book, The Age of Surveillance Capitalism, is that we have reached a stage where modern technology makes it possible to do that with people’s personalities. It’s possible to mine human behaviours, interests, ideas and emotions in order to sell things – both to turn them into commodities and to sell commodities back to them. This is being used to transform people’s behaviour, turning them into the corporate citizens that big technology companies like Google or Amazon want them to be. Essentially, it’s about the exploitation of personal data for profit.

Algorithms of oppression are those used to inhibit people’s freedom in certain ways, such as inhibiting your ability to find certain information. Although you get the impression of limitless choice – Google presents you with 2 billion search results, for example – who ever looks at the second page? It’s been said that an algorithm is an opinion written in code, but it’s more often presented to us as something scientific. Algorithms are designed precisely to maximise certain interests – and those interests belong to the companies producing the goods being sold to you. There are other ways to organise information and data, ways with more beneficial ideas at their heart than just making someone else rich. 

You wrote a piece at the beginning of the year about the EU’s proposed facial recognition databases – what are the implications if this goes ahead? Should the public be concerned about this sort of surveillance?

Absolutely. In the UK the police have already started using live facial recognition technology, but in a more limited sense than what the European Union is considering. The EU intends to make it mandatory for every police force to have a facial recognition database, and to connect them all up so they can be mutually searched. They might also be connected to driving licence databases, passport databases and so on. There’s a question of political philosophy here, because essentially, if you’re walking down the street and being observed by a camera that can recognise who you are by scanning the driving licence database, you’re permanently a suspect. That’s not what a liberal democracy is supposed to be about. 

The argument the police put forward is that this will be used in a very limited way – just for those they arrest, or to detect people from crime scene footage. That may be true at first, but there’s not a single state database in history that hasn’t started in one way and expanded to take on other purposes as it grows. If we’re supposed to be living in liberal democracies, we shouldn’t be discussing implementing these kinds of measures. But this is now on the agenda of the European Union to be proposed as legislation in the next year or so. 

Have you been concerned about some of the surveillance measures rolled out during the pandemic? Is data and technology being used to get us out of lockdown? 

Data should be used to try and get us out of lockdown. The question is whether those in charge are capable of using the information that’s available to them. Obviously we live in a society that’s obsessed with technology, so it’s no surprise that when the pandemic showed up, people thought that technology would be the answer. The UK track and trace app story is obviously a total debacle – the government proposed one, then scrapped it. Now there’s another, which may or may not work. But for certain parts of the surveillance industry, the pandemic has been a big boost. Companies that sell contactless biometrics, like facial recognition and contactless fingerprint scanning, are all trying to position themselves as pandemic-friendly, because they don’t require physical contact. Like any crisis, it’s a great opportunity to introduce unprecedented measures. 

There are other examples of automated analytics and facial recognition being used to determine whether people are social distancing – in France, for example. Russia has employed facial recognition to see whether people who are supposed to be isolating are out in the streets. I would say the British government probably hasn’t gone as far as some others in that respect. But the idea that we can solve the pandemic with a smartphone app is absurd, because not everyone has a smartphone. It assumes that the normal person is wealthy and tech-savvy enough to have a relatively new, up-to-date smartphone, and to have it on all the time.

What can people do to educate themselves and protect their online data today? 

There are a lot of browser add-ons you can get for your computer or phone that will help protect your privacy. They block tracking cookies and adverts, and ensure that the connections between your computer and the websites you visit are encrypted by default. That just takes a few minutes. Then there are lots of organisations working on this – obviously The Privacy Collective, plus Privacy International, Statewatch, Amnesty, Open Rights Group, Big Brother Watch and European Digital Rights. Follow the work of those groups, support them, and get interested in what they do. 
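At their core, the tracker-blocking add-ons mentioned above perform a simple check on every outgoing request: is this hostname, or any parent domain of it, on a blocklist of known trackers? A minimal sketch of that check, with invented blocklist entries (real blockers use large curated lists such as EasyPrivacy):

```python
# Minimal sketch of a tracker-blocker's core check: block a request if
# its hostname, or any parent domain, appears on a blocklist.
# The blocklist entries below are invented for illustration.

BLOCKLIST = {"tracker.example", "ads.example"}

def is_blocked(hostname):
    """Match the hostname itself and every parent-domain suffix."""
    parts = hostname.split(".")
    # e.g. "cdn.tracker.example" -> check "cdn.tracker.example",
    # "tracker.example", "example"
    return any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

print(is_blocked("cdn.tracker.example"))  # True - subdomain of a listed tracker
print(is_blocked("news.example.org"))     # False - not on the list
```

Real extensions hook this kind of check into the browser’s request pipeline and cancel matching requests before they leave your machine, which is why installing one takes minutes but cuts off most third-party tracking.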

And then with the rights granted under the GDPR, if you really want to know what these companies know about you, you can write to a big data broker and make a subject access request. The company is legally obliged to reply. That’s probably going to be more of an education than anything else. 


Your data should not be for sale. We’re taking Oracle and Salesforce to court for the misuse of millions of people’s data, and we need your help! If you believe that tech giants should be held accountable for their use of people’s data, please support our claim here. Because your privacy matters. 
