By Ray Walsh
When Ed Bridges was scanned by South Wales police using facial recognition tech, he was shocked at how invasive this new technology could be. Since then, with the help of human rights campaign group Liberty, Bridges has managed to turn UK police use of Automated Facial Recognition (AFR) on its head.
On Monday, after three years of campaigning, Bridges's efforts resulted in a win for privacy. The court of appeal decided that an initial finding last September by Cardiff high court was wrong. South Wales police – and other UK forces – have therefore been deploying AFR illegally.
According to the ruling, the police failed in their public sector equality duty by using a technology that is known to exhibit both racial and gender bias. In addition, the police failed to carry out a data protection impact assessment that fully complied with the Data Protection Act 2018, with the result that people's data privacy rights were flouted.
There is no doubt that the use of AFR technology under these circumstances is genuinely disturbing, especially once you understand what it means for innocent members of the public.
When AFR is deployed in public areas, the police scan every single person. Their justification for this is that they only ever look for a specific watchlist of faces. Everybody else who is scanned, the police claim, is of no interest and is therefore not being put at risk. But the reality is quite different, as the court of appeal has now ruled.
When AFR scans people's faces, it must create a biometric representation of each individual's face. It is this digital map of the face – made up of hundreds of unique measurements – that is then compared against the police's watchlist. This biometric information is highly sensitive, because it can be used to identify an individual time and time again, for the rest of their life.
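In engineering terms, that matching step is just a numerical comparison between vectors. The Python sketch below is purely illustrative, not any force's actual system: the four-value templates, the watchlist entries and the 0.9 similarity threshold are all hypothetical stand-ins for the hundreds of measurements and tuned thresholds a real AFR system would use.

```python
import math

# Hypothetical example: each face "template" is a fixed-length vector of
# measurements derived from an image. Real systems use hundreds of values;
# four are shown here purely for illustration.
watchlist = {
    "suspect_001": [0.12, 0.85, 0.33, 0.47],
    "suspect_002": [0.91, 0.10, 0.64, 0.22],
}

def cosine_similarity(a, b):
    """Compare two templates; values near 1.0 suggest the same face."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(template, threshold=0.9):
    """Return the best watchlist match above the threshold, or None.

    Every scanned face is compared against every watchlist entry --
    which is why a template must be built for every passer-by first.
    """
    best_id, best_score = None, threshold
    for person_id, stored in watchlist.items():
        score = cosine_similarity(template, stored)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# A passer-by's face is scanned and converted into a template...
passerby = [0.13, 0.84, 0.35, 0.45]
# ...and matched, whether or not they are of any interest to the police.
print(check_against_watchlist(passerby))  # prints "suspect_001"
```

The key point sits in the comments: the comparison cannot happen until a biometric template has been created for every face in the crowd, watchlisted or not.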
The dangers posed by collecting this data are very serious, and they depend on how it is processed, transported, shared and stored. Without a proper impact assessment, it is impossible to know whether the data has been handled securely. As a result, the police have been illegally scanning people's faces without their consent and accumulating that data God knows where, for God knows what secondary purposes.
Since South Wales police began using AFR in 2015, they have admitted to scanning and collecting the biometric information of 500,000 faces. The vast majority of those people were completely innocent bystanders. AFR technology has also been rolled out by police in Leicestershire and London in public places such as streets and shopping centres, outside football stadiums, and at large events such as the Notting Hill Carnival.
Mission creep has been escalating, and the rate at which invasive surveillance technologies have been worming their way into British society is alarming.
Government ministers and authorities were warned that this surveillance technology was being rolled out prematurely, if not illegally. In a blog post on the subject, the surveillance camera commissioner, Tony Porter, noted that in his annual reports to the home secretary he has been "crying out" for the Surveillance Camera Code of Practice to be updated. Porter also appealed directly to the Home Office in his attempts to update the legal framework that governs the use of AFR. Despite these official recommendations, he was all but ignored.
In the Home Office's 2018 biometrics strategy, there was a commitment to update the code. Unfortunately, that commitment never came to fruition, presumably because the government's desire to keep rolling out AFR with impunity was greater.
That process has now been halted. The ruling affects the use of AFR right across the UK, making it illegal for any police force to use the technology until a proper legal framework has been brought before parliament.
Porter argues that there needs to be much greater scrutiny and specificity when creating police watchlists of subjects to be scanned. The locations chosen for AFR to be deployed must be justifiable, with reasonable evidence that watchlist subjects will actually be in those locations at those times.
While these kinds of alterations to the Code will undoubtedly improve the situation, they also leave me with a bad feeling in the pit of my stomach. South Wales police have already indicated that the ruling is something they "can work with" and they do not intend to appeal against it. This is concerning, because it indicates that they believe they will be back to using the technology, legally, in short order.
You may wonder how that is possible, and the answer can be found by looking at our fairly recent history. When Edward Snowden revealed that governments had been working together to perform mass surveillance and bulk data retention, states reacted by creating new surveillance laws to allow those previously secretive and illegal activities to continue.
The danger now is that the government will fast-track new policies that use national security as a justification for trampling over people's rights. This outcome must be opposed at all costs. The UK government must be forcefully encouraged to carry out a comprehensive review of the legislative framework that governs the use of invasive surveillance technologies, and the proposals it produces must be properly consulted on. Innocent citizens' face data should be designated as highly sensitive biometric information that must be deleted as soon as it is found to be of no value to an ongoing investigation.
This rare win in favour of privacy must lead to actions that strengthen, rather than weaken, the legal framework.
Ray Walsh is a digital privacy expert at ProPrivacy. You can follow the group on Twitter.
The opinions in Politics.co.uk's Comment and Analysis section are those of the author and are no reflection of the views of the website or its owners.