On Wednesday, the government released an “independent” report on deregulation, crafted by senior Conservative MPs in consultation largely with business lobby groups. Worryingly, George Freeman, the former chair of Theresa May’s Policy Board, is one of the authors, alongside Iain Duncan Smith and Theresa Villiers.
The report appears to signal the kinds of changes the government would like to see in a post-EU world, particularly around personal data, including your health data, now that the UK can write different rules. And it is a scary world, if you worry about who gets to make decisions about you, and how your life chances are determined.
The report pushes the idea that current data privacy rules are “burdensome” and inconvenient for business. It suggests that GDPR amounts to little more than “cookie banners”: a “tick box” exercise in fake consent. The solution, then, is to remove the need for consent. Hey presto, no cookie banners! But also: goodbye to any chance of reining in the appalling adtech industry, which is busy profiling you and amassing data about you.
The report proposes that data, once collected, should be available for new, innovative purposes. Take the cookie banner example: under the report’s proposal, the list of websites you visit, collected for advertising purposes (takeaways, alcoholic drinks, reminding yourself what syphilis entails), could then be used to calculate your possible health problems or insurance premiums. This is not fantasy: it is what data practice in the USA already looks like.
This is all banned under current GDPR rules: data can only be used for the purposes stated when it is collected, and cannot legally be traded and repurposed by default. But the government wants economic growth, so it is looking for ways to ensure that data can be used more widely. Thus the “independent” report thinks that the rules restricting the purposes for which data can be used should be scrapped.
The report has the Artificial Intelligence industry firmly in mind, arguing that AI is hindered by restrictions on automated decision-making. Under current law, if your exam result is calculated by a computer (just for example), GDPR gives you the right to appeal against the decision, because it is automated and significantly affects you.
Where this leads is to opportunities to discriminate with little risk of punishment: to surreptitiously exclude women, the elderly or racial minorities from job interviews; or to refuse driving insurance to people with the wrong kind of English (yes, that has been tried). If restrictions on automated decisions are scrapped, as the report recommends, these become viable practices, out of the sight of those affected and beyond the remit of the regulator.
Precisely the same concerns apply to government, which, after all, endorsed exam results by algorithm, and is now seeking to trawl its own databases to profile people as potential violent criminals.
The real problem with data protection in the UK is that we have a regulator that has repeatedly failed to clamp down on the most serious abuses, especially when these involve important UK businesses, such as unlawful adtech, or the government itself. Examples include the exam algorithm debacle and a whole raft of other unlawful actions by the government during the pandemic, including the failure to carry out Test and Trace data audits, which were dismissed as mere “bureaucracy”.
With little risk of punishment, except for the most obvious failures such as data breaches or unsolicited communications, and not much chance of being called out in public, organisations have little incentive to do the right thing. Our rights are indeed being undermined by an ineffective ICO, but that is not a reason to abolish them.
The opposition and backbench MPs desperately need to get ahead of this. The dystopian vision promoted by Freeman, Duncan Smith and Villiers foreshadows proposals that will be solidified in the coming months in the National Data Strategy. The government has already signalled that it wants to appoint an Information Commissioner who believes in this kind of approach. And in each of its new proposals for using personal data, it has failed to outline privacy safeguards.
The way we allow personal data to be used in law is directly related to discriminatory practices and poor outcomes for people whose data marks them out as risky. Sometimes this is reasonable, but often it is not. GDPR represents a great deal of careful thought about how privacy and data usage can be handled safely and transparently. Ripping all that up in the name of business profit is likely to result in a world where we do not count as people, but merely as good or bad prospects for business.