
The UK’s online safety bill will not keep us safe

The writer is founder of Sifted, an FT-backed site about European start-ups

British politicians seemingly love nothing more than to swap confidential information, gossip, lobby and conspire on WhatsApp, unless, of course, their messages leak. Then, those selfsame politicians, such as Matt Hancock — the former health secretary who inadvisedly shared thousands of WhatsApp messages with a shouty journalist — profess outrage that their secrets have been exposed and their privacy betrayed. Like 2bn other users who rely on WhatsApp, they want a fast, free and safe messaging service. Not unreasonably, they expect both privacy and security.

It is therefore ironic that Britain’s politicians are close to passing legislation that will either significantly weaken the security of those messaging apps or, more likely, prompt the companies that run them to switch them off in the UK altogether, making the country a strange kind of cyber pariah. The heavily rejigged, redrafted and renamed Online Safety Bill, reflecting the shifting ideological preferences of three prime ministers and four years of parliamentary wrangling, is now being debated in the House of Lords and is likely to be enacted this summer. But it remains a flawed piece of legislation that demands outcomes that cannot realistically be achieved. Even at this late stage, the bill should be withdrawn and reconsidered.

The intent behind the legislation is understandable. Appalled by recent cases of online abuse, the government wants to prevent paedophiles, terrorists and criminals from sheltering behind the end-to-end encryption offered by messaging services such as WhatsApp, Signal and Element. The government wants such apps to filter out and flag illegal content and will fine them up to 10 per cent of annual global turnover for failure to comply. Earlier this month, the Virtual Global Taskforce, an international alliance of 15 law enforcement agencies including Britain’s National Crime Agency, highlighted how any extension of end-to-end encryption services could have a “devastating impact” on the ability to identify, pursue and prosecute offenders.

The messaging apps vigorously defend end-to-end encryption as the best way of protecting billions of users from online fraud, data theft and malicious hacking by hostile governments and organised crime. An open letter, posted by seven private communication services earlier this month, claimed that the UK bill would open the door to “routine, general and indiscriminate surveillance” that would undermine all users’ ability to communicate securely. “There cannot be a ‘British internet,’ or a version of end-to-end encryption that is specific to the UK,” it read.

Meredith Whittaker, president of the foundation that oversees the Signal app, accuses the British government of “cynically marshalling” the emotive issue of child abuse to push through sweeping powers of surveillance. Demanding that messaging services filter all traffic would require “an extraordinarily expensive, unworkable system that exists right now in the realm of fantasy”. “Signal would shut down entirely before we undermined the privacy promises we made to our people,” she tells me.

The deeply entrenched arguments on both sides of the debate have prompted some security experts to seek a compromise. One such is Andersen Cheng, chief executive of the cyber security company Post-Quantum, who shut down his own early-stage PQ Chat messaging app in 2015 after discovering that the Isis terrorist group was recommending its use.

Cheng now argues for encryption key splitting, which would preserve the benefits of end-to-end encryption while allowing limited access by law enforcement agencies in specific circumstances. This is similar to the police having to obtain a search warrant from a judge before raiding a suspect’s house. Multiple “fragment guardians,” including tech companies, law enforcement agencies, courts and civil rights groups, would hold parts of a golden cryptographic key that would only work when approved by all. Even the existence of such a system would be a deterrent, Cheng tells me. “Illegal activity will disappear overnight.” 
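Cheng describes the scheme only in outline, but the underlying idea, splitting a key so that every fragment holder must co-operate before it can be reassembled, is simple enough to sketch. The short Python example below is illustrative only: it uses a basic XOR split rather than whatever construction Post-Quantum actually favours, and the guardian roles named in it are assumptions, not part of any proposal. It shows why a single fragment, on its own, reveals nothing.

    import secrets
    from functools import reduce

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        # Combine two equal-length byte strings with XOR.
        return bytes(x ^ y for x, y in zip(a, b))

    def split_key(key: bytes, guardians: int) -> list[bytes]:
        # Split `key` into `guardians` fragments; ALL of them are needed
        # to rebuild it. Any smaller subset looks like random noise.
        fragments = [secrets.token_bytes(len(key)) for _ in range(guardians - 1)]
        final = reduce(xor_bytes, fragments, key)  # key XORed with every random fragment
        return fragments + [final]

    def reconstruct_key(fragments: list[bytes]) -> bytes:
        # XOR every fragment together to recover the original key.
        return reduce(xor_bytes, fragments)

    if __name__ == "__main__":
        session_key = secrets.token_bytes(32)  # e.g. a 256-bit message key
        # Hypothetical guardians: tech company, court, police, civil rights group.
        shares = split_key(session_key, guardians=4)
        assert reconstruct_key(shares) == session_key  # works only with all four shares

A production system would more plausibly use a threshold scheme such as Shamir’s secret sharing, so that a defined quorum rather than literally every guardian can authorise reconstruction, but the principle is the same: the complete key never exists in one place until all the required parties sign off.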

Ingenious though it is, this solution currently satisfies neither side. Law enforcement agencies want broader powers to force tech companies to filter illegal content pre-emptively. Tech companies argue that any backdoor endangers the security of all. Besides, who could users trust to be fragment guardians in authoritarian states?

That said, it is surely better to explore such a compromise than pass a law that will not work.
