Last June, when the US Supreme Court overturned Roe vs Wade, women started deleting period-tracking apps from their phones. With abortion set to become illegal in certain states, they feared the data might be used to prosecute those seeking terminations. Was this an overreaction, or simply the realisation that we have entered the age of Orwellian electronics?
Many of us have a vague, creeping feeling that our devices might work against us. In 2016 a man in Ohio was accused of burning down his home after data from his heart pacemaker cast doubt on his insurance claim that the fire was accidental. Two years later, US military personnel were found to be inadvertently divulging secret army base locations via their Strava fitness apps.
Such stories, however, have not deterred us from giving our data away. We blithely allow apps to access our location, call records and other information because we are impatient to get to whatever fleeting new function we desire. But we need to wise up, because a new challenge is coming: how to protect our brain data.
Investment is pouring into “neurotechnology”, which can record and analyse electrical impulses from the nervous system. Brain-computer interfaces offer tremendous benefits, such as aiding the recovery of stroke patients and reducing episodes of epilepsy. EEG headsets, which track brainwaves, are used by gamers to control on-screen characters. They can also detect when a truck driver is losing focus. More than 5,000 companies in mining, construction and other industries from Australia to South Africa are using this technology to make sure their employees are alert and awake.
Here’s where I get squeamish. In making drivers wear such headsets, companies may be saving lives. But the practice also looks like a potential tool of oppression, one that could easily be extended to any other employee whose boss would like to know when their mind is wandering.
The neuroscientist Nita Farahany, professor of law and philosophy at Duke University, agrees. In a new book, The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology, she predicts a world in which AI and neuroscience combine to invade our mental privacy. There are fears that the Chinese Communist party could use AI to analyse facial expressions and brain signals to judge the loyalty of party members. The emerging ability to track and decode what goes on in the human brain demands a serious conversation about how such power should be used.
Farahany believes that neurotech is set to become a “universal controller” of all our interactions with technology. A New York start-up called CTRL-Labs, since bought by Meta, has developed a neural wristband that lets the wearer operate a computer, at first through tiny finger movements and eventually by detecting the mere intention to move. NextSense, spun out of Alphabet, is making earbuds that can gather neural data.
This is not yet mind-reading. The brain’s electrical impulses are not the same thing as thoughts, and brain-computer interfaces can’t act as lie detectors. Farahany warns, however, that “algorithms are getting better at translating brain activity into what we are feeling, seeing, imagining or thinking”.
There are some echoes of this in a new paper from Chatham House. It argues that while AI offers great benefits, its risks include “the erosion of individual freedoms through ubiquitous surveillance; and the replacement of independent thought and judgement with automated control”. The report also notes that few of the many AI strategies and governance principles being developed even mention human rights. Yet the right to freedom of thought includes the right to keep our thoughts private, and not to be penalised for them.
The debate about how to regulate neurotechnology is in its infancy. But a number of scientists are pushing for “neuro-rights”. The neurobiologist Rafael Yuste is arguing for a right to mental privacy, “so that the content of our mental activity is not decoded without our consent”. Chile recently became the first country in the world to insert neuro-rights into its constitution, and will soon legislate to regulate technologies which record or alter brain activity.
Given its recent memories of authoritarianism, Santiago may be more alive to the risks than Washington or London. Older democracies tend to have the same conversation about every new technology. Experts extol the benefits; investors pile in; and ethical considerations are left to ponderous committees and governments which can’t keep pace.
Being so passive for so long about social media has left our societies grappling with deeply disturbing behaviours, from obsessions with body image to radicalisation, suicide and extreme pornography. We must not make the same mistake with brain data. Technologies may be neutral, but humankind certainly isn’t. This is not something I often say, but in the case of neurotech: bring on the lawyers.