Mind over machine: UN urges ethical guardrails for brain tech revolution
As wearable devices begin to tap into our mental states, UN experts warn that without ethical safeguards, the right to freedom of thought could be the latest casualty of unchecked innovation.
It seems like science fiction, or even magic: the ability to communicate, control a computer or move a robotic limb via the power of thought.
However, it’s not only possible, it’s already transforming the lives of patients with severe disabilities.
In 2024, an audience at a UN conference in Geneva sat astounded as a young man in Portugal with “locked in syndrome” – a neurological disorder that left him unable to move any part of his body – was able to “speak” to them, using a brain-computer interface (BCI) that translated his thoughts into words, spoken in his voice, and answer their questions.
This is a striking example of the growing field of neurotechnology, which holds out great hope for people living with disabilities and with neurological and mental health conditions such as Parkinson’s disease, epilepsy and treatment-resistant depression.
Mental privacy: A lost battle?
But while the use of neurotechnology in the medical sector is strictly regulated, its use in other areas is raising concerns.
Products such as headbands, watches and ear pods that monitor heart rate, sleeping patterns and other health indicators are increasingly popular, promising to improve quality of life. But the data they collect can provide deep insights into our private thoughts, reactions and emotions.
This poses ethical and human rights challenges, because manufacturers are currently free to sell or pass that data on without restriction. Individuals face the prospect of having their most intimate mental privacy intruded upon, their thoughts exposed, monetised and even controlled.
“It’s about freedom of thought, agency and mental privacy,” says Dafna Feinholz, acting head of Research, Ethics and Inclusion at UNESCO.
She worries that the battle for mental privacy is being lost in an age of social media, with users willingly uploading their private lives to platforms owned by a handful of giant tech companies.
“People say ‘I have nothing to hide,’ but they don’t understand what they’re giving away,” she adds.
“We are already being profiled by AI, but now there is this possibility of entering thoughts, directly measuring the activity of the brain and inferring mental states. These technologies could even modify the structure of your nervous system, allowing you to be manipulated. People need to know that these tools are safe and that, if they wish, they can stop using them.”
The UN official insists that, while we have to accept that we need to live with technology, we can ensure that humans remain in charge.
“The more we surrender to the power and superiority of these tools, the more we are going to be taken over. We need to control what they do and what we want them to achieve, because we are the ones who are producing them. This is our responsibility for all the technology we create.”
Time for an ethical approach
Ms. Feinholz spoke to UN News from the ancient Uzbek city of Samarkand where, on Wednesday, delegates from the Member States of UNESCO – the UN agency for education, science and culture – formally adopted a “Recommendation” (non-binding guidance on principles and best practices that can form the basis of national policies) on the ethics of neurotechnology, with an emphasis on the protection of human dignity, rights and freedoms.
The guidance advocates the promotion of well-being and the avoidance of harm associated with the technology, the protection of freedom of thought (ensuring that individuals retain control over their mind and body), and calls on developers, researchers and users to uphold ethical standards and be accountable for their actions.
Member States are being advised to put several measures in place, including implementing legal and ethical frameworks to monitor the use of neurotechnology, protect personal data and assess the impact on human rights and privacy.
“Humans have to be in the loop,” declares Ms. Feinholz. “There needs to be transparency, redress and compensation, as there is in other sectors. Take restaurants as an example. If you eat out, you don’t have to know how to cook. But if you order a spaghetti carbonara and it makes you sick, you can complain to the owner. There is accountability. The same should apply to neurotechnology: even if you don’t understand how it works, there has to be a chain of accountability.”
© UN News (2025) — All Rights Reserved. Original source: UN News