The tricky ethics of neurotechnologies

As the science of brain-computer interfaces (BCI) and other neurotechnologies progresses, researchers are calling for ethical guidelines to be established now before the technology fully matures.

Why it matters: We're still far from technologies that can fully access and even read the human brain, but the sheer power of such tools and the highly personal data they could gather mean society needs to determine what they should do before they actually can.

What's happening: Columbia University's NeuroRights Initiative held a symposium today in conjunction with IBM on the scientific, security and social issues raised by neurotech.

The big picture: In the future, BCIs could provide an unprecedented view of the human brain at work, which in turn could unlock new clinical insights into largely untreatable mental and neurological diseases, as well as change how humans interface with the world.

What they're saying: The ethical issues raised by that power were the focus of symposium remarks by IBM director of research Darío Gil, which touched on first-generation ethical principles for neurotech developed by the company.

Details: Many of the ethical issues created by BCI, including questions of transparency and fairness, resemble those raised by AI or even social media, only intensified.

To that end, Gil says IBM is committed to respecting mental privacy and autonomy, being transparent in its neurotech work, and ensuring that people have an equal opportunity to choose whether or not they want to use the technology.

The catch: Scientific codes of ethics may not mean much to notoriously independent players like Elon Musk, who has promised that the BCI technology developed by his company Neuralink could eventually allow "AI symbiosis," as he put it at an event in August.

The bottom line: BCI could be "a revolution for humanity," as Rafael Yuste, the Columbia neuroscientist who leads the NeuroRights Initiative, put it. But revolutions have a way of getting out of hand.
