Sahana Sitaraman in The Scientist:
There’s a voice inside most people’s minds that comes alive when they listen, read, or prepare to speak. This “internal monologue” is thought to support complex cognitive processes like working memory, logical reasoning, and motivation.1 In fact, inner speech continues to thrive in many individuals who are unable to speak owing to injury or disease.2 More than half a century ago, Jacques Vidal, a computer scientist at the University of California, Los Angeles, proposed the idea for brain-computer interfaces (BCIs): systems that could use electrical signals in the brain to control prosthetic devices.3
Since then, scientists have designed and developed BCIs that have enabled people with quadriplegia to control a computer cursor, operate a robotic arm, and even move their own limbs. Recently, a person with amyotrophic lateral sclerosis (ALS), a neurodegenerative disease, who had severe difficulty speaking was able to carry out a freeform conversation with the help of a speech BCI.4 The neuroprosthesis accurately translated brain activity into coherent sentences as the person tried to speak to the best of their ability. However, the reliance on attempted speech can fatigue the user and limit communication speed.
More here.
