Implantable brain-computer interface enables completely locked-in patient to communicate

[Image: Microelectrode array]

Researchers at the Wyss Center for Bio and Neuroengineering (Geneva, Switzerland), in collaboration with the University of Tübingen (Tübingen, Germany), have enabled a person with complete paralysis, who cannot speak, to communicate via an implanted brain-computer interface (BCI).

The breakthrough comes from a clinical case study, ongoing for more than two years, with a participant who has advanced amyotrophic lateral sclerosis (ALS). According to a Wyss Center press release, the results show that communication is possible with people who are completely ‘locked-in’ because of ALS. Details of the study are published in Nature Communications.

“This study answers a long-standing question about whether people with complete locked-in syndrome, who have lost all voluntary muscle control including movement of the eyes or mouth, also lose the ability of their brain to generate commands for communication,” said Wyss Center senior neuroscientist Jonas Zimmermann. “Successful communication has previously been demonstrated with BCIs in individuals with paralysis. But, to our knowledge, ours is the first study to achieve communication by someone who has no remaining voluntary movement and, hence, for whom the BCI is now the sole means of communication.”

The study participant is a man in his thirties diagnosed with a fast-progressing form of ALS. According to the release, two intracortical microelectrode arrays, each 3.2 mm square, were surgically implanted in the surface of his motor cortex. Each array has 64 needle-like electrodes that record neural signals.
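To give a sense of the data scale this hardware implies, the short sketch below models one second of raw recordings as a channels × samples array. This is an illustration only: the 30 kHz sampling rate is an assumption (typical for intracortical arrays, but not stated in the article), and the variable names are hypothetical.

```python
import numpy as np

N_ARRAYS = 2        # two intracortical arrays (per the article)
N_ELECTRODES = 64   # needle-like electrodes per array (per the article)
FS = 30_000         # assumed sampling rate in Hz; not stated in the article

# One second of raw neural data: 128 channels x 30,000 samples.
raw = np.zeros((N_ARRAYS * N_ELECTRODES, FS), dtype=np.float32)
print(raw.shape)  # (128, 30000)
```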

The participant, who lives at home with his family, has learned to generate brain activity by attempting different movements. These brain signals are picked up by the implanted microelectrodes and decoded in real time by a machine learning model, which maps each signal to either ‘yes’ or ‘no’. To reveal what the participant wants to communicate, a speller programme reads the letters of the alphabet aloud; using auditory neurofeedback, the participant chooses ‘yes’ or ‘no’ to confirm or reject each letter, ultimately forming whole words and sentences.
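The selection loop the release describes can be pictured as a simple program. The sketch below illustrates only that yes/no speller logic; it is not the study's actual software. Here `speak` and `decode_yes_no` are hypothetical stand-ins for the audio output and the real-time neural decoder, and a coin flip simulates the participant's responses.

```python
import random

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "  # letters the speller reads aloud

def speak(text):
    """Stand-in for auditory output: the real system reads letters aloud."""
    print(f"[audio] {text}")

def decode_yes_no():
    """Stand-in for the real-time decoder. In the study, a machine learning
    model classifies recorded cortical activity as 'yes' or 'no'; here a
    random draw simulates that judgement."""
    return random.random() < 0.2

def spell(max_letters=10):
    """Run the yes/no speller loop: present each letter aloud and append it
    to the message whenever the decoder reports a 'yes'."""
    message = []
    while len(message) < max_letters:
        confirmed = None
        for letter in ALPHABET:
            speak(letter)
            if decode_yes_no():   # participant confirms this letter
                confirmed = letter
                break
        if confirmed is None:     # a full pass with no 'yes': stop spelling
            break
        message.append(confirmed)
        speak("selected " + confirmed)
    return "".join(message)

if __name__ == "__main__":
    print("message:", spell())
```

In the real system each yes/no decision rests on the participant modulating his own neural activity with auditory feedback, so a practical speller must also handle decoding errors and timeouts, which this sketch omits.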

“This study has also demonstrated that, with the involvement of family or caregivers, the system can, in principle, be used at home,” said Wyss Center chief technology officer George Kouvas. “This is an important step for people living with ALS who are being cared for outside the hospital environment. This technology, benefiting a patient and his family in their own environment, is a great example of how technological advances in the BCI field can be translated to create direct impact.”

