Brain-machine interface device predicts internal speech

Summary: A new brain-machine interface is the most accurate to date at predicting a person’s inner monologue. The technology could be used to help people with disorders that affect speech communicate effectively.

Source: Caltech

New Caltech research shows how devices implanted into people’s brains, called brain-machine interfaces (BMIs), may one day help patients who have lost their ability to speak.

In a new study presented at the 2022 Society for Neuroscience conference in San Diego, the researchers show that they can use a BMI to accurately predict words that a quadriplegic participant is merely thinking, not speaking or miming.

“These new findings are promising in the areas of language and communication. We used a BMI to reconstruct speech,” says Sarah Wandelt, a Caltech graduate student in the lab of Richard Andersen, James G. Boswell Professor of Neuroscience and director of the Tianqiao and Chrissy Chen Brain-Machine Interface Center at Caltech. Wandelt presented the results at the conference on November 13.

Previous studies have had some success in predicting participants’ speech by analyzing brain signals recorded from motor areas while a participant whispers or mimes words. But predicting internal speech, the words a person merely thinks, is more difficult, because it involves no movement at all, Wandelt explains.

“In the past, algorithms that tried to predict internal speech could only predict three or four words, and with low accuracy or not in real time,” says Wandelt.

The new research is the most accurate to date at predicting internal words. In this case, brain signals were recorded from single neurons in a brain region called the supramarginal gyrus, located in the posterior parietal cortex. In a previous study, the researchers found that this brain region represents spoken words.

Now, the team has extended its findings to internal speech. In the study, the researchers first trained the BMI device to recognize the brain patterns produced when certain words were said internally, or thought, by the quadriplegic participant. This training period took about 15 minutes.
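To make that training step concrete, here is a minimal, hypothetical sketch of how such a word decoder could be fit to per-trial firing-rate features. Everything in it is an assumption for illustration (the synthetic data, the array sizes, and the choice of a linear discriminant classifier), not the study’s actual pipeline:

```python
# Hypothetical sketch: fit a word classifier on neural firing-rate features.
# In the real experiment, each trial's features would come from single-neuron
# recordings in the supramarginal gyrus while the participant internally says
# a cued word; here we use synthetic data so the script runs standalone.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_words, n_reps, n_neurons = 8, 10, 96      # assumed sizes, not from the paper

labels = np.repeat(np.arange(n_words), n_reps)       # one cued word per trial
features = rng.normal(size=(labels.size, n_neurons))
features[:, :n_words] += np.eye(n_words)[labels] * 2.0  # fake word-selective units

decoder = LinearDiscriminantAnalysis()
scores = cross_val_score(decoder, features, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.0%} (chance = 1/8 = 12.5%)")
```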

Then they flashed a word on a screen and asked the participant to say it internally. The results showed that the BMI algorithms could predict eight words with 91 percent accuracy.
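An online evaluation like the one described above could be scored along these lines. Again, this is a hypothetical sketch with synthetic “recordings”; the function names and trial counts are assumptions, not the study’s code:

```python
# Hypothetical sketch: score an online cued-word block. A cue word is
# "flashed", the trial's neural features are collected while the participant
# says the word internally, and the decoder's prediction is compared to the cue.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n_words, n_neurons = 8, 96

def record_trial_features(word_id):
    """Stand-in for a real recording: a noisy feature vector whose first
    few dimensions carry a word-selective signal."""
    x = rng.normal(size=n_neurons)
    x[word_id] += 2.0
    return x

# Fit on a short calibration block (the study reports ~15 minutes of training).
train_labels = np.repeat(np.arange(n_words), 10)
train_X = np.stack([record_trial_features(w) for w in train_labels])
decoder = LinearDiscriminantAnalysis().fit(train_X, train_labels)

# Online block: each trial flashes one cue and decodes one response.
cues = rng.integers(0, n_words, size=40)
preds = np.array([decoder.predict(record_trial_features(c)[None, :])[0]
                  for c in cues])
print(f"online accuracy: {(preds == cues).mean():.0%} (chance = 12.5%)")
```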

The work is still preliminary but could help patients with brain injury, paralysis or diseases such as amyotrophic lateral sclerosis (ALS) that affect speech.

“Neurological disorders can lead to complete paralysis of voluntary muscles, leaving patients unable to speak or move, but they are still able to think and reason. For that population, an internal speech BMI would be incredibly helpful,” says Wandelt.

“We’ve previously shown that we can decode imagined hand shapes for grasping from the human supramarginal gyrus,” Andersen says. “Being able to also decode speech from this area suggests that a single implant can restore two important human abilities: grasping and speech.”

The researchers also note that the BMI cannot be used to read people’s minds: the device would need to be trained on each person’s brain individually, and it only works when the person focuses on the word.

Other Caltech study authors alongside Wandelt and Andersen include David Bjånes, Kelsie Pejsa, Brian Lee, and Charles Liu. Lee and Liu are visiting associates at Caltech and faculty members at the Keck School of Medicine of USC.

About this neurotech research news

Author: Whitney Clavin
Source: Caltech
Contact: Whitney Clavin – Caltech
Image: The image is in the public domain

Original Research: Closed access.
“Online internal speech decoding from single neurons in a human participant” by Sarah Wandelt et al. medRxiv

Abstract

Online internal speech decoding from single neurons in a human participant

Speech brain-machine interfaces (BMIs) translate brain signals into words or audio output, enabling communication for people who have lost the ability to speak due to disease or injury.

While important advances have been made in decoding vocalized, attempted, and mimed speech, results for internal speech decoding are sparse and have not yet achieved high functionality. Notably, it is still unclear from which brain regions internal speech can be decoded.

In this work, a tetraplegic participant with microelectrode arrays implanted in the supramarginal gyrus (SMG) and primary somatosensory cortex (S1) performed internal and vocalized speech of six words and two pseudowords.

We found robust internal speech decoding from single-neuron SMG activity, achieving a classification accuracy of 91% during an online task (12.5% chance level).

We found evidence for shared neural representations between internal speech, word reading, and vocalized speech processes. The SMG represented words in different languages (English/Spanish) as well as pseudowords, providing evidence for phonetic encoding.

Furthermore, our decoder achieved high classification accuracy with multiple internal speech strategies (auditory imagination/visual imagination). Activity in S1 was modulated by vocalized but not internal speech, suggesting that no articulatory movements of the vocal tract occurred during internal speech production.

This work represents the first proof-of-concept for a high-performance internal speech BMI.
