Research highlight

Neuroscience: Decoding dialogue from brain activity in real time

Nature Communications

July 31, 2019

A neural decoder that translates brain activity associated with hearing and answering questions into a text transcript in real time is presented in Nature Communications.

The cortex of the brain contains distinct areas in which neuronal activity encodes a representation of perceived and produced speech. Studies have shown that this brain activity can be decoded; however, previous research has focused on decoding listening and speaking tasks separately.

Edward Chang and colleagues decoded perceived and produced speech from brain activity in trials mimicking question-and-answer dialogue. The authors recorded cortical activity from three patients undergoing treatment for epilepsy as they listened to a series of questions and responded verbally with a set of specified answers. These data were used to train the speech detection and decoding models. The participants then listened to a series of questions and responded audibly with an answer of their choice.

Using only the neural signals recorded during this dialogue, the authors were able to detect when the participants were listening or speaking and to predict what was being heard or said. Decoding the question also improved the accuracy of the decoded answer, because some answers were valid responses only to certain questions. Overall, the authors decoded produced and perceived speech with accuracies of up to 61% and 76%, respectively.
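To illustrate the idea of using the decoded question as context, the sketch below re-weights answer likelihoods with question-conditional priors. This is a conceptual toy example, not the authors' implementation; the example questions, answers, probabilities, and the function name decode_answer are all illustrative assumptions.

```python
import numpy as np

# Conceptual sketch of context integration: re-weighting answer likelihoods
# using the decoded question. Not the authors' implementation; all names and
# probability values are illustrative assumptions.

questions = ["How is your room currently?", "How are you feeling?"]
answers = ["Bright", "Dark", "Good", "Bad"]

# P(answer | question): which answers are valid responses to which questions
# (illustrative values only).
answer_prior_given_question = np.array([
    [0.5, 0.5, 0.0, 0.0],   # room question -> "Bright" / "Dark"
    [0.0, 0.0, 0.5, 0.5],   # feeling question -> "Good" / "Bad"
])

def decode_answer(question_posterior, answer_likelihood):
    """Combine the decoded-question posterior with question-conditional
    answer priors to re-weight the neural answer likelihoods."""
    # Context prior over answers, marginalizing over the decoded question.
    context_prior = question_posterior @ answer_prior_given_question
    posterior = answer_likelihood * context_prior
    return posterior / posterior.sum()

# Example: the question decoder favors the "room" question, while the raw
# answer likelihoods are ambiguous between "Dark" and "Bad".
question_posterior = np.array([0.8, 0.2])
answer_likelihood = np.array([0.1, 0.4, 0.1, 0.4])

posterior = decode_answer(question_posterior, answer_likelihood)
print(answers[int(np.argmax(posterior))])  # "Dark": the question context resolves the tie
```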

Further work would be needed to decode answers from imagined speech, which would enable use by individuals who are unable to speak as a result of injury or neurodegenerative disorders.

doi: 10.1038/s41467-019-10994-4
