Neurolinguistic and machine-learning perspectives on direct speech BCIs for restoration of naturalistic communication

Bibliographic Details
Published in: Brain-Computer Interfaces (Abingdon, England), Vol. 4, No. 3, pp. 186-199
Main Authors: Iljina, Olga; Derix, Johanna; Schirrmeister, Robin Tibor; Schulze-Bonhage, Andreas; Auer, Peter; Aertsen, Ad; Ball, Tonio
Format: Journal Article
Language: English
Published: Taylor & Francis, 03.07.2017
Summary: The ultimate goal of brain-computer-interface (BCI) research on speech restoration is to develop devices that can reconstruct spontaneous, naturally spoken language from the underlying neuronal signals. It follows that a thorough understanding of brain activity and its functional dynamics during real-world speech will be required. Here, we review current developments in intracranial neurolinguistic and BCI research on speech production under increasingly naturalistic conditions. Using an example of neurolinguistic data from our ongoing research, we illustrate the feasibility of neurolinguistic investigations in non-experimental, out-of-the-lab conditions of speech production. We argue that interdisciplinary endeavors at the interface of neuroscience and linguistics can provide valuable insight into the functional significance of speech-related neuronal data. Finally, we anticipate that work with neurolinguistic corpora composed of real-world language samples and simultaneous neuronal recordings, together with machine-learning methodology accounting for the specifics of the neurolinguistic material, will improve the functionality of speech BCIs.
ISSN: 2326-263X, 2326-2621
DOI: 10.1080/2326263X.2017.1330611