Abstract
Implantable brain-computer interfaces (BCIs) promise to re-establish communication for severely paralyzed people. This thesis argues that decoding acts of language production is a promising strategy for BCI control. Using high-field functional magnetic resonance imaging (fMRI) and high-density electrocorticography (ECoG), we studied the topographic representation of different acts of language production and the robustness of their activity patterns, to test the hypothesis of fine-grained topography.
Because the brain surface is covered with blood vessels, we first investigated the importance of positioning ECoG electrodes on brain tissue by evaluating the effects of blood vessels on signal quality. We showed that the signal recorded by electrodes on top of blood vessels has a different frequency content than the signal recorded by electrodes in direct contact with the cortical surface. This holds primarily for higher frequencies, which are the most informative for BCI control. The absolute differences in power are small, but they become important in the context of implantable BCI systems, where absolute signal strength is a relevant factor, for example for the pre-amplification the signal requires before it can be transmitted.

We subsequently tested the hypothesis that hand gestures from sign language can be discriminated on a single-trial basis from their representation on the sensorimotor cortex, using fMRI at 7 Tesla. Four complex hand gestures could be classified with an accuracy of 63% (range 35-95%; chance level 25%). A small patch of cortex, around the hand-knob area, was sufficient to make this discrimination. The classification accuracy varied considerably between participants and appeared to be related to the consistency with which the gestures were executed.

We tested the same hypothesis with high-density electrode grids implanted on the sensorimotor cortex of five epilepsy patients (implanted for diagnostic reasons). Of the five patients, two had adequate hand-knob coverage for analysis. We showed that hand gestures can also be discriminated using ECoG signals: four different hand gestures were classified with 97% accuracy in one participant and 73% in the second.

In the final phase we tested whether we could decode speech articulators, and showed that articulator movements can be discriminated on a single-trial basis from their representation on the sensorimotor cortex using 7 Tesla fMRI, with an accuracy of 89%. A small patch of cortex, the ventral half of the lateral sensorimotor cortex, was sufficient to make these discriminations, indicating that articulator movements could also be distinguished using surface electrodes.

This research shows that language-related movements can be decoded from the brain activity of a small patch of cortex. Further testing in paralyzed people is required to investigate potential applications of this work in BCIs.
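As a purely illustrative aside (not part of the thesis analyses), the sketch below shows how a spectral comparison of the kind described above could look in code: power spectral densities are estimated with Welch's method for two assumed groups of ECoG channels, electrodes over blood vessels and electrodes on bare cortex, and mean power in an assumed high-frequency band is compared. The channel groupings, sampling rate, and band limits are placeholders, and the signals here are synthetic.

```python
# Illustrative sketch: compare high-frequency ECoG power between electrodes
# over blood vessels and electrodes on bare cortex. All data and channel
# groupings are synthetic placeholders, not the thesis data.
import numpy as np
from scipy.signal import welch

fs = 512                      # assumed sampling rate (Hz)
n_samples = fs * 60           # one minute of signal per channel
rng = np.random.default_rng(0)

# Synthetic signals: shape (n_channels, n_samples)
vessel_chans = rng.standard_normal((8, n_samples))   # electrodes over vessels
cortex_chans = rng.standard_normal((8, n_samples))   # electrodes on cortex

def band_power(signals, fs, band=(65, 125)):
    """Mean power per channel in a frequency band, estimated with Welch's method."""
    freqs, psd = welch(signals, fs=fs, nperseg=fs)    # psd: (n_channels, n_freqs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[:, mask].mean(axis=1)

hf_vessel = band_power(vessel_chans, fs)
hf_cortex = band_power(cortex_chans, fs)
print(f"high-frequency power over vessels: {hf_vessel.mean():.3e}")
print(f"high-frequency power on cortex:    {hf_cortex.mean():.3e}")
```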
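The single-trial decoding results reported above (four classes, 25% chance level) can likewise be illustrated with a minimal, generic classification sketch: a cross-validated linear classifier applied to one feature vector per trial (for example, voxel activity or electrode high-frequency power). The trial counts, feature dimensions, and the choice of a linear support vector machine are assumptions for illustration and do not describe the actual analyses in the thesis.

```python
# Minimal sketch of single-trial, four-class decoding with cross-validation.
# Features and labels are synthetic placeholders standing in for per-trial
# fMRI voxel patterns or ECoG high-frequency power features.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(0)
n_trials, n_features, n_classes = 80, 200, 4      # assumed dataset size
X = rng.standard_normal((n_trials, n_features))   # one feature vector per trial
y = rng.integers(0, n_classes, size=n_trials)     # gesture / articulator label

clf = make_pipeline(StandardScaler(), LinearSVC())
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)

print(f"decoding accuracy: {scores.mean():.2f} (chance = {1 / n_classes:.2f})")
```

On synthetic noise this yields roughly chance-level accuracy; the point of the sketch is only to make explicit what "single-trial classification against a 25% chance level" means operationally.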