Abstract
Locked-In Syndrome (LIS) is a condition in which a person is unable to execute voluntary movements. This makes communication with the outside world (almost) impossible, even though communication is found to be a major factor in the well-being of persons with LIS. To restore communication, assistive devices have been developed. A Brain-Computer Interface (BCI) can control assistive devices using brain signals. The Utrecht NeuroProsthesis (UNP) is an invasive BCI developed at the University Medical Center Utrecht. Neural signals from the motor cortex, recorded using electrode strips, allow for controlling spelling software on a tablet computer. However, the possibilities to control the computer (the degrees of freedom) are few: the recorded brain signal is used as a ‘click’ signal that is either on or off. Increasing the degrees of freedom would allow more than a binary signal to be extracted, and therefore faster BCI-based communication; this is one of the aims of BCI research. To extend the degrees of freedom, several hand gestures from the sign language alphabet were previously decoded successfully from the motor cortex. Brain activity patterns were recorded while participants were instructed to make several hand gestures. A so-called classifier was then used to investigate whether brain activity patterns for different gesture types could be discriminated from one another. The aim of the studies presented in this thesis is to investigate the effects of denervation on the representation of movements, or, in other words, whether it is still possible to infer which hand gesture is made by a person who has become unable to execute hand movements. As the prevalence of LIS is low, and given that people with LIS require special care, people with arm amputation were recruited for the studies described in this thesis. In amputees, the motor cortex is still intact, but motor output is absent.
As a measure of ‘intactness’ (integrity) of the hand representation, we designed a study in which amputees were taught six different hand gestures from the American Sign Language alphabet. While in the MRI scanner, participants were presented with a character on a screen and instructed to make the corresponding hand gesture (or to attempt the gesture with their missing hand). A classifier was successfully trained on fMRI activation patterns in the contralateral sensorimotor area to discriminate between the different gestures, indicating that a useful representation of the hand remains, even years after amputation. There is evidence that not only the contralateral but also the ipsilateral sensorimotor area plays a role in movement control. Therefore, the classification and decoding of hand gestures was also applied to brain activation patterns of the ipsilateral cortex. In addition to discriminating between gestures of the same hand, the classifier was trained on activations for both the ipsilateral and contralateral hand in one data set. This provides an important step forward in studying the feasibility of a BCI implanted in one hemisphere that can decode gestures of both hands, even in people unable to perform movements.
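The decoding logic described above (training a classifier to tell gesture-specific activation patterns apart, then testing it against chance level) can be illustrated with a minimal sketch. This is not the thesis pipeline: the ‘voxel’ data below are synthetic, and a simple nearest-class-mean classifier stands in for whichever classifier was actually used.

```python
import numpy as np

rng = np.random.default_rng(0)
n_gestures, n_train, n_test, n_voxels = 6, 15, 5, 100

# Assume each gesture evokes a distinct spatial pattern across voxels;
# single trials are that pattern plus Gaussian noise (a toy stand-in for
# single-trial fMRI activation maps).
templates = rng.normal(0.0, 1.0, (n_gestures, n_voxels))

def make_trials(n_per_gesture):
    X = np.vstack([templates[g] + rng.normal(0.0, 1.0, (n_per_gesture, n_voxels))
                   for g in range(n_gestures)])
    y = np.repeat(np.arange(n_gestures), n_per_gesture)
    return X, y

X_train, y_train = make_trials(n_train)
X_test, y_test = make_trials(n_test)

# Nearest-class-mean classifier: estimate each gesture's mean pattern from
# the training trials, then assign each test trial to the closest mean.
means = np.stack([X_train[y_train == g].mean(axis=0) for g in range(n_gestures)])
dists = np.linalg.norm(X_test[:, None, :] - means[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y_test).mean()
print(f"decoding accuracy: {accuracy:.2f} (chance = {1 / n_gestures:.2f})")
```

Accuracy well above the 1-in-6 chance level is the signature of a discriminable gesture representation; in the real studies, the same logic is applied to measured activation patterns with proper cross-validation.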
Original language | English
---|---
Awarding Institution |
Supervisors/Advisors |
Award date | 11 Apr 2024
Place of Publication | Utrecht
Publisher |
Print ISBNs | 978-90-393-7665-2
DOIs |
Publication status | Published - 11 Apr 2024
Keywords
- brain-computer interface
- sensorimotor cortex
- amputation
- cortical representation
- functional mri