Abstract
People communicate in many different ways, for example by moving their hands or mouth. For people who are paralyzed, however, communication is much harder. People with Locked-In Syndrome (LIS), for example, can still think and feel but are paralyzed and therefore unable to express themselves. New technologies have been developed in recent years to help these patients communicate. A brain-computer interface (BCI) system, for example, can record brain signals and translate them into written text or synthesized speech. In this way, people with paralysis can still communicate, for example by thinking of a sound or a word. However, these systems are not yet reliable enough for patients to use at home, because interpreting the brain signals is very difficult. The brain signals related to the production of a single sound may, for example, be influenced by how it is pronounced or by which sound precedes it. It remains to be determined which factors affect the brain signals. In this thesis I show that the duration and velocity of sound production, as well as transitions between sounds, influence the brain signals. A single sound is accompanied by multiple brain activity patterns rather than just one. In addition, I show that facial expressions can also be used for communication, for example to express emotions, and that this is possible even when recording from only a small part of the brain.
| Original language | English |
|---|---|
| Awarding Institution | |
| Supervisors/Advisors | |
| Award date | 1 Apr 2019 |
| Place of Publication | [Utrecht] |
| Publisher | |
| Print ISBNs | 978-90-393-7106-0 |
| Publication status | Published - 1 Apr 2019 |
Keywords
- BCI (Brain-computer interface)
- speech
- ECoG (electrocorticography)
- sensorimotor cortex
- phonemes
- coarticulation
- facial expressions
- emotions
- articulators
- tongue