TY - GEN
T1 - Human-computer-interface for controlling the assistive technology device
AU - Rahma, Osmalina Nur
AU - Kurniawati, Maydiana Nurul
AU - Rahmatillah, Akif
AU - Ain, Khusnul
N1 - Publisher Copyright:
© 2020 Author(s).
PY - 2020/12/9
Y1 - 2020/12/9
N2 - Imagining a motion without performing the actual movement is known as motor imagery (MI). However, translating MI into an input for a brain-computer interface (BCI) that controls an assistive technology (AT) device is challenging due to the extensive training required, poor user engagement, and delays in visual feedback response. A person who has difficulty imagining movements without visual stimulation needs additional training, and even with a visual stimulus some users still find it hard to distinguish one imagined movement from another. Finding a suitable input to control an AT device for someone with extremely limited movement is therefore very challenging. On the other hand, the EEG signal has been proven sufficient for detecting facial expressions. Although the facial EMG artifact it contains is usually considered an unwanted signal for BCI, this facial movement artifact can serve as a compensating signal for an EEG-based Human Computer Interface (HCI) system to improve efficiency. This study aims to develop an HCI system that distinguishes three different movement commands (forward, backward, and stop) from EEG signals recorded with the EMOTIV Epoc based on different facial expressions. If implemented in an AT device such as a rollator, this HCI system could allow the user to issue distinct movement commands without any visual stimuli, reducing the extensive training needed before use and improving the quality of life of people with limited extremity movement.
AB - Imagining a motion without performing the actual movement is known as motor imagery (MI). However, translating MI into an input for a brain-computer interface (BCI) that controls an assistive technology (AT) device is challenging due to the extensive training required, poor user engagement, and delays in visual feedback response. A person who has difficulty imagining movements without visual stimulation needs additional training, and even with a visual stimulus some users still find it hard to distinguish one imagined movement from another. Finding a suitable input to control an AT device for someone with extremely limited movement is therefore very challenging. On the other hand, the EEG signal has been proven sufficient for detecting facial expressions. Although the facial EMG artifact it contains is usually considered an unwanted signal for BCI, this facial movement artifact can serve as a compensating signal for an EEG-based Human Computer Interface (HCI) system to improve efficiency. This study aims to develop an HCI system that distinguishes three different movement commands (forward, backward, and stop) from EEG signals recorded with the EMOTIV Epoc based on different facial expressions. If implemented in an AT device such as a rollator, this HCI system could allow the user to issue distinct movement commands without any visual stimuli, reducing the extensive training needed before use and improving the quality of life of people with limited extremity movement.
UR - http://www.scopus.com/inward/record.url?scp=85097977555&partnerID=8YFLogxK
U2 - 10.1063/5.0034256
DO - 10.1063/5.0034256
M3 - Conference contribution
AN - SCOPUS:85097977555
T3 - AIP Conference Proceedings
BT - 2nd International Conference on Physical Instrumentation and Advanced Materials 2019
A2 - Trilaksana, Herri
A2 - Harun, Sulaiman Wadi
A2 - Shearer, Cameron
A2 - Yasin, Moh
PB - American Institute of Physics Inc.
T2 - 2nd International Conference on Physical Instrumentation and Advanced Materials, ICPIAM 2019
Y2 - 22 October 2019
ER -