Gaze controlled prosthetic arm with EMG and EEG input interface
- Resource Type
- Conference
- Authors
- Kocejko, Tomasz
- Source
- 2017 21st European Microelectronics and Packaging Conference (EMPC) & Exhibition, pp. 1-9, Sep. 2017
- Subject
- Components, Circuits, Devices and Systems
- Gaze tracking
- Electroencephalography
- Prosthetic limbs
- Three-dimensional displays
- Electromyography
- Robots
- eye tracking
- prosthetic arm
- emg
- eeg
- hybrid interface
- Language
- Abstract
This paper presents a hybrid interface for controlling a prosthetic arm by means of eye tracking and EEG/EMG analysis. The interface was designed for people with quadriplegia (caused by neurodegenerative disease or spinal cord injury) who have completely lost their mobility. Its general purpose is to partially restore their ability to move objects in the surrounding area. The interaction design assumes that the user looks at a specific item in the environment and thinks about the movement they want to perform, and the prosthetic arm then carries out the corresponding action. A portable eye tracker was used to determine the user's point of regard in 3D space, while EEG/EMG analysis allowed a reach, grab, or reach'n'grab command to be sent to the processing unit. The data from the eye tracker allowed the arm to be positioned with an accuracy of up to 10 cm. The EEG proved useful only for confirming a given action, while the data acquired by the EMG module allowed classification of up to three different gestures by means of neural networks with 84% accuracy. The overall interface allows navigation of the prosthetic arm in an unobstructed 3D environment.
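The EMG-based gesture classification described in the abstract can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the feature layout (per-channel RMS and zero-crossing counts), the synthetic data, and the network size are all assumptions made here for demonstration; the paper only states that up to three gestures were classified by neural networks.

```python
# Hypothetical sketch of three-class gesture classification from EMG
# features with a small feed-forward neural network. Data is synthetic;
# the real system would extract features from windowed EMG recordings.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Assume 4 EMG channels x 2 features per window (e.g. RMS amplitude and
# zero-crossing count) -> 8-dimensional feature vectors. Three gesture
# classes (e.g. reach, grab, reach'n'grab) with different mean activations.
n_per_class = 100
class_means = np.array([[0.2] * 8, [0.6] * 8, [1.0] * 8])
X = np.vstack(
    [m + 0.08 * rng.standard_normal((n_per_class, 8)) for m in class_means]
)
y = np.repeat([0, 1, 2], n_per_class)  # gesture labels

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# One small hidden layer is enough for this toy, well-separated data.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"test accuracy: {acc:.2f}")
```

On this cleanly separated synthetic data the classifier scores near-perfectly; the 84% figure reported in the paper reflects real, noisier EMG signals.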