Machine Learning Fusion Model Approach for the Real-Time Detection of Head Gestures using IMUs
- Resource Type
- Conference
- Authors
- Davila-Montero, Sylmarie; Mason, Andrew J.
- Source
- SoutheastCon 2024, pp. 909-913, Mar. 2024
- Subject
- Aerospace
- Bioengineering
- Communication, Networking and Broadcast Technologies
- Components, Circuits, Devices and Systems
- Computing and Processing
- Engineered Materials, Dielectrics and Plasmas
- Engineering Profession
- Fields, Waves and Electromagnetics
- General Topics for Engineers
- Geoscience
- Nuclear Engineering
- Photonics and Electrooptics
- Power, Energy and Industry Applications
- Robotics and Control Systems
- Signal Processing and Analysis
- Transportation
- Time-frequency analysis
- Machine learning
- Feature extraction
- Magnetic heads
- Real-time systems
- Data models
- Complexity theory
- fusion model
- machine learning
- real-time
- inertial movement sensors
- head gestures
- healthy interactions
- wearables
- Language
- ISSN
- 1558-058X
- Abstract
- Modern sensor technology has contributed to the study of human behaviors during social interactions. Inertial measurement units (IMUs) have shown great promise in recognizing the communication cues conveyed by head gestures, which are important for healthy interactions. However, no gold standard exists for automatically detecting head actions from IMU data. This paper presents the design of a real-time head-action detection (HAD) unit based on a new real-time fusion model architecture. An analysis of buffer sizes and feature contributions using a decision tree (DT) classifier and predictor importance is presented. The fusion model is composed of two classification stages: the first recognizes head position and the second recognizes head motion. The designed HAD unit uses a 3 s data buffer, 7 features in total, and a DT classifier. Results show a testing accuracy of 97.91% and an F1-score of 98.5%. The HAD unit and its architecture could allow easy re-training to recognize additional head actions by adding specialized head-action classification models.
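The two-stage fusion idea can be sketched as a cascade of two classifiers whose outputs are combined into one head-action label. This is an illustrative sketch, not the authors' implementation: the feature names, thresholds, and class labels below are hypothetical stand-ins for the trained DT classifiers, and only two of the seven features are shown.

```python
def stage1_position(features):
    """Toy head-position classifier: thresholds a tilt-related feature.
    Stands in for the paper's first-stage DT classifier."""
    return "tilted" if features["mean_roll"] > 0.2 else "centered"

def stage2_motion(features):
    """Toy head-motion classifier: thresholds a motion-energy feature.
    Stands in for the paper's second-stage DT classifier."""
    return "nodding" if features["gyro_energy"] > 1.0 else "static"

def detect_head_action(features):
    """Fuse the two stage outputs into a single head-action label."""
    return f"{stage1_position(features)}/{stage2_motion(features)}"

# One feature window extracted from a 3 s IMU data buffer
# (7 features in the paper; two hypothetical ones shown here).
window = {"mean_roll": 0.35, "gyro_energy": 1.8}
print(detect_head_action(window))  # tilted/nodding
```

Because each stage is an independent model, re-training to cover a new head action only requires adding or updating the specialized classifier for that stage, which is the extensibility argument made in the abstract.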