Abstract
In the evolving field of human–computer interaction (HCI), gesture recognition has emerged as a critical focus, with sensor-equipped smart gloves playing one of the most important roles. Despite the significance of dynamic gesture recognition, most research on data gloves has concentrated on static gestures, with only a small fraction addressing dynamic gestures or both. This study explores the development of a low-cost smart glove prototype designed to capture and classify dynamic hand gestures for game control. The prototype is equipped with five flex sensors, five force sensors, and one inertial measurement unit (IMU). To classify dynamic gestures, we developed a neural network-based classifier: a convolutional neural network (CNN) with three two-dimensional convolutional layers and rectified linear unit (ReLU) activation, which achieved an accuracy of 90%. The glove effectively captures dynamic gestures for game control, achieving high classification accuracy, precision, and recall, as evidenced by the confusion matrix and training metrics. Despite limitations in the number of gestures and participants, the solution offers a cost-effective and accurate approach to gesture recognition, with potential applications in VR/AR environments.
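The abstract only specifies that the classifier is a CNN with three two-dimensional convolutional layers and ReLU activation, operating on data from five flex sensors, five force sensors, and one IMU. The sketch below is a minimal illustration of one such architecture in PyTorch, not the authors' implementation; the window length, sensor-channel count, filter sizes, and number of gesture classes are assumptions.

```python
# Minimal sketch (assumed shapes, not the authors' code): a CNN with three
# 2D convolutional layers and ReLU activations for windowed glove data.
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    def __init__(self, n_classes: int = 5, window: int = 64, n_channels: int = 16):
        # n_channels: e.g. 5 flex + 5 force + 6 IMU readings (assumed split)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.classifier = nn.Linear(64 * window * n_channels, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, window, n_channels) -- one time window of sensor readings
        z = self.features(x)
        return self.classifier(z.flatten(start_dim=1))

# Example: classify a batch of 8 windows of 64 time steps x 16 sensor channels
logits = GestureCNN()(torch.randn(8, 1, 64, 16))
print(logits.shape)  # torch.Size([8, 5])
```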
| Field | Value |
|---|---|
| Original language | English |
| Article number | 6157 |
| Journal | Sensors (Basel, Switzerland) |
| Volume | 24 |
| Issue number | 18 |
| ISSN | 1424-8220 |
| DOIs | |
| Publication status | Published - 23.09.2024 |
Research Areas and Centers
- Academic Focus: Biomedical Engineering
- Centers: Center for Artificial Intelligence Luebeck (ZKIL)