Abstract
Since low-cost RGB-D sensors became available, gesture detection has gained increasing interest in the fields of human-computer and human-robot interaction. It is possible to navigate through interactive menus by waving the hand and to confirm menu items by pointing at them. Such applications require real-time body or hand-finger pose estimation algorithms. This paper presents a kinematic approach to estimating the full pose of the hand, including the finger joint angles. A self-scaling kinematic hand skeleton model is presented and fitted to the 3D data of the hand in real time on standard hardware at up to 30 frames per second without using a GPU. The approach is based on least-squares minimization and a careful choice of the error function. The tracking accuracy is evaluated on a recorded dataset as well as on simulated data. Qualitative results are presented that emphasize the tracking ability under difficult conditions such as full hand turns and self-occlusion.
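To illustrate the general idea of fitting a kinematic skeleton to observed 3D data by least-squares minimization, the following is a minimal sketch using a toy two-joint planar finger chain and `scipy.optimize.least_squares`. The bone lengths, the simple point-distance error function, and all names here are illustrative assumptions, not the paper's actual hand model or error term.

```python
# Minimal sketch: least-squares fitting of joint angles of a kinematic chain.
# Assumes a toy 2-segment planar finger; NOT the paper's skeleton or error function.
import numpy as np
from scipy.optimize import least_squares

BONE_LENGTHS = np.array([0.04, 0.03])  # assumed segment lengths in metres


def forward_kinematics(angles):
    """Return the 2D positions of the two joint tips of a planar 2-segment chain."""
    positions = [np.zeros(2)]
    total_angle = 0.0
    for length, angle in zip(BONE_LENGTHS, angles):
        total_angle += angle
        step = length * np.array([np.cos(total_angle), np.sin(total_angle)])
        positions.append(positions[-1] + step)
    return np.stack(positions[1:])  # shape (2, 2)


def residuals(angles, observed):
    """Error function: displacement between model joint tips and observed points."""
    return (forward_kinematics(angles) - observed).ravel()


# Synthetic "observed" joint positions from a known pose plus sensor-like noise.
true_angles = np.array([0.4, 0.6])
observed = forward_kinematics(true_angles) + np.random.normal(0, 1e-3, (2, 2))

# Estimate the joint angles by least-squares minimization.
fit = least_squares(residuals, x0=np.zeros(2), args=(observed,))
print("estimated joint angles:", fit.x)
```

In a full hand tracker the parameter vector would additionally contain the global hand pose and scaling factors, and the residuals would be computed against the segmented 3D point cloud of the hand rather than against known joint positions.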
Original language | English |
---|---|
Pages | 185-196 |
Number of pages | 12 |
DOIs | |
Publication status | Published - 2015 |
Event | 10th International Conference on Computer Vision Theory and Applications - Berlin, Germany; Duration: 11.04.2015 → 14.04.2015; Conference number: 112690 |
Conference
Conference | 10th International Conference on Computer Vision Theory and Applications |
---|---|
Abbreviated title | VISAPP 2015 |
Country/Territory | Germany |
City | Berlin |
Period | 11.04.15 → 14.04.15 |