A general framework for sensor-based human activity recognition

Lukas Köping*, Kimiaki Shirahama, Marcin Grzegorzek

*Corresponding author for this work

Abstract

Today's wearable devices like smartphones, smartwatches and smartglasses collect a large amount of data from built-in sensors such as accelerometers and gyroscopes. These data can be used to identify a person's current activity and, in turn, can be utilised for applications such as personal fitness assistants or elderly care. However, developing such systems is subject to certain restrictions: (i) since more and more new sensors will become available in the future, activity recognition systems should be able to integrate them with little manual effort, and (ii) such systems should avoid high acquisition costs for computational power. We propose a general framework that achieves effective data integration based on two characteristics: First, a smartphone gathers and temporarily stores data from different sensors and transfers these data to a central server. Thus, various sensors can be integrated into the system as long as they have programming interfaces to communicate with the smartphone. Second, a codebook-based feature learning approach encodes the data from each sensor into an effective feature vector by tuning only a few intuitive parameters. In the experiments, the framework is realised as a real-time activity recognition system integrating eight sensors from a smartphone, a smartwatch and smartglasses, and its effectiveness is validated from different perspectives such as recognition accuracy, sensor combinations and sampling rates.
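The abstract does not spell out the codebook-based encoding, so the following Python sketch illustrates one common realisation of the idea: subsequences of a sensor stream are clustered with k-means to form a codebook, and each stream is then encoded as a normalised histogram of codeword assignments. The window size, stride and codebook size stand in for the "few intuitive parameters" mentioned above; the function names and the hard-assignment scheme are illustrative assumptions, not the authors' exact method (the paper may well use a soft-assignment variant).

    import numpy as np
    from sklearn.cluster import KMeans

    def extract_subsequences(signal, window=32, stride=16):
        # Slide a fixed-size window over a 1-D sensor stream.
        return np.array([signal[i:i + window]
                         for i in range(0, len(signal) - window + 1, stride)])

    def build_codebook(training_signals, window=32, stride=16, n_codewords=64):
        # Cluster subsequences pooled from all training streams;
        # the cluster centroids form the codebook ("codewords").
        subs = np.vstack([extract_subsequences(s, window, stride)
                          for s in training_signals])
        return KMeans(n_clusters=n_codewords, n_init=10, random_state=0).fit(subs)

    def encode(signal, codebook, window=32, stride=16):
        # Hard assignment: each subsequence votes for its nearest codeword.
        # (Illustrative choice; soft assignment is an equally plausible scheme.)
        assignments = codebook.predict(extract_subsequences(signal, window, stride))
        hist = np.bincount(assignments, minlength=codebook.n_clusters).astype(float)
        return hist / hist.sum()  # normalised histogram = feature vector

Per-sensor feature vectors produced this way can be concatenated and passed to a standard classifier on the central server; only the window size, stride and number of codewords need tuning when a new sensor is added, which matches the low-effort sensor integration the abstract describes.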

Original language: English
Journal: Computers in Biology and Medicine
Volume: 95
Pages (from-to): 248-260
Number of pages: 13
ISSN: 0010-4825
Publication status: Published - 01.04.2018
