Abstract
Truly ubiquitous computing poses new and significant challenges. One of the key aspects that will condition the impact of these new technologies is how to obtain a manageable representation of the surrounding environment starting from simple sensing capabilities. This will enable devices to adapt their computing activities to an ever-changing environment. This paper presents a framework to promote unsupervised training processes among different sensors. The framework allows different sensors to exchange the knowledge needed to create a model to classify events. In particular, as a case study, we developed a multi-modal multi-sensor classification system that combines data from a camera and a body-worn accelerometer to identify the user's motion state. The body-worn accelerometer learns a model of the user's behavior by exploiting the information coming from the camera, and later uses it to classify the user's motion autonomously. Experiments demonstrate the accuracy of the proposed approach in different situations.
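The cross-modal training idea in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual method: the camera is assumed to supply motion-state labels ("still" vs. "walking"), the accelerometer is assumed to produce simple per-window features (mean magnitude, variance), and a nearest-centroid classifier stands in for whatever model the authors used.

```python
import random
import math

random.seed(0)

def accel_sample(state):
    # Toy accelerometer feature vector: (mean magnitude, variance) per
    # window. The distributions below are invented for illustration.
    if state == "still":
        return (random.gauss(1.0, 0.05), random.gauss(0.01, 0.005))
    return (random.gauss(1.4, 0.1), random.gauss(0.3, 0.05))

# Training phase: the camera provides the labels, the accelerometer the
# features, so no manual annotation is required.
training = [(accel_sample(s), s) for s in ("still", "walking") * 50]

# Learn one centroid per motion state (nearest-centroid classifier).
centroids = {}
for state in ("still", "walking"):
    pts = [f for f, s in training if s == state]
    centroids[state] = tuple(sum(c) / len(pts) for c in zip(*pts))

def classify(features):
    # Autonomous phase: the camera is no longer needed; classification
    # uses only the centroids learned on the body-worn device.
    return min(centroids, key=lambda s: math.dist(features, centroids[s]))

print(classify((1.4, 0.3)))   # a clearly "walking"-like feature vector
print(classify((1.0, 0.01)))  # a clearly "still"-like feature vector
```

Once the centroids are learned, the camera can be removed entirely, which mirrors the abstract's claim that the accelerometer classifies the user's motion "in an autonomous way".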
Original language | English |
---|---|
Title of host publication | 2008 Second IEEE International Conference on Self-Adaptive and Self-Organizing Systems Workshops |
Number of pages | 6 |
Publisher | IEEE |
Publication date | 01.12.2008 |
Pages | 61-66 |
Article number | 4800654 |
ISBN (Print) | 978-0-7695-3553-1 |
DOIs | |
Publication status | Published - 01.12.2008 |
Event | 2nd IEEE International Conference on Self-Adaptive and Self-Organizing Systems Workshops, Venice, Italy. Duration: 20.10.2008 → 24.10.2008. Conference number: 75868 |