Robust and Deployable Gesture Recognition for Smartwatches
Access rights
openAccess
publishedVersion
A4 Article in conference proceedings
This publication is imported from Aalto University research portal.
Language
en
Pages
15
Series
27th International Conference on Intelligent User Interfaces, IUI 2022, pp. 277-291, International Conference on Intelligent User Interfaces, Proceedings IUI
Abstract
Gesture recognition on smartwatches is challenging not only because of resource constraints but also because of users' dynamically changing conditions. How to engineer gesture recognisers that are both robust and deployable on smartwatches is currently an open problem. Recent research has found that common everyday events, such as a user removing the smartwatch and putting it back on, can deteriorate recognition accuracy significantly. In this paper, we suggest that prior understanding of the causes behind everyday variability and false positives should be exploited in the development of recognisers. To this end, we first present a data collection method that aims to diversify gesture data in a representative way: users are taken through experimental conditions that resemble known causes of variability (e.g., walking while gesturing) and are asked to produce deliberately varied, but realistic, gestures. Second, we review known machine-learning approaches for recogniser design on constrained hardware. We propose convolution-based network variations for classifying raw sensor data, achieving greater than 98% accuracy reliably under both individual and situational variations where previous approaches have reported significant performance deterioration. This performance is achieved with a model that is two orders of magnitude less complex than previous state-of-the-art models. Our work suggests that deployable and robust recognition is feasible but requires systematic efforts in data collection and network design to address known causes of gesture variability.
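The abstract's claim of a deployable, low-complexity convolutional classifier can be illustrated with a minimal sketch. This is not the authors' implementation: the layer sizes, the 6-channel IMU input (3-axis accelerometer plus gyroscope), the 128-sample window, and the 8 gesture classes are all illustrative assumptions, chosen only to show how a small stack of 1-D convolutions over raw sensor windows can stay in the low thousands of parameters.

```python
# Minimal sketch of a small convolution-based gesture classifier over raw
# smartwatch sensor data. All shapes and names are illustrative assumptions,
# not taken from the paper.
import numpy as np

def conv1d_relu(x, w, b):
    """Valid 1-D convolution + ReLU. x: (T, C_in), w: (K, C_in, C_out), b: (C_out,)."""
    T, _ = x.shape
    K, _, C_out = w.shape
    out = np.empty((T - K + 1, C_out))
    for t in range(T - K + 1):
        # Correlate the K-sample slice against every output filter at once.
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def classify(x, params):
    """Conv -> conv -> global average pooling -> linear logits."""
    h = conv1d_relu(x, params["w1"], params["b1"])
    h = conv1d_relu(h, params["w2"], params["b2"])
    pooled = h.mean(axis=0)                      # global average pool over time
    return pooled @ params["wo"] + params["bo"]  # one logit per gesture class

rng = np.random.default_rng(0)
params = {
    "w1": rng.normal(scale=0.1, size=(5, 6, 16)),  "b1": np.zeros(16),
    "w2": rng.normal(scale=0.1, size=(5, 16, 32)), "b2": np.zeros(32),
    "wo": rng.normal(scale=0.1, size=(32, 8)),     "bo": np.zeros(8),
}
n_params = sum(p.size for p in params.values())  # 3352 weights in total
window = rng.normal(size=(128, 6))  # one 128-sample window of 6 IMU channels
logits = classify(window, params)
```

At roughly 3k parameters, a network of this shape is indeed orders of magnitude smaller than typical state-of-the-art recognisers with hundreds of thousands of weights, which is what makes on-watch deployment plausible.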
Funding Information: This work was supported by the Department of Communications and Networking, Aalto University; the Finnish Center for Artificial Intelligence (FCAI); the Academy of Finland projects Human Automata (Project ID: 328813) and BAD (Project ID: 318559); Huawei Technologies; and the Horizon 2020 FET program of the European Union (grant CHIST-ERA-20-BCI-001). Publisher Copyright: © 2022 ACM. The Open Access fee has been paid, but the PDF version does not contain information on the OA licence.
Citation
Kunwar, U., Borar, S., Berghofer, M., Kylmälä, J., Aslan, I., Leiva, L. A., & Oulasvirta, A. (2022). Robust and Deployable Gesture Recognition for Smartwatches. In 27th International Conference on Intelligent User Interfaces (IUI 2022), pp. 277-291. ACM, Virtual/Online, Finland, 22 March 2022. https://doi.org/10.1145/3490099.3511125