Citation:
Raja, M., Ghaderi, V. & Sigg, S. 2018, 'WiBot! In-vehicle behaviour and gesture recognition using wireless network edge', in Proceedings - 2018 IEEE 38th International Conference on Distributed Computing Systems (ICDCS 2018), vol. 2018-July, International Conference on Distributed Computing Systems, Institute of Electrical and Electronics Engineers, pp. 376-387. Presented at the International Conference on Distributed Computing Systems, Vienna, Austria, 02/07/2018. DOI: 10.1109/ICDCS.2018.00045
|
Abstract:
Recent advancements in vehicular technology mean that integrated wireless devices such as Wi-Fi access points or Bluetooth are deployed in vehicles at an increasingly dense scale. These vehicular network edge devices, while enabling in-car wireless connectivity and infotainment services, can also be exploited as sensors to improve environmental and behavioural awareness, which in turn can provide better and more personalised driver feedback and improve road safety. We present WiBot, a network-edge-based behaviour recognition and gesture-based personal assistant system for cars. WiBot leverages the vehicular network edge to detect distracted behaviour, such as unusual head turns and arm movements while driving, by monitoring radio-frequency fluctuation patterns in real time. Additionally, WiBot can recognise known gestures from natural arm movements while driving and use these gestures for passenger-car interaction. A key element of the WiBot design is its impulsive windowing approach, which allows the start and end of gestures to be accurately identified in a continuous stream of data. We validate the system in a realistic driving environment by conducting a non-choreographed continuous recognition study with 40 participants at the BMW Group Research, New Technologies and Innovation centre. By combining impulsive windowing with a unique selection of features from peaks and subcarrier analysis of RF CSI phase information, the system achieves 94.5% accuracy for head versus arm movement separation. It further differentiates relevant gestures from random arm and head movements, head turns, and idle movement with 90.5% accuracy.
|
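Illustrative sketch (not from the paper): the abstract describes an impulsive windowing step that locates gesture start and end points in a continuous CSI stream. The snippet below shows one plausible way such a segmenter could work, using short-term variance of CSI phase differences as the activity measure. The function name, the variance statistic, and all thresholds and window lengths are assumptions made for this example; they are not the authors' actual feature set or parameters.

import numpy as np

def segment_impulsive_windows(phase_diff, win=64, k=3.0, min_gap=32):
    """Mark candidate gesture segments in a stream of CSI phase-difference samples.

    phase_diff : 1-D array of per-packet phase-difference values (assumed input).
    win        : sliding-window length in samples.
    k          : threshold factor over the baseline (idle) variance.
    min_gap    : minimum idle run (in samples) that closes an open segment.
    """
    n = len(phase_diff)
    # Short-term variance at each window position as a simple activity measure.
    var = np.array([np.var(phase_diff[i:i + win]) for i in range(n - win)])
    baseline = np.median(var)        # rough estimate of the idle activity level
    active = var > k * baseline      # positions with impulsive (high) activity

    segments, start, gap = [], None, 0
    for i, is_active in enumerate(active):
        if is_active:
            if start is None:
                start = i            # candidate gesture start
            gap = 0
        elif start is not None:
            gap += 1
            if gap >= min_gap:       # long idle run -> close the segment
                segments.append((start, i - gap + win))
                start, gap = None, 0
    if start is not None:
        segments.append((start, n))
    return segments

# Example on synthetic data: idle noise with one burst of simulated activity.
rng = np.random.default_rng(0)
stream = rng.normal(0, 0.05, 2000)
stream[800:1100] += rng.normal(0, 0.5, 300)
print(segment_impulsive_windows(stream))

In a real deployment the activity statistic would be computed per subcarrier on the CSI phase information and the segments passed on to the gesture classifier; this sketch only illustrates the windowing idea at a high level.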