Motion detection and classification: ultra-fast road user detection
A1 Original article in a scientific journal
Journal of Big Data, Volume 9, issue 1
Abstract
With the emergence of intelligent and connected transportation systems, driver perception and on-board safety systems could be extended with roadside camera units. Computer vision can be utilised to detect road users, conveying their presence to vehicles that cannot perceive them. However, accurate object detection algorithms are typically computationally heavy, depending on delay-prone cloud computation or expensive local hardware. Similar problems are faced in many intelligent transportation applications in which road users are detected with a roadside camera. We propose utilising Motion Detection and Classification (MoDeCla) for road user detection. The approach is computationally lightweight and capable of running in real time on an inexpensive single-board computer. To validate the applicability of MoDeCla in intelligent transportation applications, a detection benchmark was carried out on manually labelled data gathered from surveillance cameras overseeing urban areas in Espoo, Finland. Separate datasets were gathered during winter and summer, enabling comparison of the detectors in significantly different weather conditions. Compared with state-of-the-art object detectors, MoDeCla performed detection an order of magnitude faster yet achieved similar accuracy. The most impactful deficiency of MoDeCla was errors in bounding box placement: car headlights and long dark shadows proved especially difficult for motion detection, causing incorrect bounding boxes. Future improvements are also required for separately detecting overlapping road users.
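The abstract describes a two-stage pipeline: background subtraction proposes moving regions, and only those regions are passed to a classifier, which keeps the computation light. A minimal sketch of that first stage, using a simple running-average background model and NumPy (the actual MoDeCla implementation and its parameters are not given here; `alpha` and `thresh` are illustrative assumptions):

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Exponential running-average background model.
    alpha is an assumed learning rate, not the paper's value."""
    return (1 - alpha) * bg + alpha * frame

def motion_bbox(bg, frame, thresh=25):
    """Threshold |frame - bg| to get a motion mask; return the
    bounding box (x_min, y_min, x_max, y_max) of moving pixels,
    or None if nothing moves. This box would then be cropped and
    sent to a lightweight classifier."""
    mask = np.abs(frame.astype(float) - bg.astype(float)) > thresh
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# Toy example: a static dark background and a bright moving object.
bg = np.zeros((64, 64))
frame = bg.copy()
frame[10:20, 30:40] = 255  # "road user" occupies rows 10-19, cols 30-39
print(motion_bbox(bg, frame))  # (30, 10, 39, 19)
```

The headlight and shadow failure modes mentioned in the abstract follow directly from this formulation: any large luminance change relative to the background, moving object or not, raises the mask and distorts the box.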
Keywords: Background subtraction, Convolutional neural networks, Intelligent transportation systems, Motion detection, Object detection, Winter conditions
Ojala, R., Vepsäläinen, J. & Tammi, K. 2022, 'Motion detection and classification: ultra-fast road user detection', Journal of Big Data, vol. 9, no. 1, 28. https://doi.org/10.1186/s40537-022-00581-8