The development of a Hardware-in-the-Loop test setup for event-based vision near-space space objects
School of Electrical Engineering
Master's thesis
Unless otherwise stated, all rights belong to the author. You may download, display and print this publication for your own personal use. Commercial use is prohibited.
Authors
Date
2023-06-12
Department
Major/Subject
Space Robotics and Automation
Mcode
ELEC3047
Degree programme
Erasmus Mundus Space Master
Language
en
Pages
93+9
Series
Abstract
The purpose of this thesis work was to develop a Hardware-in-the-Loop imaging setup that enables experimenting with an event-based camera and a frame-based camera under simulated space conditions. The generated data sets were used to compare visual navigation algorithms, specifically an event-based and a frame-based feature detection and tracking algorithm. This comparative analysis was used to gain insight into the feasibility of event-based vision near space objects. Event-based cameras differ from frame-based cameras in that, instead of capturing images at a fixed rate, each pixel independently produces an asynchronous stream of events triggered by brightness changes. The setup design is based on a theoretical framework incorporating optical calculations. These calculations indicated that the asteroid model needed to be scaled down by a factor of 3192 to fit inside the camera depth of view, resulting in a scaled Bennu asteroid model with a size of 16.44 centimeters. Three experiments were conducted with the cameras under test to generate data sets. Applying a feature detection and tracking algorithm to both camera data sets revealed that the frame-based camera algorithm outperforms the event-based camera algorithm in the absolute number of tracked features, computation time, and robustness across various scenarios. However, when considering the percentage of tracked features relative to the total number of detected features, the event-based algorithm tracks a significantly higher percentage of features for at least one key frame than the frame-based algorithm. The comparative analysis of the experiments performed in space-simulated conditions during this project showed that the feasibility of an event-based camera using solely events is low compared to the frame-based camera.
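As a rough illustration of the scale-down arithmetic summarized above, the minimal sketch below applies the reported factor of 3192 to a full-scale reference dimension of about 525 m for Bennu; that reference value is an assumption back-computed from the reported figures, not a number stated in the abstract.

```python
# Minimal sketch of the scaling arithmetic summarized in the abstract.
# The full-scale reference dimension (~525 m) is an assumption
# back-computed from the reported scale factor and model size; the
# thesis may use a different reference dimension in its optical
# calculations.

BENNU_REFERENCE_SIZE_M = 524.8   # assumed full-scale dimension [m]
SCALE_FACTOR = 3192              # scale-down factor reported in the abstract

model_size_cm = BENNU_REFERENCE_SIZE_M / SCALE_FACTOR * 100
print(f"Scaled asteroid model size: {model_size_cm:.2f} cm")  # ~16.44 cm
```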
Description
Supervisor
Kallio, Esa
Thesis advisor
Knuuttila, Olli
Keywords
event-based camera, hardware-in-the-loop, space-simulated conditions, near-space objects, asteroid, frame-based camera