Gesture Planning and Execution for Anchoring between Multi-Embodiment Robots in Decentralized Settings
School of Electrical Engineering | Doctoral thesis (monograph) | Defence date: 2020-11-13
Unless otherwise stated, all rights belong to the author. You may download, display and print this publication for your own personal use. Commercial use is prohibited.
Date
2020
Language
en
Pages
116
Series
Aalto University publication series DOCTORAL DISSERTATIONS, 169/2020
Abstract
In robotics, competitiveness and cost-effectiveness have pushed manufacturers toward specialized, less expensive robots. These specialized robots will eventually become part of multi-robot systems and will need to coordinate with each other to handle cooperative tasks. In a multi-robot system, the diversity and heterogeneity of robotic agents improve capability, performance, autonomy, ease of use, and cost-effectiveness. Individually, however, these heterogeneous robots have limitations in communication, sensing, and knowledge sharing, which makes cooperation and coordination among them challenging. The problem of collaboration is easily solved when internal representations can be communicated between agents. For heterogeneous robots, however, the available sensors are likely to differ, making it impossible to share internal representations directly. In such a multi-robot system, shared symbols for objects in the environment are a prerequisite for collaboration. Sharing symbols requires both that each agent has anchored a symbol to an internal, sensor-level representation and that these symbols match between agents. Humans use gestures, such as pointing, extensively to anchor linguistic expressions to objects in the physical world. Similarly, gestures can be valuable in decentralized robotic systems, allowing communication between agents and the transfer of symbolic meanings. Pointing gestures are especially valuable in crowded scenes where multiple possible matches are present. However, pointing in crowded scenes can itself remain ambiguous if the pointing direction is not carefully chosen. We propose the use of pointing gestures to align symbols between a heterogeneous group of robots. In this thesis, a probabilistic model of pointing and gesture detection accuracy is proposed.
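A probabilistic pointing-accuracy model of this kind can be illustrated with a toy sketch. Here the angular pointing error is assumed Gaussian, and a pointing attempt is counted as an error when the noise pushes the observed pointing ray closer to a distractor than to the intended target; all function names and the noise model are illustrative assumptions, not the thesis's actual formulation:

```python
import math

def pointing_error_prob(target_angle, distractor_angles, sigma):
    """Toy model: probability that Gaussian angular noise (std sigma,
    radians) makes the observed pointing ray land closer to some
    distractor than to the intended target. Approximated as the
    two-sided Gaussian tail beyond half the angular gap to the
    nearest distractor."""
    half_gap = min(abs(d - target_angle) for d in distractor_angles) / 2.0
    # erfc gives the two-sided tail mass beyond the decision boundary.
    return math.erfc(half_gap / (sigma * math.sqrt(2)))
```

As expected, the sketch assigns a much lower error probability when the nearest distractor is far from the target than when the scene is crowded, matching the qualitative behaviour the abstract describes.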
The model allows the planning of optimal pointing actions by minimizing the probability of pointing errors arising from ambiguity and limited accuracy. We also describe how to measure the accuracy of an agent's pointing gestures and how to calibrate the model for that agent. Experimental results suggest that the proposed model captures the qualitative behaviour of pointing success well. We then describe a planning framework that minimizes the effort required for anchoring representations across robots. The framework allows planning for both the gesturing and the observing agent in a decentralized fashion. It considers both implicit sources of failure, such as ambiguous pointing, and the costs incurred by actions. Simulation experiments show that the resulting planning problem has a complex solution structure with multiple local minima. Furthermore, a real-life heterogeneous two-robot system demonstrates the practical viability of the approach.
Description
13.11.2020 14:00 – 17:00 via Zoom: https://aalto.zoom.us/j/68188519899
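The trade-off the abstract describes, balancing the cost of a gesturing action against the probability of an ambiguous pointing gesture, can be sketched as a minimal planner. The Gaussian noise model, the additive cost objective, and all names here are illustrative assumptions rather than the thesis's actual framework:

```python
import math

def error_prob(angular_gap, sigma):
    """Toy misidentification probability under Gaussian pointing noise:
    the tail mass beyond half the angular gap (radians) between the
    target and the nearest distractor."""
    return math.erfc(angular_gap / 2.0 / (sigma * math.sqrt(2)))

def plan_gesture(poses, sigma, weight=1.0):
    """Choose the gesturing pose minimizing movement cost plus the
    weighted probability of a pointing error. Each pose is a dict
    with illustrative keys: 'cost' (effort to reach the pose) and
    'gap' (angular separation of target and nearest distractor as
    seen by the observer from that pose)."""
    return min(poses,
               key=lambda p: p["cost"] + weight * error_prob(p["gap"], sigma))
```

With a nonzero weight on pointing errors, the planner can prefer a more expensive pose that disambiguates the target over a cheap pose that points into a cluttered region, which is the kind of non-trivial solution structure the simulation experiments report.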
Supervising professor
Kyrki, Ville, Prof., Aalto University, Department of Electrical Engineering and Automation, Finland
Keywords
anchoring representations, gesture communication, decentralized systems, heterogeneous robots, multi-robot coordination, gesture planning and execution