Robot-Robot Gesturing for Anchoring Representations

Access rights

openAccess

Type

A1 Original article in a scientific journal

Date

2019-02-01

Language

en

Pages

15

Series

IEEE Transactions on Robotics

Abstract

In a multirobot system, using shared symbols for objects in the environment is a prerequisite for collaboration. Sharing symbols requires that each agent has anchored a symbol to an internal, sensor-level representation and that these symbols match between the agents. The problem can be solved easily when the internal representations can be communicated between the agents. However, with heterogeneous embodiments the available sensors are likely to differ, making it impossible to share the internal representations directly. We propose the use of pointing gestures to align symbols between a heterogeneous group of robots. We describe a planning framework that minimizes the effort required for anchoring representations across robots. The framework allows planning for both the gesturing and observing agents in a decentralized fashion. It considers both implicit sources of failure, such as ambiguous pointing, and the costs incurred by actions. Simulation experiments demonstrate that the resulting planning problem has a complex solution structure with multiple local minima. A demonstration with a heterogeneous two-robot system shows the practical viability of the approach.
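To illustrate the kind of trade-off the abstract describes, the sketch below shows one possible way to score candidate plans that pair a pointing gesture (for the gesturing robot) with an observation pose (for the observing robot), balancing action cost against the risk of ambiguous or missed pointing. This is not the authors' framework; all class names, cost values, and failure probabilities are hypothetical placeholders.

# Minimal, purely illustrative sketch of a gesture/viewpoint cost model.
# Not the paper's implementation: the options, costs, and probabilities
# below are made up for demonstration only.

from dataclasses import dataclass
from itertools import product

@dataclass
class GestureOption:
    name: str
    action_cost: float   # e.g. joint motion needed to execute the pointing gesture
    ambiguity: float     # probability the observer resolves the wrong object (0..1)

@dataclass
class ViewOption:
    name: str
    action_cost: float   # e.g. base motion needed to reach the viewpoint
    miss_rate: float     # probability the gesture is not detected at all (0..1)

def expected_cost(g: GestureOption, v: ViewOption, failure_penalty: float) -> float:
    """Sum of action costs plus an expected penalty for anchoring failure."""
    p_fail = 1.0 - (1.0 - g.ambiguity) * (1.0 - v.miss_rate)
    return g.action_cost + v.action_cost + failure_penalty * p_fail

def plan(gestures, views, failure_penalty=10.0):
    """Pick the gesture/viewpoint pair with minimum expected cost."""
    return min(product(gestures, views),
               key=lambda gv: expected_cost(*gv, failure_penalty))

if __name__ == "__main__":
    gestures = [GestureOption("point_close", 1.0, 0.05),
                GestureOption("point_far", 0.2, 0.40)]
    views = [ViewOption("front_view", 0.5, 0.02),
             ViewOption("stay_put", 0.0, 0.25)]
    g, v = plan(gestures, views)
    print(f"chosen plan: {g.name} + {v.name}, "
          f"expected cost {expected_cost(g, v, 10.0):.2f}")

Even in this toy form, the objective mixes discrete choices with competing costs, which hints at why the paper reports a solution structure with multiple local minima.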

Description

openaire: EC/FP7/600825/EU//RECONFIG

Keywords

Cognitive robotics, Grounding, Multi-robot systems, Planning, Robot kinematics, Robot sensing systems, Symbol grounding, Tracking

Citation

Kondaxakis, P, Gulzar, K, Kinauer, S, Kokkinos, I & Kyrki, V 2019, 'Robot-Robot Gesturing for Anchoring Representations', IEEE Transactions on Robotics, vol. 35, no. 1, 8502843, pp. 216-230. https://doi.org/10.1109/TRO.2018.2875388