Robot-Robot Gesturing for Anchoring Representations

dc.contributor: Aalto-yliopisto [fi]
dc.contributor: Aalto University [en]
dc.contributor.author: Kondaxakis, Polychronis [en_US]
dc.contributor.author: Gulzar, Khurram [en_US]
dc.contributor.author: Kinauer, Stefan [en_US]
dc.contributor.author: Kokkinos, Iasonas [en_US]
dc.contributor.author: Kyrki, Ville [en_US]
dc.contributor.department: Department of Electrical Engineering and Automation [en]
dc.contributor.groupauthor: Intelligent Robotics [en]
dc.contributor.organization: Blue Ocean Robotics [en_US]
dc.contributor.organization: Université Paris-Saclay [en_US]
dc.contributor.organization: University College London [en_US]
dc.date.accessioned: 2018-12-10T10:26:30Z
dc.date.available: 2018-12-10T10:26:30Z
dc.date.issued: 2019-02-01 [en_US]
dc.description: | openaire: EC/FP7/600825/EU//RECONFIG
dc.description.abstract: In a multirobot system, using shared symbols for objects in the environment is a prerequisite for collaboration. Sharing symbols requires that each agent has anchored a symbol with an internal, sensor-level representation, and that these symbols match between the agents. The problem can be solved easily when the internal representations can be communicated between the agents. However, with heterogeneous embodiments the available sensors are likely to differ, making it impossible to share the internal representations directly. We propose the use of pointing gestures to align symbols across a heterogeneous group of robots. We describe a planning framework that minimizes the effort required for anchoring representations across robots. The framework allows planning for both the gesturing and observing agents in a decentralized fashion. It considers both implicit sources of failure, such as ambiguous pointing, and the costs incurred by actions. Simulation experiments demonstrate that the resulting planning problem has a complex solution structure with multiple local minima. A demonstration with a heterogeneous two-robot system shows the practical viability of the approach. [en]
dc.description.version: Peer reviewed [en]
dc.format.extent: 15
dc.format.mimetype: application/pdf [en_US]
dc.identifier.citation: Kondaxakis, P, Gulzar, K, Kinauer, S, Kokkinos, I & Kyrki, V 2019, 'Robot-Robot Gesturing for Anchoring Representations', IEEE Transactions on Robotics, vol. 35, no. 1, 8502843, pp. 216-230. https://doi.org/10.1109/TRO.2018.2875388 [en]
dc.identifier.doi: 10.1109/TRO.2018.2875388 [en_US]
dc.identifier.issn: 1552-3098
dc.identifier.other: PURE UUID: acc0344a-a078-4eff-b31a-2167aa084263 [en_US]
dc.identifier.other: PURE ITEMURL: https://research.aalto.fi/en/publications/acc0344a-a078-4eff-b31a-2167aa084263 [en_US]
dc.identifier.other: PURE LINK: http://www.scopus.com/inward/record.url?scp=85055712092&partnerID=8YFLogxK [en_US]
dc.identifier.other: PURE FILEURL: https://research.aalto.fi/files/30097569/ELEC_Kondaxakis_etal_Robot_Robot_Gesturing_IEEETraRob_2018.pdf [en_US]
dc.identifier.uri: https://aaltodoc.aalto.fi/handle/123456789/35204
dc.identifier.urn: URN:NBN:fi:aalto-201812106219
dc.language.iso: en [en]
dc.relation: info:eu-repo/grantAgreement/EC/FP7/600825/EU//RECONFIG [en_US]
dc.relation.ispartofseries: IEEE Transactions on Robotics [en]
dc.rights: openAccess [en]
dc.subject.keyword: Cognitive robotics [en_US]
dc.subject.keyword: Grounding [en_US]
dc.subject.keyword: Multi-robot systems [en_US]
dc.subject.keyword: multi-robot systems [en_US]
dc.subject.keyword: Planning [en_US]
dc.subject.keyword: Robot kinematics [en_US]
dc.subject.keyword: Robot sensing systems [en_US]
dc.subject.keyword: symbol grounding [en_US]
dc.subject.keyword: Tracking [en_US]
dc.title: Robot-Robot Gesturing for Anchoring Representations [en]
dc.type: A1 Alkuperäisartikkeli tieteellisessä aikakauslehdessä (A1 Original article in a scientific journal) [fi]
dc.type.version: acceptedVersion
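
The abstract describes decentralized planning in which a gesturing robot trades off the cost of its own actions against implicit sources of failure such as ambiguous pointing. Purely as an illustrative sketch of that expected-cost idea, and not the paper's actual framework, the snippet below scores hypothetical candidate poses for a gesturing robot by combining a motion cost with an invented angular-ambiguity risk; all object names, poses, weights, and the risk model are assumptions made for illustration.

```python
"""Illustrative sketch only: expected-cost pose selection for a pointing gesture.

Everything here (world layout, candidate poses, the exp-decay ambiguity model,
the penalty weight) is hypothetical and not taken from the paper.
"""
import math

# Hypothetical 2-D world: object id -> (x, y) position.
OBJECTS = {"cup": (1.0, 0.5), "box": (1.2, 0.6), "ball": (3.0, 2.0)}

# Candidate base poses the gesturing robot could move to before pointing.
CANDIDATE_POSES = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]


def motion_cost(pose):
    """Cost for the gesturing robot to reach a candidate pose (distance from origin)."""
    return math.hypot(pose[0], pose[1])


def ambiguity_risk(pose, target):
    """Hypothetical risk that the observer resolves the gesture to the wrong object.

    A pointing ray from `pose` toward `target` is treated as confusable with a
    distractor when the two bearings, seen from `pose`, are close together.
    """
    tx, ty = OBJECTS[target]
    theta_t = math.atan2(ty - pose[1], tx - pose[0])
    risk = 0.0
    for name, (ox, oy) in OBJECTS.items():
        if name == target:
            continue
        theta_o = math.atan2(oy - pose[1], ox - pose[0])
        # Wrap the bearing difference to [-pi, pi] before taking its magnitude.
        sep = abs(math.atan2(math.sin(theta_t - theta_o), math.cos(theta_t - theta_o)))
        risk = max(risk, math.exp(-5.0 * sep))  # close bearings -> high confusion risk
    return risk


def plan_pointing(target, ambiguity_penalty=10.0):
    """Pick the pose minimizing motion cost plus the expected cost of misinterpretation."""
    return min(
        CANDIDATE_POSES,
        key=lambda p: motion_cost(p) + ambiguity_penalty * ambiguity_risk(p, target),
    )


if __name__ == "__main__":
    for target in OBJECTS:
        pose = plan_pointing(target)
        print(f"point at {target!r} from pose {pose}")
```

The max-over-distractors risk term here merely stands in for whatever interpretation model the observing robot would actually use; in the paper's setting both the gesturing and the observing agents plan, which this single-agent sketch does not attempt to reproduce.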

Files