Browsing by Author "Kondaxakis, Polychronis"
Now showing 1 - 2 of 2
- Interface Design for Physical Human-Robot Interaction using sensorless control
School of Electrical Engineering | (2013-10-24) Dabrowski, Jacek

The rapid increase in the use of robots has made human–robot interaction a crucial field of research, and physical human–robot interaction in particular constitutes a relevant and growing research area. Robots are now used in almost all areas of life, such as in households, education and medicine. Therefore, many studies investigate ergonomic human–robot interfaces that enable people to communicate with, collaborate with and teach a robot through physical interaction.

This thesis focuses on developing a physical human–robot interface through which the user can control a walking humanoid by exerting force. Through physical contact with the robot arm, a human can influence the direction and velocity of the robot's walk. In other words, the user leads the humanoid by the hand, and the robot compensates for this external force by following the user.

The developed interface offers a method of sensorless force control. Instead of the traditional approach based on force/torque measurement, it exploits the fact that a DC motor's torque is proportional to its armature current. Two different control algorithms were implemented and compared. Finally, a usability test was conducted on the different interfaces to identify the most ergonomic one.
- Robot-Robot Gesturing for Anchoring Representations
A1 Original article in a scientific journal (2019-02-01) Kondaxakis, Polychronis; Gulzar, Khurram; Kinauer, Stefan; Kokkinos, Iasonas; Kyrki, Ville

In a multirobot system, using shared symbols for objects in the environment is a prerequisite for collaboration. Sharing symbols requires that each agent has anchored a symbol to an internal, sensor-level representation, and that these symbols match between the agents. The problem is easily solved when the internal representations can be communicated between the agents. However, with heterogeneous embodiments the available sensors are likely to differ, making it impossible to share the internal representations directly. We propose the use of pointing gestures to align symbols between a heterogeneous group of robots. We describe a planning framework that minimizes the effort required to anchor representations across robots. The framework allows planning for both the gesturing and observing agents in a decentralized fashion, and it considers both implicit sources of failure, such as ambiguous pointing, and the costs incurred by actions. Simulation experiments demonstrate that the resulting planning problem has a complex solution structure with multiple local minima. A demonstration with a heterogeneous two-robot system shows the practical viability of the approach.
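The trade-off described in the second abstract — weighing the cost of executing a gesture against the risk of ambiguous pointing — can be illustrated with a minimal sketch. This is not the paper's planning framework; the `Gesture` fields, `MISANCHOR_PENALTY`, and the single-step expected-cost rule are all hypothetical simplifications for illustration.

```python
from dataclasses import dataclass


@dataclass
class Gesture:
    """A candidate pointing gesture (hypothetical model)."""
    name: str
    action_cost: float  # effort to execute the gesture (assumed units)
    ambiguity: float    # probability the observer anchors the wrong object


# Assumed cost of anchoring the wrong symbol (illustrative value).
MISANCHOR_PENALTY = 10.0


def expected_cost(g: Gesture) -> float:
    """Action cost plus the expected penalty of a mis-anchored symbol."""
    return g.action_cost + g.ambiguity * MISANCHOR_PENALTY


def best_gesture(gestures: list[Gesture]) -> Gesture:
    """Pick the gesture minimizing expected cost over one pointing step."""
    return min(gestures, key=expected_cost)
```

For example, a cheap but ambiguous point from afar (`action_cost=1.0`, `ambiguity=0.5`) loses to a costlier but unambiguous close-range point (`action_cost=3.0`, `ambiguity=0.05`), mirroring the idea that implicit failure sources must be weighed alongside action costs.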