Sensors, Vol. 18, Pages 2355: Robot Imitation Learning of Social Gestures with Self-Collision Avoidance Using a 3D Sensor

Research paper by Tan Zhang, Wing-Yue Louie, Goldie Nejat, Beno Benhabib

Indexed on: 23 Jul '18 · Published on: 20 Jul '18 · Published in: Sensors (Basel, Switzerland)



Abstract

To effectively interact with people, social robots need to perceive human behaviors and in turn display their own behaviors using social communication modes such as gestures. The modeling of gestures can be difficult due to the high dimensionality of the robot configuration space. Imitation learning can be used to teach a robot to implement multi-jointed arm gestures by directly observing a human teacher’s arm movements (for example, using a non-contact 3D sensor) and then mapping these movements onto the robot arms. In this paper, we present a novel imitation learning system with robot self-collision awareness and avoidance. The proposed method uses a kinematic approach with bounding volumes to detect and avoid collisions with the robot itself while performing gesticulations. We conducted experiments with a dual-arm social robot and a 3D sensor to evaluate the effectiveness of our imitation system at mimicking gestures while avoiding self-collisions.
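The abstract does not give implementation details, but the bounding-volume idea it mentions can be illustrated with a minimal sketch. Below, each arm link is approximated by a bounding sphere centered at the link midpoint, and a self-collision is flagged when any sphere of one arm overlaps a sphere of the other. The function names, joint layout, and the sphere radius are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

# Hypothetical bounding-sphere model: each link between two consecutive
# joints is approximated by one sphere at the link's midpoint.
# Radii and joint positions below are made-up values for illustration.

def sphere_centers(joint_positions):
    """Midpoints of consecutive joints serve as bounding-sphere centers."""
    pts = np.asarray(joint_positions, dtype=float)
    return (pts[:-1] + pts[1:]) / 2.0

def self_collision(left_joints, right_joints, radius=0.06):
    """True if any bounding sphere of one arm overlaps one of the other arm.

    Two spheres of equal radius r collide when their centers are
    closer than 2*r.
    """
    for c_left in sphere_centers(left_joints):
        for c_right in sphere_centers(right_joints):
            if np.linalg.norm(c_left - c_right) < 2 * radius:
                return True
    return False

# Arms held well apart: no overlap expected.
apart = self_collision([[0, 0, 0], [0, 0.3, 0]],
                       [[1, 0, 0], [1, 0.3, 0]])      # False

# Arms crossing near the torso: overlap expected.
crossing = self_collision([[0, 0, 0], [0.3, 0, 0]],
                          [[0.1, 0.02, 0], [0.35, 0.02, 0]])  # True
```

In a full pipeline, a check like this would run on each candidate joint configuration produced by the human-to-robot motion mapping, and configurations that collide would be rejected or adjusted before execution.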