Real-Time Human Head Imitation for Humanoid Robots
English
Cazzato, Dario [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)]
Cimarelli, Claudio [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)]
Sanchez Lopez, Jose Luis [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)]
Olivares Mendez, Miguel Angel [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)]
Voos, Holger [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Engineering Research Unit]
Jul-2019
Proceedings of the 2019 3rd International Conference on Artificial Intelligence and Virtual Reality
65--69
Yes
3rd International Conference on Artificial Intelligence and Virtual Reality
from 27-07-2019 to 29-07-2019
[en] The ability of robots to imitate human movements has been an active research topic since the dawn of robotics. Obtaining a realistic imitation is essential for the perceived quality of human-robot interaction, but it remains a challenge due to the lack of an effective mapping between human movements and the degrees of freedom of robotic systems. While high-level programming interfaces, software, and simulation tools have simplified robot programming, there is still a strong gap between robot control and natural user interfaces. In this paper, a system that reproduces on a robot the head movements of a user in the field of view of a consumer camera is presented. The system detects the presence of a user and estimates their head pose in real time using a deep neural network, extracting the head orientation angles and commanding the robot's head movements accordingly to obtain a realistic imitation. At the same time, the system serves as a natural user interface for controlling the Aldebaran NAO and Pepper humanoid robots with head movements, with applications in human-robot interaction.
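To make the described pipeline concrete, the following is a minimal Python sketch of an imitation loop of this kind: a frame is grabbed from a consumer camera, a head-pose estimator returns yaw and pitch angles, and the angles are sent to a NAO/Pepper head through the NAOqi motion API. It is not the authors' implementation: estimate_head_pose() is a hypothetical placeholder for the paper's deep neural network, the robot address is illustrative, and the joint limits are approximate values for NAO's head.

    # Minimal sketch, not the paper's code. estimate_head_pose() stands in
    # for the deep-network pose estimator; ROBOT_IP/PORT are illustrative.
    import cv2
    from naoqi import ALProxy

    ROBOT_IP, ROBOT_PORT = "192.168.1.10", 9559   # hypothetical robot address

    def clamp(value, lo, hi):
        return max(lo, min(hi, value))

    def estimate_head_pose(frame):
        """Hypothetical placeholder: returns (yaw, pitch) of the user's head
        in radians, or None when no user is visible in the frame."""
        raise NotImplementedError

    def imitation_loop():
        motion = ALProxy("ALMotion", ROBOT_IP, ROBOT_PORT)
        motion.setStiffnesses("Head", 1.0)        # enable the head actuators
        cap = cv2.VideoCapture(0)                 # consumer camera
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            pose = estimate_head_pose(frame)
            if pose is None:
                continue                          # no user in the field of view
            yaw, pitch = pose
            # Keep commands within NAO's approximate head joint limits.
            yaw = clamp(yaw, -2.08, 2.08)
            pitch = clamp(pitch, -0.67, 0.51)
            # Non-blocking joint command; the speed fraction keeps motion smooth.
            motion.setAngles(["HeadYaw", "HeadPitch"], [yaw, pitch], 0.2)

The same loop works for Pepper by adjusting the pitch limits to its head joint range; the speed fraction passed to setAngles trades responsiveness against jitter in the imitation.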