Real-time avoidance for Human-Robot Interaction


Physical-HRI-Controller

Reference paper: Compact real-time avoidance on a Humanoid Robot for Human-Robot Interaction [PDF] [BIB]

Authors: Phuong D. H. Nguyen, Matej Hoffmann, Alessandro Roncone, Ugo Pattacini, and Giorgio Metta

Submission: ACM/IEEE International Conference on Human-Robot Interaction (HRI2018), Chicago, IL, USA, March 5-8, 2018

We have developed a new framework for safe human-robot interaction, composed of the following main components (cf. Fig. 1):

  1. a human 2D keypoint estimation pipeline employing a deep-learning-based algorithm, extended here to 3D using disparity;
  2. a distributed peripersonal space representation around the robot’s body parts;
  3. a new reaching controller that incorporates all obstacles entering the robot’s protective safety zone on the fly into the task.

Figure 1. Software architecture for physical human-robot interaction. In this work, we develop a framework composed of: i) a human pose estimation algorithm, ii) a 2D to 3D disparity mapping, iii) a peripersonal space collision predictor, iv) a pHRI robot controller for distributed collision avoidance.
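Step ii) of the pipeline, mapping 2D keypoints to 3D from disparity, follows the standard stereo pinhole model. The sketch below is a minimal illustration of that idea; the function name and camera parameters are hypothetical, not taken from the paper's implementation.

```python
def keypoint_to_3d(u, v, disparity, fx, fy, cx, cy, baseline):
    """Back-project a 2D keypoint (u, v) into camera-frame 3D coordinates.

    disparity: horizontal pixel shift of the keypoint between the left
    and right images; fx, fy, cx, cy: pinhole intrinsics (pixels);
    baseline: distance between the stereo cameras (meters).
    """
    if disparity <= 0:
        raise ValueError("keypoint not matched across the stereo pair")
    z = fx * baseline / disparity   # depth from disparity
    x = (u - cx) * z / fx           # back-project along the image x axis
    y = (v - cy) * z / fy           # back-project along the image y axis
    return x, y, z
```

A keypoint at the principal point, for instance, maps to a 3D point straight ahead of the camera, with depth inversely proportional to its disparity.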

Here’s a video of the controller in action:

Controller in action. The controller can seamlessly process both tactile (physical contact) and visual (prior-to-contact) information in order to avoid obstacles while still optimizing its trajectory to accomplish the primary reaching task.

The main novelty lies in the formation of the protective safety margin around the robot’s body parts—in a distributed fashion and adhering closely to the robot structure—and its use in a reaching controller that dynamically incorporates threats in its peripersonal space into the task. The framework is tested in real experiments that demonstrate the effectiveness of this approach in protecting both human and robot against collisions during the interaction. Our solution is compact, self-contained (on-board stereo cameras in the robot’s head being the only sensor), and flexible, as different modulations of the defensive peripersonal space are possible—here we demonstrate stronger avoidance of the human head compared to the rest of the body.
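The distributed safety margin can be pictured as a distance-triggered repulsive field around each body part, with a stronger response for the head. The snippet below is a deliberately simplified sketch of that intuition only; all names, thresholds, and gains are hypothetical, and the actual controller in the paper resolves reaching and avoidance jointly rather than summing a separate repulsive velocity.

```python
import math

def avoidance_velocity(part_pos, obstacle_pos, d_safe=0.3, gain=0.5,
                       is_head=False):
    """Repulsive Cartesian velocity for one body part (illustrative only).

    Returns the zero vector when the obstacle lies outside the protective
    zone of radius d_safe (meters); otherwise pushes the part away from
    the obstacle, scaling linearly with penetration depth.
    """
    vec = [p - o for p, o in zip(part_pos, obstacle_pos)]
    d = math.sqrt(sum(c * c for c in vec))
    if d >= d_safe or d == 0.0:
        return [0.0, 0.0, 0.0]
    activation = 1.0 - d / d_safe            # 0 at the boundary, 1 at contact
    g = gain * (2.0 if is_head else 1.0)     # stronger avoidance for the head
    return [g * activation * c / d for c in vec]
```

With this kind of modulation, the same obstacle at the same distance produces twice the repulsive speed when the threatened part is the head, mirroring the differentiated peripersonal-space behavior demonstrated in the experiments.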

About


Personal website of Alessandro Roncone, also known as alecive, Ph.D. in robotics engineering, computer scientist, interaction designer, father and runner. Since Fall 2018, he has been Assistant Professor of Robotics in the CS Department at the University of Colorado Boulder, where he directs the Human Interaction and RObotics [HIRO] Group.

Alessandro has more than fifteen years' research experience in robotics. He worked full time with the iCub, one of the most advanced humanoid robots, on machine perception and artificial intelligence. His mastery of C++ and the YARP/ROS software architectures has been employed for research in human-robot interaction, kinematics, tactile perception and robot control.

I am looking for students! [LINK]