Student Projects

List of student projects for Spring 2019


Algorithmic Human-Robot Interaction [a-HRI] :robot:

Human-Robot Collaboration [HRC] is concerned with designing algorithms and systems that allow humans and robots to work together. At its highest level of abstraction, the goal is to maximize the efficiency of the human-robot team, so that our robots become better collaborators and workers. Crucial to this is the development of interfaces that are as natural and transparent to the user as possible, in order to lower the barrier to entry for humans to take advantage of the human-robot team. This requires scientists to draw upon research from other disciplines (Human-Computer Interaction, Neuroscience, Psychology, Cognitive Science, Natural Language Processing, Machine Learning, Computer Vision) in order to develop systems that can detect what the human partner is doing, predict what she will do next, and efficiently exchange information with her.

Below is a non-comprehensive list of projects in this area. If you are interested in any of them, please send me an email! Feel free to integrate and/or expand on this list with your own ideas and interests.

  • Language enabled agents
  • Human detection and sensing
  • Augmented/virtual reality
  • Reinforcement learning for HRC
  • Cognitive systems to transfer knowledge to and from the human (humans teaching robots, robots teaching humans)
  • High-level transfer learning and self-adaptation to unforeseen circumstances

Physical Human-Robot Interaction [p-HRI] :robot:

One of the current limitations of even the most advanced robots is that their capabilities are restricted to very simple actions (e.g., pick-and-place). A new generation of robot controllers and motion planners is needed for robots to become more capable and useful in most of the domains in which they are currently employed (factories, search and rescue) and in those where they will be employed in the near future (hospitals, households, roads). In particular, robots should perform better when they are co-located with humans, and should not shy away from engaging in close, elbow-to-elbow collaboration with them.

To date, robot controllers largely concentrate on the end-effector as the only part that enters into physical contact with the environment. The rest of the body is typically represented as a kinematic chain, with the volume and surface of the body itself rarely taken into account. Sensing is dominated by “distal” sensors, such as cameras, whereas the body surface is “numb”. As a consequence, reaching in cluttered, unstructured environments poses severe problems, because the robot is largely unaware of the full occupancy of its body; this limits the safety of both the robot and its surroundings, and is one of the bottlenecks that prevent robots from working alongside human partners. [Roncone et al. 2015]

We think we can do better. If we equip robots with enough information about their surroundings, we can improve their ability to interact with the outside world and with humans. In particular, artificial tactile systems will be a key enabler of these technologies.

Below is a non-comprehensive list of projects in this area. If you are interested in any of them, please send me an email! Feel free to integrate and/or expand on this list with your own ideas and interests.

  • Development of Artificial Tactile Sensors
  • Distributed robot controllers
  • Reaching in clutter
  • Self-supervised learning of the body model (i.e., the robot's kinematics)
  • Exploring tool use and object affordances
  • Immersive, hands-free tele-operation


Personal website of Alessandro Roncone, also known as alecive: Ph.D. in robotics engineering, computer scientist, interaction designer, father, and runner. Since Fall 2018, he has been an Assistant Professor in the CS Department at the University of Colorado Boulder.

Alessandro has more than nine years of research experience in robotics. He worked full time with the iCub, one of the most advanced humanoid robots, on machine perception and artificial intelligence. He has applied his mastery of C++ and the YARP/ROS software architectures to research in human-robot interaction, kinematics, tactile perception, and robot control. He is looking for students! [LINK]