Have you ever dreamed of flying? The Symbiotic Drone Activity is a project that aims to give you the sensation of flying while controlling a real drone. The goal of…
While technology has become essential in most jobs, our society needs to move towards better education in technology in general and robotics in particular, …
NCCR Robotics supports and promotes seminars and talks by invited speakers in the partner institutions. RI Seminar: Davide Scaramuzza: Micro…
Looking for publications? You might want to consider searching on the EPFL Infoscience site, which provides advanced publication search capabilities.
Robot Learning from Demonstration (RLfD) has been identified as a key element for making robots useful in daily life. A wide range of techniques has been proposed for deriving a task model from a set of demonstrations of the task. Most previous works use learning to model the kinematics of the task, and for autonomous execution the robot then relies on a stiff position controller. While many tasks can and have been learned this way, there are tasks in which controlling the position alone is insufficient to achieve the goals of the task. These are typically tasks that involve contact or require a specific response to physical perturbations. The question of how to adjust the compliance to suit the needs of the task has not yet been fully treated in Robot Learning from Demonstration. In this paper, we address this issue and present interfaces that allow a human teacher to indicate compliance variations by physically interacting with the robot during task execution. We validate our approach in two different experiments on the 7-DoF Barrett WAM and KUKA LWR robot manipulators. Furthermore, we conduct a user study to evaluate the usability of our approach from a non-roboticist's perspective.
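To make the idea of teaching compliance by physical interaction concrete, here is a minimal sketch of one plausible realization: the magnitude of the teacher's interaction force along the demonstrated trajectory is mapped to a stiffness profile (strong physical corrections imply the robot should be compliant there), which is then used in a simple impedance control law. The function names, the exponential force-to-stiffness mapping, and the critical-damping choice are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def learn_stiffness_profile(interaction_forces, k_max=500.0, k_min=50.0, alpha=0.05):
    """Map per-step teacher interaction force magnitudes to a stiffness profile.

    Assumption (not from the paper): stronger physical interaction during the
    demonstration indicates the robot should be more compliant at that point,
    modeled here by an exponential decay from k_max toward k_min.
    """
    f = np.abs(np.asarray(interaction_forces, dtype=float))
    # Zero force keeps stiffness at k_max; large forces approach k_min.
    return k_min + (k_max - k_min) * np.exp(-alpha * f)

def impedance_command(k, x_des, x, xdot=0.0, d=None):
    """Simple impedance law: F = k * (x_des - x) - d * xdot.

    If no damping is given, use critical damping for a unit mass
    (d = 2 * sqrt(k)) -- again an illustrative assumption.
    """
    if d is None:
        d = 2.0 * np.sqrt(k)
    return k * (x_des - x) - d * xdot

# Usage: no teacher force -> stiff tracking; large force -> compliant.
k_stiff = learn_stiffness_profile([0.0])[0]       # stays at k_max
k_soft = learn_stiffness_profile([200.0])[0]      # decays toward k_min
force = impedance_command(k_stiff, x_des=1.0, x=0.0)
```

During autonomous execution, the learned stiffness profile would be replayed alongside the kinematic trajectory, so the robot yields exactly where the teacher indicated compliance was needed.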