
A survey of sensor fusion methods in wearable robotics

  • Authors: Novak, Domen; Riener, Robert

Modern wearable robots are not yet intelligent enough to fully satisfy the demands of end users, as they lack the sensor fusion algorithms needed to provide optimal assistance and react quickly to perturbations or changes in user intentions. Sensor fusion applications such as intention detection have been emphasized as a major challenge for both robotic orthoses and prostheses. In order to better examine the strengths and shortcomings of the field, this paper presents a review of existing sensor fusion methods for wearable robots, both stationary ones such as rehabilitation exoskeletons and portable ones such as active prostheses and full-body exoskeletons. Fusion methods are first presented as applied to individual sensing modalities (primarily electromyography, electroencephalography and mechanical sensors), and then four approaches to combining multiple modalities are presented. The strengths and weaknesses of the different methods are compared, and recommendations are made for future sensor fusion research.
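One common family of multimodal fusion methods combines the class posteriors produced by separate per-modality classifiers (e.g. one trained on electromyography features, one on mechanical sensor features). The sketch below is a minimal, hypothetical illustration of such decision-level fusion using a naive product rule; the modality names and class labels are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def fuse_posteriors(posteriors):
    """Decision-level fusion: combine per-modality class posteriors
    with a naive product rule, then renormalize.

    posteriors: list of 1-D arrays, one per modality, each summing to 1.
    Returns the fused posterior over the same classes.
    """
    fused = np.prod(np.asarray(posteriors), axis=0)
    return fused / fused.sum()

# Hypothetical intent classes: [rest, flex, extend]
emg_posterior = np.array([0.2, 0.7, 0.1])   # from an EMG classifier
mech_posterior = np.array([0.3, 0.5, 0.2])  # from joint-sensor classifier

fused = fuse_posteriors([emg_posterior, mech_posterior])
intent = int(np.argmax(fused))  # index 1 -> "flex"
```

The product rule assumes the modalities are conditionally independent given the intent; other combination schemes (weighted sums, learned meta-classifiers) relax that assumption at the cost of extra training data.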

Posted on: October 22, 2014

Optic-Flow Based Control of a 46g Quadrotor

  • Authors: Briod, Adrien; Zufferey, Jean-Christophe; Floreano, Dario

We aim at developing autonomous miniature hovering flying robots capable of navigating in unstructured GPS-denied environments. A major challenge is the miniaturization of the embedded sensors and processors allowing such platforms to fly autonomously. In this paper, we propose a novel ego-motion estimation algorithm for hovering robots equipped with inertial and optic-flow sensors that runs in real time on a microcontroller. Unlike many vision-based methods, this algorithm does not rely on feature tracking, structure estimation, additional distance sensors or assumptions about the environment. Key to this method is the introduction of the translational optic-flow direction constraint (TOFDC), which does not use the optic-flow scale, but only its direction, to correct for inertial sensor drift during changes of direction. This solution requires comparatively simpler electronics and sensors and works in environments of any geometry. We demonstrate the implementation of this algorithm on a miniature 46g quadrotor for closed-loop position control.
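The idea behind a direction-only constraint can be sketched as follows: for a sensor looking along unit direction d, the translational part of the optic flow is proportional to the component of the (negated) velocity perpendicular to d, with the unknown scene distance only scaling its magnitude. Comparing the measured flow direction with the direction predicted from the current velocity estimate therefore yields a distance-free residual that an estimator can use to correct drift. The snippet below is a minimal sketch of that residual under these assumptions; the function name and interface are illustrative, not the paper's implementation:

```python
import numpy as np

def tofdc_residual(v_est, d, flow_meas):
    """Direction-only optic-flow residual (a sketch of the TOFDC idea).

    v_est:     current velocity estimate (3,), sensor frame
    d:         unit viewing direction of the optic-flow sensor (3,)
    flow_meas: measured translational optic-flow vector (3,), lying in
               the plane perpendicular to d; only its direction is used
    """
    # Predicted translational flow ~ -(v - (v.d)d) / D; the unknown
    # distance D scales the magnitude only, so compare unit vectors.
    flow_pred = -(v_est - np.dot(v_est, d) * d)
    pred_dir = flow_pred / np.linalg.norm(flow_pred)
    meas_dir = flow_meas / np.linalg.norm(flow_meas)
    # Zero when the measured and predicted flow directions agree.
    return meas_dir - pred_dir
```

In a filter, this residual would drive a measurement update on the velocity state, while the inertial sensors handle prediction between vision updates; note the residual is undefined when the velocity has no component perpendicular to d, which is why direction changes are what make the constraint informative.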

Posted on: October 16, 2013