Using brain–computer interfaces (BCIs) to give people with locked-in syndrome reliable communication and control capabilities has long been a futuristic trope of medical dramas and sci-fi. A team from NCCR Robotics and CNBI, EPFL has recently published a paper detailing a step towards bringing this technique into everyday life …
25 Jul 2017
Robotics Lab at ETH Zurich – TeleZüri broadcast: Tune into TeleZüri at 18:30 to hear Robert Riener speaking about all things rehabilitation robotics and Cybathlon. http://www.telezueri.ch/64-show-sommertalk
Looking for publications? You may want to search the EPFL Infoscience site, which provides advanced publication search capabilities.
Modern wearable robots are not yet intelligent enough to fully satisfy the demands of end users, as they lack the sensor fusion algorithms needed to provide optimal assistance and react quickly to perturbations or changes in user intentions. Sensor fusion applications such as intention detection have been emphasized as a major challenge for both robotic orthoses and prostheses. In order to better examine the strengths and shortcomings of the field, this paper presents a review of existing sensor fusion methods for wearable robots, both stationary ones such as rehabilitation exoskeletons and portable ones such as active prostheses and full-body exoskeletons. Fusion methods are first presented as applied to individual sensing modalities (primarily electromyography, electroencephalography and mechanical sensors), and then four approaches to combining multiple modalities are presented. The strengths and weaknesses of the different methods are compared, and recommendations are made for future sensor fusion research.
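To make the idea of combining modalities concrete, here is a minimal sketch of late (decision-level) fusion for intention detection. It assumes two hypothetical per-modality classifiers (EMG and mechanical sensing) that each emit per-class probabilities; the class labels, weights, and probability values are illustrative only and are not taken from the paper.

```python
# Weighted-average (late) fusion of per-modality classifier outputs.
# Each modality contributes a probability vector over intention classes;
# the fused vector is their confidence-weighted average.

def fuse_probabilities(modality_outputs, weights):
    """Fuse per-class probability vectors from several modalities."""
    n_classes = len(next(iter(modality_outputs.values())))
    total = sum(weights[m] for m in modality_outputs)
    fused = [0.0] * n_classes
    for modality, probs in modality_outputs.items():
        w = weights[modality] / total  # normalize weights
        for i, p in enumerate(probs):
            fused[i] += w * p
    return fused

def detect_intention(modality_outputs, weights, labels):
    """Return the most probable intention label and the fused distribution."""
    fused = fuse_probabilities(modality_outputs, weights)
    best = max(range(len(fused)), key=fused.__getitem__)
    return labels[best], fused

labels = ["rest", "reach", "grasp"]
outputs = {
    "emg":        [0.2, 0.5, 0.3],  # EMG classifier leans toward "reach"
    "mechanical": [0.1, 0.3, 0.6],  # joint-sensor classifier leans toward "grasp"
}
weights = {"emg": 0.7, "mechanical": 0.3}  # trust EMG slightly more here
intent, fused = detect_intention(outputs, weights, labels)
```

Late fusion is only one of the approaches reviewed; feature-level fusion, in which raw or preprocessed signals are concatenated before a single classifier, trades modularity for potentially richer cross-modal features.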
Several design strategies for rehabilitation robotics have aimed to improve patients’ experiences using motivating and engaging virtual environments. This paper presents a new design strategy: enhancing patient freedom with a complex virtual environment that intelligently detects patients’ intentions and supports the intended actions. A ‘virtual kitchen’ scenario has been developed in which many possible actions can be performed at any time, allowing patients to experiment and giving them more freedom. Remote eye tracking is used to detect the intended action and trigger appropriate support by a rehabilitation robot. This approach requires no additional equipment attached to the patient and has a calibration time of less than a minute. The system was tested on healthy subjects using the ARMin III arm rehabilitation robot. It was found to be technically feasible and usable by healthy subjects. However, the intention detection algorithm should be improved using better sensor fusion, and clinical tests with patients are needed to evaluate the system’s usability and potential therapeutic benefits.
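A common way to turn remote gaze data into an intention trigger is a dwell-time rule: support is triggered once the patient fixates an object for long enough. The sketch below illustrates that idea under stated assumptions; the sample stream format `(timestamp, object_id)`, the 0.8 s threshold, and the object names are hypothetical and not taken from the paper.

```python
# Dwell-time gaze trigger: return the first object fixated continuously
# for at least `threshold` seconds, or None if no fixation is long enough.

DWELL_THRESHOLD = 0.8  # seconds; illustrative value

def detect_gaze_intention(samples, threshold=DWELL_THRESHOLD):
    """Scan (timestamp, object_id) gaze samples for a sustained fixation."""
    current, start = None, None
    for t, obj in samples:
        if obj != current:
            current, start = obj, t  # gaze moved to a new object
        elif obj is not None and t - start >= threshold:
            return obj               # dwell threshold reached
    return None

# Simulated gaze stream: brief glance at a cup, then a long look at a kettle.
samples = [
    (0.0, "cup"), (0.2, "cup"), (0.4, "kettle"),
    (0.6, "kettle"), (0.9, "kettle"), (1.5, "kettle"),
]
intended = detect_gaze_intention(samples)
```

In practice such a trigger would be one input to the sensor fusion the authors call for, combined with arm kinematics or other signals to reduce false triggers from incidental glances.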
Several strategies have been proposed to improve patient motivation and exercise intensity during robot-aided stroke rehabilitation. One relatively unexplored possibility is two-player gameplay, allowing subjects either to compete against each other or to cooperate towards a common goal. In order to explore the potential of such games, we designed a two-player game played using two ARMin arm rehabilitation robots.