This paper describes a brain-machine interface for the online control of a powered lower-limb exoskeleton based on electroencephalogram (EEG) signals recorded over the user's sensorimotor cortical areas. We train a binary decoder that can distinguish two different mental states and apply it in a cascaded manner to efficiently control the exoskeleton in three different directions: walk front, turn left, and turn right. This is realized by first classifying the user's intention to walk front or to change direction. If the user decides to change direction, a subsequent classification decides between turning left and turning right. The user's mental command is executed conditionally, taking into account the possibility of obstacle collision. All five subjects successfully completed the 3-way navigation task using brain signals while mounted in the exoskeleton. We observed an average 10.2% decrease in overall task completion time compared to the baseline protocol.
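The cascaded decision scheme described above can be sketched in code. The following is a minimal, hypothetical illustration, not the authors' implementation: the classifier stubs, function names, and label strings are all assumptions standing in for the paper's EEG decoders and obstacle check.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical container for the two binary classifiers. In the paper these
# decode mental states from sensorimotor EEG features; here they are stubs.
@dataclass
class CascadedDecoder:
    front_vs_turn: Callable[[List[float]], str]   # returns "front" or "turn"
    left_vs_right: Callable[[List[float]], str]   # returns "left" or "right"

def decide_command(decoder: CascadedDecoder,
                   eeg_features: List[float],
                   collision_risk: Callable[[str], bool]) -> str:
    """Cascaded use of binary classifications for 3-way exoskeleton control.

    Stage 1: walk front vs. change direction.
    Stage 2 (only if changing direction): turn left vs. turn right.
    The decoded command is executed conditionally on obstacle collision risk.
    """
    if decoder.front_vs_turn(eeg_features) == "front":
        command = "walk_front"
    elif decoder.left_vs_right(eeg_features) == "left":
        command = "turn_left"
    else:
        command = "turn_right"
    # Conditional execution: veto the command if it risks a collision.
    return command if not collision_risk(command) else "stop"

# Toy usage with stub classifiers that always decode "turn" then "left":
decoder = CascadedDecoder(
    front_vs_turn=lambda f: "turn",
    left_vs_right=lambda f: "left",
)
print(decide_command(decoder, [0.1, 0.2], collision_risk=lambda c: False))
```

Note that only binary classifications are ever performed; the three-way choice emerges from chaining them, which is what makes the control efficient for the user.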