NCCR Robotics is a consortium of robotics laboratories across Switzerland, working on robots to improve quality of life and to strengthen robotics in Switzerland and worldwide.
Have you ever dreamed of flying? The Symbiotic Drone Activity is a project that aims to give you the sensation of flying while controlling a real drone. The goal of…
Our partner institutions currently offer two courses with a strong focus on robotics at Master's level, although it is worth noting that students with a wide variety of backgrounds…
Intelligent Robots for Improving the Quality of Life
The National Centre of Competence in Research (NCCR) Robotics is a Swiss nationwide organisation funded by the Swiss National Science Foundation…
NCCR Robotics publishes open source software and datasets; please see below for a list and links to where they can be downloaded.
RoboGen™ is an open source platform…
For a summary of our activities, please download our info pack. Need more information? Contact our Communications Officer:
The RSS 2018 Tutorial on Dynamical System-based Learning from Demonstration, co-organised by Aude Billard, NCCR Robotics PI, will take place on June 29th at Carnegie Mellon University, Pittsburgh. More information here: https://epfl-lasa.github.io/TutorialRSS2018.io/
The date of CYBATHLON 2020 is fixed! On 2–3 May 2020 the gates will open for the continuation of the CYBATHLON at the SWISS Arena in Kloten near Zurich. Prepare yourself for an arena charged with passion and an emotion-filled audience that is inspired by the exciting races and challenging tasks in the six …
NCCR drones can now be effortlessly controlled with pointing gestures. A video demonstration of the system developed by IDSIA was presented at the Human-Robot Interaction conference (HRI 2018), March 5–8, 2018, Chicago, IL, USA. More info: http://people.idsia.ch/~gromov/hri-landing/
Recent advances in soft robotics have seen the development of soft pneumatic actuators (SPAs) to ensure that all parts of the robot are soft, including the functional parts. These SPAs have traditionally used increased pressure in parts of the actuator to initiate movement, but today a team from NCCR Robotics and RRL, EPFL publish a …
Using Brain-Computer Interfaces (BCI) to give people with locked-in syndrome back reliable communication and control capabilities has long been a futuristic trope of medical dramas and sci-fi. A team from NCCR Robotics and CNBI, EPFL have recently published a paper detailing work as a step towards taking this technique into everyday lives …
When training to regain movement after stroke or spinal cord injury (SCI), patients must once again learn how to keep their balance during walking movements. Current clinical methods involve supporting the weight of the patient during movement, which sets the body off balance and means that when patients are ready to begin to walk without mechanical …
29 Oct – 31 Oct 2018
10:00 am – 6:00 pm
Conference on Robot Learning (CoRL 2018)
CoRL 2018 will take place on October 29–31, 2018 in Zurich. The conference focuses on the intersection of robotics and machine learning. CoRL aims at being a selective, top-tier venue...
15 Jun – 16 Jun 2017
Building Bodies for Brains & Brains for Bodies & 3rd Japan-EU Workshop on Neurorobotics
Registration for both events is now open.
4 Nov 2016
4:15 pm – 5:15 pm
Talk: Designing and Controlling Robots for Direct Interaction with Humans by Prof. Alin Albu-Schaeffer, German Aerospace Center, Germany.
ETH Zurich, HG G3, Zurich
9 Oct – 12 Oct 2016
Workshop on Brain-Machine Interfaces (SMC 2016)
InterContinental Hotel, 1052 Budapest
Please see: https://documents.epfl.ch/users/c/ch/chavarri/www/IEEESMC2016_BMI/BMI-IEEESMC2016.html
Looking for publications? You might want to consider searching on the EPFL Infoscience site which provides advanced publication search capabilities.
Can robots in the classroom reshape K-12 STEM education and foster new ways of learning? To sketch an answer, this article reviews, side-by-side, existing literature on robot-based learning activities featuring mathematics and physics (purposefully putting aside the well-studied field of "robots to teach robotics") and existing robot platforms and toolkits suited for the classroom environment (in terms of cost, ease of use, orchestration load for the teacher, etc.). Our survey suggests that the use of robots in the classroom has indeed moved from purely technology to education, to encompass new didactic fields. We however identified several shortcomings, in terms of robotic platforms and teaching environments, that contribute to the limited presence of robotics in existing curricula; the lack of specific teacher training is likely pivotal. Finally, we propose an educational framework merging the tangibility of robots with the advanced visibility of augmented reality.
What encourages people to refer to a robot as if it were a living being? Is it because of the robot’s humanoid or animal-like shape, its movements or rather the kind of interaction it enables? We aim to investigate the characteristics of robots that lead people to anthropomorphize them by comparing different kinds of robotic devices and contrasting them with an interactive technology. We addressed this question by comparing anthropomorphic language in online forums about the Roomba robotic vacuum cleaner, the AIBO robotic dog, and the iPad tablet computer. A content analysis of 750 postings was carried out. We expected to find the highest amount of anthropomorphism in the AIBO forum but were not sure about how far people referred to Roomba or the iPad as a lifelike artifact. Findings suggest that people anthropomorphize their robotic dog significantly more than their Roomba or iPad, across different topics of forum posts. Further, the topic of the post had a significant impact on anthropomorphic language.
In this literature review we explain anthropomorphism and its role in the design of socially interactive robots and human-robot interaction. We illustrate the social phenomenon of anthropomorphism, which describes people’s tendency to attribute lifelike qualities to objects and other non-lifelike artifacts. We present theoretical backgrounds from social sciences, and integrate related work from robotics research, including results from experiments with social robots. We present different approaches for anthropomorphic and humanlike form in a robot’s design related to its physical shape, its behavior, and its interaction with humans. This review provides a comprehensive understanding of anthropomorphism in robotics, collects and reports relevant references, and gives an outlook on anthropomorphic human-robot interaction.
In this article, we present Cellulo, a novel robotic platform that investigates the intersection of three ideas for robotics in education: designing the robots to be versatile and generic tools; blending robots into the classroom by designing them to be pervasive objects and by creating tight interactions with (already pervasive) paper; and finally considering the practical constraints of real classrooms at every stage of the design. Our platform results from these considerations and builds on a unique combination of technologies: groups of handheld haptic-enabled robots, tablets and activity sheets printed on regular paper. The robots feature holonomic motion, haptic feedback capability and high accuracy localization through a microdot pattern overlaid on top of the activity sheets, while remaining affordable (robots cost about EUR 125 at the prototype stage) and classroom-friendly. We present the platform and report on our first interaction studies, involving about 230 children.
Measuring “how much the human is in the interaction” — the level of engagement — is instrumental in building effective interactive robots. Engagement, however, is a complex, multi-faceted cognitive mechanism that is only indirectly observable. This article formalizes with-me-ness as one of such indirect measures. With-me-ness, a concept borrowed from the field of Computer-Supported Collaborative Learning, measures in a well-defined way to what extent the human is with the robot over the course of an interactive task. As such, it is a meaningful precursor of engagement. We expose in this paper the full methodology, from real-time estimation of the human’s focus of attention (relying on a novel, open-source, vision-based head pose estimator), to on-line computation of with-me-ness. We report as well on the experimental validation of this approach, using a naturalistic setup involving children during a complex robot-teaching task.
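As a rough illustration of the underlying idea (not the authors' actual implementation, whose names and details are not given here), with-me-ness can be sketched as the fraction of time during which the human's estimated focus of attention falls on a target that is relevant to the current step of the task. The function and labels below are hypothetical:

```python
def with_me_ness(attention_targets, relevant_targets):
    """Toy with-me-ness score: the fraction of time steps in which the
    human's estimated focus of attention (e.g. from a head pose
    estimator) lies on a target relevant to the current task step.

    attention_targets: one label per frame, e.g. "robot", "elsewhere"
    relevant_targets:  one set of task-relevant labels per frame
    """
    if not attention_targets:
        return 0.0
    hits = sum(
        1 for target, relevant in zip(attention_targets, relevant_targets)
        if target in relevant
    )
    return hits / len(attention_targets)

# Example: over five frames, the child attends to the robot, looks away,
# then follows the task onto a shared tablet.
score = with_me_ness(
    ["robot", "elsewhere", "robot", "tablet", "robot"],
    [{"robot"}, {"robot"}, {"robot", "tablet"}, {"robot", "tablet"}, {"robot"}],
)
# score == 0.8
```

In practice the per-frame attention labels would come from a real-time head pose estimator, and the relevant-target sets would be derived from the state of the teaching task; this sketch only shows how the two streams combine into a single engagement precursor.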
"Ranger" is a robotic box designed to motivate young children to tidy up the toys in their room. It explores the idea of integrating robotics into daily life objects, such as a wooden box. The box produces light and sound when toys are put inside or removed. We carried out a series of field experiments (Wizard-of-Oz) with 14 families to evaluate the first prototype of Ranger. The robot was operated showing two different behaviors: an active or a passive one. We found that the robot’s behavior had an impact on how children interacted with it. The poster also describes children’s and parents’ evaluation of the robot and how the design of Ranger could be improved.
Personal service robots, such as the iRobot Roomba vacuum cleaner, provide a promising opportunity to study human-robot interaction (HRI) in domestic environments. Still, rather little is known about the long-term impacts of robotic home appliances on people’s daily routines and attitudes and how these evolve over time. We investigate these aspects through a longitudinal ethnographic study with nine households, to which we gave a Roomba cleaning robot. Over six months, data was gathered through a combination of qualitative and quantitative methods.
In this paper, we explored the effect of a robot’s subconscious gestures made during moments when idle (also called adaptor gestures) on anthropomorphic perceptions of five year old children. We developed and sorted a set of adaptor motions based on their intensity. We designed an experiment involving 20 children, in which they played a memory game with two robots. During moments of idleness, the first robot showed adaptor movements, while the second robot moved its head following basic face tracking. Results showed that the children perceived the robot displaying adaptor movements to be more human and friendly. Moreover, these traits were found to be proportional to the intensity of the adaptor movements. For the range of intensities tested, it was also found that adaptor movements were not disruptive towards the task. These findings corroborate the fact that adaptor movements improve the affective aspect of child-robot interactions (CRI) and do not interfere with the child’s performances in the task, making them suitable for CRI in educational contexts.