In social robotics, robots need to be understood by humans, especially in collaborative tasks where mutual knowledge must be shared. For instance, in an educational scenario, learners share their knowledge and must adapt their behaviour to ensure they are understood by others: learners display behaviours that demonstrate their understanding, and teachers adapt to ensure that learners acquire the required knowledge. This ability requires a model of one's own mental states as perceived by others: "Has the human understood that I (the robot) need this object for the task, or should I explain it once again?" In this paper, we discuss the importance of a cognitive architecture enabling second-order mutual modelling for human-robot interaction in educational contexts.
We present the design approach and evaluation of our prototype called "Ranger". Ranger is a robotic toy box that aims to motivate young children to tidy up their room. We evaluated Ranger in 14 families with 31 children (2-10 years) using the Wizard-of-Oz technique. This case study explores two different robot behaviors (proactive vs. reactive) and their impact on children's interaction with the robot and their tidying behavior. The analysis of the video-recorded scenarios shows that the proactive robot tended to encourage more playful and explorative behavior in children, whereas the reactive robot triggered more tidying behavior. Our findings hold implications for the design of interactive robots for children, and may also serve as an example of evaluating an early version of a prototype in a real-world setting.