
Chapter 72 — Social Robotics

Cynthia Breazeal, Kerstin Dautenhahn and Takayuki Kanda

This chapter surveys some of the principal research trends in Social Robotics and its application to human–robot interaction (HRI). Social (or sociable) robots are designed to interact with people in a natural, interpersonal manner – often to achieve positive outcomes in diverse applications such as education, health, quality of life, entertainment, communication, and tasks requiring collaborative teamwork. Achieving the long-term goal of creating social robots that are competent and capable partners for people is quite challenging. Such robots will need to be able to communicate naturally with people using both verbal and nonverbal signals. They will need to engage people not only on a cognitive level, but on an emotional level as well, in order to provide effective social and task-related support. They will need a wide range of social-cognitive skills and a theory of other minds to understand human behavior, and to be intuitively understood by people. A deep understanding of human intelligence and behavior across multiple dimensions (e.g., cognitive, affective, physical, and social) is necessary in order to design robots that can successfully play a beneficial role in the daily lives of people. This requires a multidisciplinary approach in which the design of social robot technologies and methodologies is informed by robotics, artificial intelligence, psychology, neuroscience, human factors, design, anthropology, and more.

Home-assistance companion robot in the Robot House

Author  Kerstin Dautenhahn

Video ID : 218

The video results from research conducted as part of the three-year European project Accompany (http://accompanyproject.eu/). It shows the year-one scenario as it was implemented in the University of Hertfordshire Robot House. Later scenarios were used for cumulative evaluation studies with elderly users and their caregivers in three European countries.

Visual communicative nonverbal behaviors of the Sunflower robot

Author  Kerstin Dautenhahn

Video ID : 219

The video illustrates the experiments described in Koay et al. (2013). The Sunflower robot, developed by Kheng Lee Koay at the University of Hertfordshire, is a non-humanoid robot that uses communicative signals inspired by dog–human interaction. The biological behaviors were abstracted and translated to the specific robot embodiment. The results show that the robot is able to communicate its intention to a person and to encourage the participant to attend to events and locations in a home environment. The work was part of the European project LIREC (http://lirec.eu/project).

Playing triadic games with KASPAR

Author  Kerstin Dautenhahn

Video ID : 220

The video illustrates (with researchers taking the roles of children) the system developed by Joshua Wainer as part of his PhD research at the University of Hertfordshire. In this study, KASPAR was developed to play games fully autonomously with pairs of children with autism. The robot provides encouragement, motivation, and feedback, and 'joins in the game'. The system was evaluated in long-term studies with children with autism (Wainer et al. 2014). Results show that KASPAR encourages collaborative skills in children with autism.

Explaining a typical session with Sunflower as a home companion in the Robot House

Author  Kerstin Dautenhahn

Video ID : 221

The video illustrates and explains one of the final showcases of the European project LIREC (http://lirec.eu/project) in the University of Hertfordshire Robot House. The Sunflower robot, developed at the University of Hertfordshire, provides cognitive and physical assistance in a home scenario. In the video, one of the researchers, Dag Syrdal, explains a typical session from the long-term evaluation studies in the Robot House. Sunflower has access to a network of smart sensors in the Robot House. The video also illustrates the concept of migration (moving the robot's mind/AI to a differently embodied system), sketched below.
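To make the migration concept concrete, here is a toy sketch: the companion's persistent "mind" (memories, user preferences, interaction state) is serialized on one embodiment and restored on another with different capabilities. The state fields and class names are invented for illustration and are not the LIREC implementation.

```python
# Toy illustration of "migration": the same agent state moves between bodies.
import json

class Embodiment:
    def __init__(self, name, capabilities):
        self.name, self.capabilities = name, capabilities
        self.mind = None  # the migratable agent state

    def export_mind(self):
        # Serialize the agent state so it can leave this body.
        return json.dumps(self.mind)

    def import_mind(self, blob):
        # Restore the agent state into this (differently embodied) system.
        self.mind = json.loads(blob)
        print(f"Mind migrated to {self.name} (capabilities: {self.capabilities})")

robot = Embodiment("mobile_robot", ["navigate", "light_signals"])
robot.mind = {"user": "resident", "reminders": ["medication at 9am"], "mood": "calm"}

avatar = Embodiment("screen_agent", ["speech", "animation"])
avatar.import_mind(robot.export_mind())  # same mind, different body
```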

A robot that forms a good spatial formation

Author  Takayuki Kanda

Video ID : 257

The video illustrates one of the capabilities developed to make social robots' interaction with people smooth and natural. With the developed technique, the robot detects the focus of the user's attention based on the user's location and adjusts its standing position so that it forms a good spatial formation, in which the two can easily talk about the object of their attention. In the video, when the user looks around for the computers in a room, the robot moves to a location from which it is convenient to explain the computers.
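As a minimal sketch of the spatial-formation idea (not the authors' implementation), the robot can pick a standing point that forms a rough triangle with the user and the attended object, so both partners can see the object and each other. The function name, the 60-degree offset, and the standoff distance below are illustrative assumptions.

```python
import math

def formation_point(user, obj, distance=1.2):
    """Return a robot standing position (x, y) forming a rough triangle
    with the user and the attended object.

    user, obj -- (x, y) positions in a shared floor frame
    distance  -- desired robot-to-object distance in meters (assumed value)
    """
    # Unit direction from the object toward the user.
    dx, dy = user[0] - obj[0], user[1] - obj[1]
    norm = math.hypot(dx, dy) or 1.0
    ux, uy = dx / norm, dy / norm
    # Rotate that direction by ~60 degrees so the robot stands beside the
    # user rather than between the user and the object.
    angle = math.radians(60)
    rx = ux * math.cos(angle) - uy * math.sin(angle)
    ry = ux * math.sin(angle) + uy * math.cos(angle)
    return (obj[0] + rx * distance, obj[1] + ry * distance)

# Example: user at (2, 0) attending to a computer at (0, 0).
print(formation_point(user=(2.0, 0.0), obj=(0.0, 0.0)))
```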

A robot that approaches pedestrians

Author  Takayuki Kanda

Video ID : 258

This video illustrates an example of a study that developed a social robot's capability for nonverbal interaction. In the study, an anticipation technique was developed: the robot observes pedestrians' motions and anticipates each pedestrian's future motion based on a large accumulated dataset of pedestrian trajectories. It then plans its own motion so as to approach a pedestrian from a frontal direction and initiate a conversation.
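A minimal sketch of the anticipation idea follows: match the observed track against previously recorded trajectories to predict the pedestrian's near-future position, then choose an approach point ahead of that position. The nearest-neighbor matching and all parameter values are assumptions for illustration; the study's actual technique is more sophisticated.

```python
import numpy as np

def predict_future(observed, trajectory_db, horizon=30):
    """observed: (T, 2) array of recent positions.
    trajectory_db: list of (N, 2) previously recorded trajectories.
    Returns the position `horizon` steps ahead along the best-matching track."""
    T = len(observed)
    best, best_cost = None, np.inf
    for traj in trajectory_db:
        # Slide the observed window over each stored trajectory.
        for s in range(len(traj) - T - horizon):
            cost = np.sum((traj[s:s + T] - observed) ** 2)
            if cost < best_cost:
                best_cost, best = cost, traj[s + T + horizon]
    return best

def frontal_approach_point(observed, predicted, standoff=1.5):
    """Stand `standoff` meters ahead of the predicted position, along the
    walking direction, so the encounter is frontal rather than from behind."""
    direction = predicted - observed[-1]
    direction = direction / (np.linalg.norm(direction) or 1.0)
    return predicted + standoff * direction

# Tiny synthetic example: one recorded straight-line walker.
db = [np.column_stack([np.linspace(0, 10, 101), np.zeros(101)])]
obs = np.column_stack([np.linspace(0.0, 0.5, 6), np.zeros(6)])
future = predict_future(obs, db)
print(future, frontal_approach_point(obs, future))
```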

A robot that provides a direction based on the model of the environment

Author  Takayuki Kanda

Video ID : 259

The video shows a scene of direction-giving interaction. The robot communicates the way to the destination while pointing in the direction to go. This interaction is supported by the robot's capability to understand the environment: it possesses a model of the environment comprising a geographical map, topology, and landmarks from a first-person perspective, the so-called route-perspective model.
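A hedged sketch of what a route-perspective model might look like: the route is stored as a sequence of first-person segments (turn, distance, landmark), from which a direction-giving utterance can be generated. The data layout, field names, and phrasing are illustrative assumptions, not the system shown in the video.

```python
from dataclasses import dataclass

@dataclass
class RouteSegment:
    turn: str         # "straight", "left", or "right", from the traveler's view
    distance_m: float
    landmark: str     # what the traveler sees at the end of the segment

# A hypothetical route, described as it would be experienced while walking.
ROUTE_TO_CAFE = [
    RouteSegment("straight", 20.0, "the elevator hall"),
    RouteSegment("left", 15.0, "a flower shop"),
    RouteSegment("right", 5.0, "the cafe entrance"),
]

def describe(route):
    """Render the route-perspective model as a direction-giving utterance."""
    steps = [f"go {seg.turn} for about {seg.distance_m:.0f} meters "
             f"until you see {seg.landmark}" for seg in route]
    return "First, " + "; then ".join(steps) + "."

print(describe(ROUTE_TO_CAFE))
```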

Human-robot teaming in a search-and-retrieve task

Author  Cynthia Breazeal

Video ID : 555

This video shows an example from a human-participant study examining the role of nonverbal social signals in human-robot teamwork on a complex search-and-retrieve task. In a controlled experiment, we examined the effects of backchanneling and task complexity on team functioning and on perceptions of the robots' engagement and competence. Seventy-three participants interacted with autonomous humanoid robots as part of a human-robot team consisting of one participant, one confederate (a remote operator controlling an aerial robot), and three robots (two mobile humanoids and an aerial robot). We found that, when the robots used backchanneling, team functioning improved and the robots were seen as more engaged.

Social referencing behavior

Author  Cynthia Breazeal

Video ID : 556

This video is an example of how nonverbal and verbal communication, emotive behavior, and social learning integrate to support social referencing in human-robot interaction. The robot, Leonardo, learns the affective appraisal of two novel objects by reading the affective appraisal given by a person (via facial expression, tone of voice, and word choice). The robot uses joint attention mechanisms to understand the referent of the interaction, and learns to associate the affective appraisal with this novel object. The robot then uses its own emotive responses to engage with that object accordingly (e.g., approach and explore a positively appraised object, avoid a negatively appraised object).
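A minimal sketch of the social-referencing loop described above: the robot resolves the referent via joint attention, reads the person's affective appraisal, stores the association, and later selects an approach or avoid response. The valence scale, blending rule, and function names are assumptions for illustration only, not Leonardo's architecture.

```python
appraisals = {}  # object id -> learned valence in [-1, 1]

def social_reference(referent_id, observed_valence):
    """Associate a person's affective appraisal with the attended object.
    referent_id: object singled out by joint attention (gaze/pointing).
    observed_valence: appraisal read from face, voice, and words, in [-1, 1]."""
    # Blend with any previous appraisal rather than overwriting it.
    prior = appraisals.get(referent_id, 0.0)
    appraisals[referent_id] = 0.5 * prior + 0.5 * observed_valence

def respond(referent_id):
    """Use the learned appraisal to drive the robot's own emotive response."""
    valence = appraisals.get(referent_id, 0.0)
    if valence > 0.3:
        return "approach and explore"
    if valence < -0.3:
        return "avoid"
    return "observe cautiously"

social_reference("novel_object_A", observed_valence=0.8)   # person praises it
social_reference("novel_object_B", observed_valence=-0.9)  # person reacts fearfully
print(respond("novel_object_A"), "/", respond("novel_object_B"))
```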

Overview of Kismet's expressive behavior

Author  Cynthia Breazeal

Video ID : 557

This video presents an overview of Kismet's expressive behavior and its rationale. It shows how Kismet expresses internal emotive/affective states through three modalities: facial expression, vocal affect, and body posture. It also shows how Kismet recognizes aspects of affective intent in human speech (e.g., praising, scolding, soothing, and attentional bids), and how human participants can interact with the robot in a natural and intuitive way by reading and responding to its emotive and social cues.
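To illustrate the idea of one internal affective state driving several expressive modalities at once, here is a simplified sketch. Kismet's actual expression system interpolates over a richer affect space (including stance); the thresholds, categories, and output values below are illustrative assumptions.

```python
def express(valence, arousal):
    """Map an affective state (each value in [-1, 1]) to coordinated
    face, voice, and posture settings."""
    # Facial expression: category chosen from valence, sharpened by arousal.
    if valence > 0.3:
        face = "smile" if arousal > 0 else "content"
    elif valence < -0.3:
        face = "anger" if arousal > 0 else "sadness"
    else:
        face = "surprise" if arousal > 0.5 else "neutral"
    # Vocal affect: excited speech is louder and more varied in pitch.
    voice = {"pitch_var": 0.5 + 0.5 * arousal,
             "volume": 0.5 + 0.3 * arousal}
    # Body posture: approach when aroused, withdraw when distressed.
    posture = "lean_forward" if arousal > 0.3 else (
              "withdraw" if valence < -0.3 else "upright")
    return {"face": face, "voice": voice, "posture": posture}

print(express(valence=0.8, arousal=0.6))    # happy, excited
print(express(valence=-0.7, arousal=-0.4))  # negative, low arousal
```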