Chapter 69 — Physical Human-Robot Interaction

Sami Haddadin and Elizabeth Croft

Over the last two decades, the foundations of physical human–robot interaction (pHRI) have evolved from successful developments in mechatronics, control, and planning, leading to safer lightweight robot designs and interaction control schemes that go beyond the capabilities of existing high-payload, high-precision, position-controlled industrial robots. Based on their ability to sense physical interaction, render compliant behavior along the robot structure, plan motions that respect human preferences, and generate interaction plans for collaboration and coaction with humans, these robots have opened up new and unforeseen application domains and have advanced the field of human safety in robotics.

This chapter gives an overview of the state of the art in pHRI as of the date of publication. First, the advances in human safety are outlined, addressing topics in human injury analysis in robotics and safety standards for pHRI. Then, the foundations of human-friendly robot design, including the development of lightweight and intrinsically flexible force/torque-controlled machines together with the perception abilities required for interaction, are introduced. Subsequently, motion-planning techniques for human environments, including biomechanically safe, risk-metric-based, and human-aware planning, are covered. Finally, the more recent problem of interaction planning is summarized, including collaborative action planning, the definition of the interaction planning problem, and an introduction to robot reflexes and reactive control architectures for pHRI.

Torque control for teaching peg-in-hole via physical human-robot interaction

Author  Alin Albu-Schäffer

Video ID : 627

Teaching by demonstration is a typical application of impedance control. A practical demonstration was given with the task of teaching the automatic insertion of a piston into a motor block. Teaching is realized by guiding the robot with the human hand. The axes of the holes in the motor block were known in advance to be vertically oriented. In the teaching phase, a high rotational stiffness was commanded (150 Nm/rad), while the translational stiffness was set to zero. This allowed the human operator to demonstrate only translational movements. In the second phase, the taught trajectory was automatically reproduced by the robot. In this phase, a high translational stiffness (3000 N/m) and a low rotational stiffness (60 Nm/rad) were assigned, enabling the robot to compensate for the remaining position errors. For two pistons, the total assembly time was 6 s. In this experiment, the assembly was executed automatically four times faster than by the human operator holding the robot as an input device in the teaching phase (24 s), while free-hand execution of the task by a human requires about 4 s.
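To make the two-phase stiffness scheduling concrete, the sketch below sets up a diagonal Cartesian impedance law with the stiffness values quoted above; the class, its damping defaults, and its interface are illustrative assumptions, not the controller used in the video.

# Minimal sketch of the two-phase stiffness scheduling described above.
# The CartesianImpedance class and its parameters are hypothetical,
# not the API of the robot controller used in the video.
import numpy as np

class CartesianImpedance:
    """Simple Cartesian impedance law: F = K (x_d - x) - D xdot."""
    def __init__(self, k_trans, k_rot, d_trans=50.0, d_rot=5.0):
        # Diagonal stiffness/damping over [x, y, z, rx, ry, rz];
        # the damping values are illustrative defaults, not from the experiment.
        self.K = np.diag([k_trans] * 3 + [k_rot] * 3)
        self.D = np.diag([d_trans] * 3 + [d_rot] * 3)

    def wrench(self, pose_err, twist):
        # pose_err: 6-vector (position error, orientation error as rotation vector)
        # twist:    6-vector of current Cartesian velocity
        return self.K @ pose_err - self.D @ twist

# Teaching phase: free translation (k_trans = 0), stiff orientation (150 Nm/rad)
teach = CartesianImpedance(k_trans=0.0, k_rot=150.0)

# Playback phase: stiff translation (3000 N/m), compliant orientation (60 Nm/rad)
playback = CartesianImpedance(k_trans=3000.0, k_rot=60.0)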

Flexible robot gripper for KUKA Light Weight Robot (LWR): Collaboration between human and robot

Author  Robotiq

Video ID : 632

A flexible robot gripper mounted on a KUKA Light Weight Robot is engaged in proximal human-robot collaboration. The human-safe robot, combined with an agile robot gripper, demonstrates collaborative part feeding and part holding in assembly tasks.

Human-robot handover

Author  Wesley P. Chan, Chris A. Parker, H. F. Machiel Van der Loos, Elizabeth A. Croft

Video ID : 716

In this video, we present a novel controller for safe, efficient, and intuitive robot-to-human object handovers. The controller enables a robot to mimic human behavior by actively regulating the applied grip force according to the measured load force during a handover. We provide an implementation of the controller on a Willow Garage PR2 robot, demonstrating the feasibility of realizing our design on robots with basic sensor/actuator capabilities.
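The core idea, regulating grip force as a function of the measured load force, can be sketched as follows; the linear coupling gain, offset, and release threshold are illustrative placeholders rather than values from the implementation shown in the video.

# Hedged sketch of the grip-load regulation idea described above: the commanded
# grip force tracks the measured load force with a linear relation, and the
# gripper releases once the receiver has taken over most of the load.
# Gains and thresholds below are illustrative, not the values from the paper.

def handover_grip_force(load_force, k_slope=0.5, grip_offset=2.0, release_threshold=0.5):
    """Return the commanded grip force (N) given the measured load force (N).

    load_force:        vertical load supported by the gripper (e.g., from a wrist F/T sensor)
    k_slope:           proportional coupling between load force and grip force
    grip_offset:       small margin so the object is never held at zero force
    release_threshold: load below which the giver assumes the receiver holds the object
    """
    if load_force < release_threshold:
        return 0.0  # receiver has taken the object: open the gripper
    return k_slope * load_force + grip_offset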

Collaborative human-focused robotics for manufacturing

Author  CHARM Project Consortium

Video ID : 717

The CHARM project demonstrates methods for interacting with robotic assistants through developments in perception, communication, control, and safe-interaction technologies and techniques, centered on supporting workers performing complex manufacturing tasks.

Dancing with Juliet

Author  Oussama Khatib, Kyong-Sok Chang, Oliver Brock, Kazuhito Yokoi, Arancha Casal, Robert Holmberg

Video ID : 820

This video presents experiments in human-robot interaction using the Stanford Mobile Manipulator platforms. Each platform consists of a Puma 560 manipulator mounted on a holonomic mobile base. The experiments shown in this video result from implementing various methodologies developed to establish the basic autonomous capabilities needed for robot operation in human environments. The integration of mobility and manipulation is based on a task-oriented control strategy that provides the user with two basic control primitives: end-effector task control and platform self-posture control.
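One common way to realize these two primitives is an operational-space task/posture decomposition, sketched below; the function, its argument names, and the placeholder matrices are an illustrative formulation, not the exact code running on the Stanford platforms.

# Minimal numerical sketch of the two control primitives mentioned above:
# an end-effector task torque plus a posture torque projected into the task
# null space so it cannot disturb the end effector. Inputs here are
# illustrative placeholders, not the actual Stanford Mobile Manipulator model.
import numpy as np

def task_posture_torque(J, M, F_task, tau_posture):
    """Combine task and posture control in the operational-space style.

    J:           end-effector Jacobian (m x n)
    M:           joint-space inertia matrix (n x n)
    F_task:      desired operational-space force/wrench (m)
    tau_posture: joint torque implementing the posture objective (n)
    """
    M_inv = np.linalg.inv(M)
    # Operational-space inertia and dynamically consistent pseudoinverse
    Lambda = np.linalg.inv(J @ M_inv @ J.T)
    J_bar = M_inv @ J.T @ Lambda
    # Null-space projector: posture torques act without producing end-effector forces
    N_T = np.eye(M.shape[0]) - J.T @ J_bar.T
    return J.T @ F_task + N_T @ tau_posture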

A cobot in automobile assembly

Author  Prasad Akella, Nidamaluri Nagesh, Witaya Wannasuphoprasit, J. Edward Colgate, Michael Peshkin

Video ID : 821

Collaborative robots - cobots - are a new class of robotic devices for direct physical interaction with a human operator in a shared workspace. Cobots implement software-defined "virtual surfaces" which can guide human and payload motion. A joint project of General Motors and Northwestern University has brought an alpha prototype cobot into an industrial environment. This cobot guides the removal of an automobile door from a newly painted body prior to assembly. Because of tight tolerances and curved parts, the task requires a specific escape trajectory to prevent collision of the door with the body. The cobot's virtual surfaces provide physical guidance during the critical "escape" phase, while sharing control with the human operator during other task phases. (Video Proceedings of the Int. Conf. on Robotics and Automation, 1999)
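A virtual surface of this kind can be approximated in software by removing the component of commanded motion that would penetrate the constraint; the admittance-style sketch below is only one possible realization under assumed geometry and gains, not the actual cobot mechanism used in the GM prototype.

# Illustrative sketch in the spirit of the "virtual surface" described above:
# the operator's applied force is projected onto the surface tangent whenever
# motion would penetrate the virtual wall, so the payload glides along the
# constraint instead of crossing it. Geometry and admittance gain are assumptions.
import numpy as np

def constrained_velocity(applied_force, position, wall_point, wall_normal, admittance=0.01):
    """Map the operator's force to a payload velocity that respects a planar virtual wall.

    applied_force: 3-vector of force the operator exerts on the payload (N)
    position:      current payload position (m)
    wall_point:    any point on the virtual wall plane (m)
    wall_normal:   normal pointing into the allowed half-space
    admittance:    velocity produced per newton of applied force (m/s per N)
    """
    n = wall_normal / np.linalg.norm(wall_normal)
    v = admittance * np.asarray(applied_force, dtype=float)
    at_wall = np.dot(position - wall_point, n) <= 0.0
    pushing_in = np.dot(v, n) < 0.0
    if at_wall and pushing_in:
        v = v - np.dot(v, n) * n  # remove the component that would penetrate the wall
    return v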