
Chapter 34 — Visual Servoing

François Chaumette, Seth Hutchinson and Peter Corke

This chapter introduces visual servo control, the use of computer vision data in the servo loop to control the motion of a robot. We first describe the basic techniques that are by now well established in the field. We give a general overview of the formulation of the visual servo control problem and describe the two archetypal visual servo control schemes: image-based and pose-based visual servo control. We then discuss performance and stability issues that pertain to these two schemes, motivating advanced techniques. Of the many advanced techniques that have been developed, we discuss 2.5-D, hybrid, partitioned, and switched approaches. Having covered a variety of control schemes, we turn to target tracking, to controlling motion directly in joint space, and to extensions for underactuated ground and aerial robots. We conclude by describing applications of visual servoing in robotics.

IBVS on a 6-DOF robot arm (3)

Author  François Chaumette, Seth Hutchinson, Peter Corke

Video ID : 61

This video shows IBVS on a 6-DOF robot arm, with the Cartesian coordinates of image points as visual features and the mean interaction matrix in the control scheme. It corresponds to the results depicted in Figure 34.4.
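As a concrete illustration (not taken from the chapter or the video itself), the classical image-based control law with point features and the mean interaction matrix can be sketched as follows. The interaction-matrix form for a normalized image point is standard; the feature coordinates, depths, and gain used below are purely hypothetical:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction matrix of a normalized image point (x, y) at depth Z.
    It maps the camera spatial velocity (vx, vy, vz, wx, wy, wz) to the
    image-point velocity (xdot, ydot)."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(pts, pts_des, depths, depths_des, lam=0.5):
    """Camera velocity v = -lam * Lhat^+ * e, where Lhat is the mean of the
    current and desired interaction matrices, stacked over all points."""
    e = (pts - pts_des).ravel()
    L_cur = np.vstack([interaction_matrix(x, y, Z)
                       for (x, y), Z in zip(pts, depths)])
    L_des = np.vstack([interaction_matrix(x, y, Z)
                       for (x, y), Z in zip(pts_des, depths_des)])
    L_mean = 0.5 * (L_cur + L_des)
    return -lam * np.linalg.pinv(L_mean) @ e

# Hypothetical example: four coplanar points, all at 1 m depth.
pts_des = np.array([[0.1, 0.1], [-0.1, 0.1], [-0.1, -0.1], [0.1, -0.1]])
pts = pts_des + 0.05                     # small feature error
depths = np.ones(4)
v = ibvs_velocity(pts, pts_des, depths, depths)
```

The pseudoinverse of the stacked 8x6 matrix resolves the redundancy of four point features over the six camera degrees of freedom; at the goal configuration the error, and hence the commanded velocity, vanishes.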

Chapter 41 — Active Manipulation for Perception

Anna Petrovskaya and Kaijen Hsiao

This chapter covers perceptual methods in which manipulation is an integral part of perception. These methods face special challenges due to data sparsity and high costs of sensing actions. However, they can also succeed where other perceptual methods fail, for example, in poor-visibility conditions or for learning the physical properties of a scene.

The chapter focuses on specialized methods that have been developed for object localization, inference, planning, recognition, and modeling in active manipulation approaches. We conclude with a discussion of real-life applications and directions for future research.

Touch-based, door-handle localization and manipulation

Author  Anna Petrovskaya

Video ID : 723

The harmonic arm robot localizes the door handle by touching it. 3-DOF localization is performed in this video. Once the localization is complete, the robot is able to grasp and manipulate the handle. The mobile platform is teleoperated, whereas the robotic arm motions are autonomous. A 2-D model of the door and handle was constructed from hand measurements for this experiment.

Chapter 35 — Multisensor Data Fusion

Hugh Durrant-Whyte and Thomas C. Henderson

Multisensor data fusion is the process of combining observations from a number of different sensors to provide a robust and complete description of an environment or process of interest. Data fusion finds wide application in many areas of robotics such as object recognition, environment mapping, and localization.

This chapter has three parts: methods, architectures, and applications. Most current data fusion methods employ probabilistic descriptions of observations and processes and use Bayes’ rule to combine this information. This chapter surveys the main probabilistic modeling and fusion techniques including grid-based models, Kalman filtering, and sequential Monte Carlo techniques. This chapter also briefly reviews a number of nonprobabilistic data fusion methods. Data fusion systems are often complex combinations of sensor devices, processing, and fusion algorithms. This chapter provides an overview of key principles in data fusion architectures from both a hardware and algorithmic viewpoint. The applications of data fusion are pervasive in robotics and underlie the core problems of sensing, estimation, and perception. We highlight two example applications that bring out these features. The first describes a navigation or self-tracking application for an autonomous vehicle. The second describes an application in mapping and environment modeling.
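As an illustration of the probabilistic fusion step described above, a minimal linear Kalman filter, whose update is exactly a Bayes'-rule combination of a Gaussian prior with a Gaussian observation, can be sketched as follows. The scalar models F, H, Q, and R below are placeholders, not values from any application in the chapter:

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Propagate the state estimate and covariance through the motion model."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    """Fuse a new observation z via Bayes' rule, in Kalman-gain form."""
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Hypothetical 1-D example: static state observed directly.
x, P = np.array([0.0]), np.array([[1.0]])
F, Q = np.array([[1.0]]), np.array([[0.01]])
H, R = np.array([[1.0]]), np.array([[0.5]])
x, P = kf_predict(x, P, F, Q)
x, P = kf_update(x, P, np.array([1.0]), H, R)
```

Each update shrinks the posterior covariance below the prior, which is the sense in which fusing an additional sensor observation makes the estimate more robust.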

The essential algorithmic tools of data fusion are reasonably well established. However, the development and use of these tools in realistic robotics applications is still developing.

AnnieWay

Author  Thomas C. Henderson

Video ID : 132

This video shows the multisensor autonomous vehicle AnnieWay merging into traffic.

Chapter 53 — Multiple Mobile Robot Systems

Lynne E. Parker, Daniela Rus and Gaurav S. Sukhatme

Within the context of multiple mobile and networked robot systems, this chapter explores the current state of the art. After a brief introduction, we first examine architectures for multirobot cooperation, exploring the alternative approaches that have been developed. Next, we explore communications issues and their impact on multirobot teams in Sect. 53.3, followed by a discussion of networked mobile robots in Sect. 53.4. Following this, we discuss swarm robot systems in Sect. 53.5 and modular robot systems in Sect. 53.6. While swarm and modular systems typically assume large numbers of homogeneous robots, other types of multirobot systems include heterogeneous robots. We therefore next discuss heterogeneity in cooperative robot teams in Sect. 53.7. Once robot teams allow for individual heterogeneity, issues of task allocation become important; Sect. 53.8 therefore discusses common approaches to task allocation. Section 53.9 discusses the challenges of multirobot learning, and some representative approaches. We outline some of the typical application domains which serve as test beds for multirobot systems research in Sect. 53.10. Finally, we conclude in Sect. 53.11 with some summary remarks and suggestions for further reading.

A day in the life of a Kiva robot

Author  Mick Mountz

Video ID : 210

Kiva Systems founder and CEO Mick Mountz narrates a play-by-play video of how Kiva robots automate a warehouse environment. http://www.kivasystems.com/

Chapter 59 — Robotics in Mining

Joshua A. Marshall, Adrian Bonchis, Eduardo Nebot and Steven Scheding

This chapter presents an overview of the state of the art in mining robotics, from surface to underground applications, and beyond. Mining is the practice of extracting resources for utilitarian purposes. Today, the international business of mining is a heavily mechanized industry that exploits the use of large diesel and electric equipment. These machines must operate in harsh, dynamic, and uncertain environments such as, for example, in the high arctic, in extreme desert climates, and in deep underground tunnel networks where it can be very hot and humid. Applications of robotics in mining are broad and include robotic dozing, excavation, and haulage, robotic mapping and surveying, as well as robotic drilling and explosives handling. This chapter describes how many of these applications involve unique technical challenges for field roboticists. However, there are compelling reasons to advance the discipline of mining robotics, including not only miners' desire to improve productivity and safety and to lower costs, but also the need to meet product demands by accessing orebodies situated in increasingly challenging conditions.

Autonomous loading of fragmented rock

Author  Joshua Marshall

Video ID : 718

This video shows autonomous loading of fragmented rock, first on a 1-t capacity Kubota loader at Kingston, Canada, followed by an implementation on a 14-t capacity Atlas Copco ST14 LHD in an underground mine at Kvarntorp, Sweden. The algorithm used in these demonstrations is based on force feedback sensed via the loader cylinder pressures and uses an admittance control structure.
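The admittance control structure mentioned above can be sketched in a minimal, first-order form: the commanded bucket velocity yields to the error between the sensed dig force and a reference force. This is only an illustrative sketch; the mass, damping, and force values below are hypothetical, not parameters of the actual loaders:

```python
def admittance_step(f_meas, f_ref, v_prev, m, b, dt):
    """One Euler step of a first-order admittance
        m * dv/dt + b * v = f_meas - f_ref,
    so the commanded velocity v yields to the sensed force error."""
    dv = ((f_meas - f_ref) - b * v_prev) / m
    return v_prev + dv * dt

# Hypothetical scenario: sensed dig force exceeds the reference,
# so the commanded velocity rises toward (f_meas - f_ref) / b.
v = 0.0
for _ in range(1000):
    v = admittance_step(f_meas=200.0, f_ref=50.0, v_prev=v,
                        m=50.0, b=300.0, dt=0.01)
```

The steady-state velocity is the force error divided by the virtual damping, which is the defining property of an admittance: force in, motion out.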

Chapter 61 — Robot Surveillance and Security

Wendell H. Chun and Nikolaos Papanikolopoulos

This chapter introduces the foundation for surveillance and security robots for multiple military and civilian applications. The key environmental domains are mobile robots for ground, aerial, surface-water, and underwater applications. Surveillance literally means to watch from above, while surveillance robots are used to monitor the behavior, activities, and other changing information that are gathered for the general purpose of managing, directing, or protecting one’s assets or position. In a practical sense, the term surveillance is taken to mean the act of observation from a distance, and security robots are commonly used to protect and safeguard a location, valuable assets, or personnel against danger, damage, loss, and crime. Surveillance is a proactive operation, while security is a defensive operation. The construction of each type of robot is similar in nature, with a mobility component, sensor payload, communication system, and an operator control station.

After introducing the major robot components, this chapter focuses on the various applications. More specifically, Sect. 61.3 discusses the enabling technologies of mobile robot navigation, the various payload sensors used for surveillance or security applications, target detection and tracking algorithms, and the operator’s robot control console for the human–machine interface (HMI). Section 61.4 presents selected research activities relevant to surveillance and security, including automatic data processing of the payload sensors, automatic monitoring of human activities, facial recognition, and collaborative automatic target recognition (ATR). Finally, Sect. 61.5 discusses future directions in robot surveillance and security and closes with conclusions and references.

Tracking people for security

Author  Nikos Papanikolopoulos

Video ID : 683

Tracking of people in crowded scenes is challenging because people occlude each other as they walk around. The latest revision of the University of Minnesota's person tracker uses adaptive appearance models that explicitly account for the probability that a person may be partially occluded. All potentially occluding targets are tracked jointly, and the most likely visibility order is estimated (so we know the probability that person A is occluding person B). Target-size adaptation is performed using calibration information about the camera, and the reported target positions are made in real-world coordinates.
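A toy sketch of two of the ingredients mentioned above: a most-likely visibility order (front-to-back ordering of targets) and a crude occlusion measure between two targets. This is not the University of Minnesota tracker; the target records, distance field, and bounding boxes below are hypothetical stand-ins for its internal state:

```python
def visibility_order(targets):
    """Sort tracked targets front-to-back by estimated distance to the
    camera, so earlier targets may occlude later ones."""
    return sorted(targets, key=lambda t: t["distance"])

def occlusion_fraction(box_back, box_front):
    """Fraction of the rear target's bounding box (x0, y0, x1, y1) covered
    by the front target's box: a crude stand-in for a per-pixel
    occlusion probability."""
    x0 = max(box_back[0], box_front[0])
    y0 = max(box_back[1], box_front[1])
    x1 = min(box_back[2], box_front[2])
    y1 = min(box_back[3], box_front[3])
    inter = max(0.0, x1 - x0) * max(0.0, y1 - y0)
    area = (box_back[2] - box_back[0]) * (box_back[3] - box_back[1])
    return inter / area if area > 0 else 0.0

# Hypothetical example: A stands 2 m away and partially covers B at 4 m.
order = visibility_order([{"id": "B", "distance": 4.0},
                          {"id": "A", "distance": 2.0}])
frac = occlusion_fraction((0, 0, 2, 2), (1, 0, 3, 2))
```

In a real tracker this kind of occlusion estimate would down-weight the appearance-model update for the covered region of the rear target, so joint tracking does not corrupt its model while it is hidden.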

Chapter 76 — Evolutionary Robotics

Stefano Nolfi, Josh Bongard, Phil Husbands and Dario Floreano

Evolutionary Robotics is a method for automatically generating artificial brains and morphologies of autonomous robots. This approach is useful both for investigating the design space of robotic applications and for testing scientific hypotheses of biological mechanisms and processes. In this chapter we provide an overview of methods and results of Evolutionary Robotics with robots of different shapes, dimensions, and operation features. We consider both simulated and physical robots with special consideration to the transfer between the two worlds.

Evolved bipedal walking

Author  Phil Husbands

Video ID : 374

The video shows stages in the evolution of bipedal walking in a simulated bipedal robot using realistic physics (from the work of Torsten Reil, originating at Sussex University). This was the first example of successfully evolved bipedal gaits produced in a physics-engine-based simulation. The problem is inherently dynamically unstable, making it an interesting challenge.

Chapter 43 — Telerobotics

Günter Niemeyer, Carsten Preusche, Stefano Stramigioli and Dongjun Lee

In this chapter we present an overview of the field of telerobotics with a focus on control aspects. To acknowledge some of the earliest contributions and motivations the field has provided to robotics in general, we begin with a brief historical perspective and discuss some of the challenging applications. Then, after introducing and classifying the various system architectures and control strategies, we emphasize bilateral control and force feedback. This particular area has seen intense research work in the pursuit of telepresence. We also examine some of the emerging efforts, extending telerobotic concepts to unconventional systems and applications. Finally, we suggest some further reading for a closer engagement with the field.

Passivity of IPC strategy at 30-Hz sample rate

Author  Stefano Stramigioli

Video ID : 724

In this short video, the effectiveness of the passive-sampling approach and of IPC control is shown. A PD-like controller is implemented digitally both in the classical way and using IPC with passive sampling. At the sampling frequency of 30 Hz, instability occurs for the standard implementation but is completely absent with the proposed approach.

Chapter 13 — Behavior-Based Systems

François Michaud and Monica Nicolescu

Nature is filled with examples of autonomous creatures capable of dealing with the diversity, unpredictability, and rapidly changing conditions of the real world. Such creatures must make decisions and take actions based on incomplete perception, time constraints, limited knowledge about the world, cognition, reasoning and physical capabilities, in uncontrolled conditions and with very limited cues about the intent of others. Consequently, one way of evaluating intelligence is based on the creature’s ability to make the most of what it has available to handle the complexities of the real world. The main objective of this chapter is to explain behavior-based systems and their use in autonomous control problems and applications. The chapter is organized as follows. Section 13.1 overviews robot control, introducing behavior-based systems in relation to other established approaches to robot control. Section 13.2 follows by outlining the basic principles of behavior-based systems that make them distinct from other types of robot control architectures. The concept of basis behaviors, the means of modularizing behavior-based systems, is presented in Sect. 13.3. Section 13.4 describes how behaviors are used as building blocks for creating representations for use by behavior-based systems, enabling the robot to reason about the world and about itself in that world. Section 13.5 presents several different classes of learning methods for behavior-based systems, validated on single-robot and multirobot systems. Section 13.6 provides an overview of various robotics problems and application domains that have successfully been addressed or are currently being studied with behavior-based control. Finally, Sect. 13.7 concludes the chapter.

Experience-based learning of high-level task representations: Reproduction (2)

Author  Monica Nicolescu

Video ID : 31

This video, recorded in the early 2000s, shows a Pioneer robot learning to visit a number of targets in a certain order (the robot execution stage). The robot training stage is shown in a related video in this chapter. References: 1. M. Nicolescu, M.J. Mataric: Experience-based learning of task representations from human-robot interaction, Proc. IEEE Int. Symp. Comput. Intell. Robot. Autom., Banff (2001), pp. 463-468; 2. M. Nicolescu, M.J. Mataric: Learning and interacting in human-robot domains, IEEE Trans. Syst. Man Cybern. A 31(5), 419-430 (2001)

Chapter 40 — Mobility and Manipulation

Oliver Brock, Jaeheung Park and Marc Toussaint

Mobile manipulation requires the integration of methodologies from all aspects of robotics. Instead of tackling each aspect in isolation, mobile manipulation research exploits their interdependence to solve challenging problems. As a result, novel views of long-standing problems emerge. In this chapter, we present these emerging views in the areas of grasping, control, motion generation, learning, and perception. All of these areas must address the shared challenges of high dimensionality, uncertainty, and task variability. The section on grasping and manipulation describes a trend towards actively leveraging contact and physical and dynamic interactions between hand, object, and environment. Research in control addresses the challenges of appropriately coupling mobility and manipulation. The field of motion generation increasingly blurs the boundaries between control and planning, leading to task-consistent motion in high-dimensional configuration spaces, even in dynamic and partially unknown environments. A key challenge of learning for mobile manipulation consists of identifying the appropriate priors, and we survey recent learning approaches to perception, grasping, motion, and manipulation. Finally, a discussion of promising methods in perception shows how concepts and methods from navigation and active perception are applied.

Handle localization and grasping

Author  Robert Platt

Video ID : 652

The robot localizes and grasps appropriate handles on novel objects in real time.