
Chapter 76 — Evolutionary Robotics

Stefano Nolfi, Josh Bongard, Phil Husbands and Dario Floreano

Evolutionary Robotics is a method for automatically generating the artificial brains and morphologies of autonomous robots. This approach is useful both for investigating the design space of robotic applications and for testing scientific hypotheses about biological mechanisms and processes. In this chapter we provide an overview of the methods and results of Evolutionary Robotics with robots of different shapes, dimensions, and operational features. We consider both simulated and physical robots, with special attention to the transfer between the two worlds.
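Most of the experiments shown in the videos below rely on the same underlying generate-evaluate-select loop. The following Python sketch is only a minimal illustration of that loop, not the algorithm used in any specific experiment: the genome size, mutation rate, selection scheme, and the placeholder evaluate_in_simulation() function are assumptions made for clarity.

# Minimal sketch of the evolutionary loop underlying most experiments in this
# chapter. All names and settings (genome size, mutation rate,
# evaluate_in_simulation) are illustrative placeholders.
import random

GENOME_LENGTH = 30      # e.g., weights of a small neural controller
POPULATION_SIZE = 20
GENERATIONS = 50
MUTATION_STD = 0.1

def random_genome():
    return [random.uniform(-1.0, 1.0) for _ in range(GENOME_LENGTH)]

def mutate(genome):
    return [g + random.gauss(0.0, MUTATION_STD) for g in genome]

def evaluate_in_simulation(genome):
    # Placeholder fitness: in a real experiment this runs the controller
    # encoded by the genome on a (simulated or physical) robot and scores
    # its behavior, e.g., distance travelled without collisions.
    return -sum(g * g for g in genome)

population = [random_genome() for _ in range(POPULATION_SIZE)]
for generation in range(GENERATIONS):
    ranked = sorted(population, key=evaluate_in_simulation, reverse=True)
    parents = ranked[:POPULATION_SIZE // 4]          # truncation selection
    population = parents + [mutate(random.choice(parents))
                            for _ in range(POPULATION_SIZE - len(parents))]
print("best fitness:",
      evaluate_in_simulation(max(population, key=evaluate_in_simulation)))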

Visual navigation of mobile robot with pan-tilt camera

Author  Dario Floreano

Video ID : 36

A mobile robot with a pan-tilt camera is asked to navigate in a square arena with low walls, located in an office.

Visual navigation with collision avoidance

Author  Dario Floreano

Video ID : 37

Evolved Khepera displaying vision-based collision avoidance. A network of spiking neurons is evolved to drive the vision-based robot in the arena. A light below the rotating contacts enables continuous evolution, even overnight.
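The video refers to a network of spiking neurons evolved as the robot's controller. As a purely illustrative aid, the sketch below implements a generic leaky integrate-and-fire neuron update, one common way such a controller can be simulated; the parameters and the random vision input are assumptions, not those used in the experiment.

# Generic leaky integrate-and-fire update, one simple way a spiking
# controller like the one in the video could be simulated. Parameters
# (tau, threshold) are illustrative, not those of the experiment.
import random

TAU = 10.0        # membrane time constant (ms)
THRESHOLD = 1.0   # firing threshold
DT = 1.0          # integration step (ms)

def step_neuron(potential, weighted_input):
    """Integrate the input, emit a spike (1/0), and reset if the threshold is crossed."""
    potential += DT * (-potential / TAU + weighted_input)
    if potential >= THRESHOLD:
        return 0.0, 1          # reset potential, spike
    return potential, 0

# Toy run: a neuron driven by random "vision" input; in the experiment the
# spike trains of the output neurons would be converted into wheel speeds.
potential = 0.0
for t in range(100):
    vision_input = random.uniform(0.0, 0.3)
    potential, spike = step_neuron(potential, vision_input)
    if spike:
        print(f"spike at t={t} ms")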

Coevolved predator and prey robots

Author  Dario Floreano

Video ID : 38

Coevolved predator and prey robots engaged in a tournament. The predator and prey robots (from left to right) are placed in an arena surrounded by walls and are allowed to interact for several trials starting at different, randomly generated orientations. Predators are selected on the basis of the percentage of trials in which they are able to catch (i.e., to touch) the prey, and prey on the basis of the percentage of trials in which they are able to escape (i.e., to not be touched by) the predators. Predators have a vision system, whereas the prey have only short-range distance sensors but can go twice as fast as the predators. Collisions between robots are detected by a conductive belt at the base of the robots.
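The competitive fitness scheme described above can be sketched as follows. Everything here is a placeholder: simulate_trial() stands in for running both controllers together in the arena, and the number of trials per pairing is an arbitrary choice.

# Sketch of the competitive fitness scheme: each predator is scored by the
# fraction of trials in which it catches the prey, each prey by the fraction
# in which it escapes.
import random

def simulate_trial(predator_genome, prey_genome):
    # Placeholder: returns True if the predator touches the prey before the
    # trial times out. A real implementation runs both robots from a random
    # starting orientation in simulation or hardware.
    return random.random() < 0.5

def coevolve_fitness(predators, preys, trials_per_pair=10):
    predator_fitness = [0.0] * len(predators)
    prey_fitness = [0.0] * len(preys)
    for i, predator in enumerate(predators):
        for j, prey in enumerate(preys):
            caught = sum(simulate_trial(predator, prey)
                         for _ in range(trials_per_pair))
            predator_fitness[i] += caught / trials_per_pair
            prey_fitness[j] += 1.0 - caught / trials_per_pair
    # Average over opponents so that scores stay in [0, 1].
    predator_fitness = [f / len(preys) for f in predator_fitness]
    prey_fitness = [f / len(predators) for f in prey_fitness]
    return predator_fitness, prey_fitness

# Example with dummy populations of five genomes each.
pred_scores, prey_scores = coevolve_fitness(["p%d" % i for i in range(5)],
                                            ["q%d" % i for i in range(5)])
print(pred_scores, prey_scores)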

Evolution of collision-free navigation

Author  Dario Floreano

Video ID : 39

In their initial generations, robots can hardly avoid walls (one robot even approaches objects). After 50 generations, robots can navigate around the looping maze without hitting the walls.

Online learning to adapt to fast environmental variations

Author  Dario Floreano

Video ID : 40

A Khepera mobile robot, equipped with a vision module, can gain fitness points by staying on the gray area, but only when the light is on. The light is normally off, but it can be switched on if the robot passes over the black area positioned on the other side of the arena. The robot can detect ambient light and wall color, but not the color of the floor.
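As an illustration of the fitness rule described above, the sketch below accumulates one point per time step spent on the gray area while the light is on; the trajectory representation is an assumption made for clarity, not the actual implementation.

# Sketch of the fitness rule: a point is gained for each time step spent on
# the gray area while the light is on; crossing the black area switches the
# light on (this state is maintained by the environment, not sensed as floor color).
def score_trial(trajectory):
    """trajectory: list of (floor_zone, light_on) tuples, one per time step.
    floor_zone is 'gray', 'black', or 'plain'; light_on is a bool."""
    fitness = 0
    for floor_zone, light_on in trajectory:
        if floor_zone == "gray" and light_on:
            fitness += 1
    return fitness

# Example: the robot first visits the black area (turning the light on),
# then spends five steps on the gray area.
example = [("plain", False), ("black", True)] + [("gray", True)] * 5
print(score_trial(example))   # -> 5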

iCub language comprehension

Author  Stefano Nolfi, Tomassino Ferrauto

Video ID : 41

iCub robots executing imperative commands. iCub robots, provided with a camera and with proprioceptive and tactile sensors, react to imperative sentences such as "touch the yellow object" or "grasp the red object" by executing the corresponding behaviors. Robots evolved for the ability to understand and execute a subset of all the sentences that can be generated by combining three action words (reach, touch, and grasp) and three object words (red, yellow, and blue) also display an ability to comprehend and execute sentences never experienced before.
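The generalization test described above amounts to evolving the robots on a subset of the 3 x 3 sentence space and testing them on the held-out combinations. The sketch below only illustrates such a split together with a possible one-hot sentence encoding; the specific held-out sentences and the encoding are assumptions, not those of the experiment.

# Illustrative split of the 3 x 3 command space into training and held-out
# sentences, with a simple one-hot encoding of each two-word command.
from itertools import product

ACTIONS = ["reach", "touch", "grasp"]
OBJECTS = ["red", "yellow", "blue"]

def encode(action, obj):
    """One-hot encode a two-word imperative sentence as a 6-element vector."""
    return ([1 if a == action else 0 for a in ACTIONS] +
            [1 if o == obj else 0 for o in OBJECTS])

all_sentences = list(product(ACTIONS, OBJECTS))        # 9 combinations
held_out = [("grasp", "blue"), ("touch", "red")]       # arbitrary choice here
training = [s for s in all_sentences if s not in held_out]

print(len(training), "training sentences,", len(held_out), "held out")
print("encoding of ('touch', 'yellow'):", encode("touch", "yellow"))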

Resilient machines through continuous self-modeling

Author  Josh Bongard, Victor Zykov, Hod Lipson

Video ID : 114

This video demonstrates a typical experiment with a resilient machine, which continuously synthesizes a model of its own body from the outcomes of its actions and uses that self-model to generate new behaviors, for example after damage.
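A heavily simplified sketch of such a continuous self-modeling cycle follows: the robot alternates between selecting actions that disambiguate a population of candidate self-models and refining those models against the observed outcomes, and the best surviving model is then used to synthesize behavior. Every function and parameter below is a hypothetical placeholder, not the method used in the video.

# Highly simplified self-modeling loop with placeholder functions.
import random

def predict(model, action):
    # Placeholder: a real model would simulate the robot's body executing the action.
    return model * action

def execute_on_robot(action):
    # Placeholder for running the action on the physical robot and sensing the outcome.
    return 0.7 * action + random.gauss(0.0, 0.01)

models = [random.uniform(0.0, 1.0) for _ in range(10)]   # candidate self-models
for cycle in range(5):
    # 1. Pick the action about which the candidate models disagree most.
    candidate_actions = [random.uniform(-1.0, 1.0) for _ in range(20)]
    action = max(candidate_actions,
                 key=lambda a: max(predict(m, a) for m in models) -
                               min(predict(m, a) for m in models))
    observation = execute_on_robot(action)
    # 2. Keep the models that best explain the observation, then perturb them.
    models.sort(key=lambda m: abs(predict(m, action) - observation))
    models = models[:5] + [m + random.gauss(0.0, 0.05) for m in models[:5]]
# The best surviving model would then be used to synthesize a new behavior,
# e.g., a compensating gait after damage.
print("best model parameter:", models[0])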

A swarm-bot of eight robots displaying coordinated motion

Author  Stefano Nolfi, Gianluca Baldassarre, Vito Trianni, Francesco Mondada, Marco Dorigo

Video ID : 115

Each robot is provided with an independent neural controller that determines the desired speed of its two wheels on the basis of the traction force caused by the movements of the other robots. The evolved robots display a coordinated-motion capability, independently of the way in which they are assembled, and are also able to coordinate in carrying heavy objects.
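A controller of the kind described above maps the sensed traction force onto the desired speeds of the two wheels. The sketch below is a minimal illustration of such a mapping with a small weighted network; the weights and the tanh squashing are assumptions made for illustration, since in the experiment the mapping is set by the evolutionary algorithm.

# Minimal per-robot controller: wheel speeds computed from the direction and
# intensity of the traction force sensed at the turret. Weights are illustrative.
import math

def wheel_speeds(traction_angle, traction_intensity, weights):
    """traction_angle in radians (0 = robot heading), intensity in [0, 1]."""
    inputs = [math.cos(traction_angle) * traction_intensity,
              math.sin(traction_angle) * traction_intensity,
              1.0]                                   # bias
    left = math.tanh(sum(w * x for w, x in zip(weights[0], inputs)))
    right = math.tanh(sum(w * x for w, x in zip(weights[1], inputs)))
    return left, right                               # normalized speeds

# Example with hand-picked weights: a lateral pull produces differential
# wheel speeds, so the robot turns toward the direction of the pull.
w = [[0.8, -0.6, 0.2], [0.8, 0.6, 0.2]]
print(wheel_speeds(math.radians(90), 0.5, w))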

Discrimination of objects through sensory-motor coordination

Author  Stefano Nolfi

Video ID : 116

A Khepera robot provided with infrared sensors is evolved for the ability to find and remain close to a cylindrical object randomly located in the environment. The discrimination between the two types of objects (walls and cylinders) is realized by exploiting the limit-cycle oscillatory behavior that the robot produces near the cylinder and that emerges from the robot/environment interactions (i.e., from the interplay between the way in which the robot reacts to sensory stimuli and the perceptual consequences of the robot's actions).

Evolution of cooperative and communicative behaviors

Author  Stefano Nolfi, Joachim De Greeff

Video ID : 117

A group of two e-puck robots is evolved for the capacity to reach, and then move back and forth between, two circular areas. The robots are provided with infrared sensors, a camera with which they can perceive the relative position of the other robot, a microphone with which they can sense the sound signal produced by the other robot, two motors which set the desired speed of the two wheels, and a speaker to emit sound signals. The evolved robots coordinate and cooperate on the basis of an evolved communication system that includes both implicit and explicit signals: the former are constituted by the relative positions that the robots assume in the environment, as perceived through their cameras; the latter by sound signals of varying frequency emitted through the speakers and perceived through the microphones.