Droid Logic

Dedicated to the study of biologically inspired machines

Walking Robots

Current Projects:

Humanoid Robotics

In our current research at the University of Sussex we are combining evolutionary robotics techniques with dynamical systems to build humanoid robots that walk more naturally. Our control system can actuate machines with many degrees of freedom in their joints while taking advantage of the natural dynamics of their bodies (passive dynamics). The following robot was evolved to walk on rugged surfaces:

Bumpy Walker

  Walking forward (Video)

  Walking backward (Video)

  Walking by (Video)

  Walking on a bumpy surface (Video)

  Losing balance on bumpy surface (Video)

  Walking with ankles (Video)

The following videos are of a machine with ankles, hips, a torso, a head, and arms: 25 degrees of freedom in all.



 Walk by (Video)

 Front view (Video)

 Back view (Video)

 Side/Rear view (Video)



This last machine extends the previous one with a flexible spine and toe joints, increasing the degrees of freedom to 32.

 Front view (Video)

(Simulations were developed using Apple's Xcode tools, OpenGL, and the Open Dynamics Engine.)

Mars Rover (European Space Agency)

Together with researchers from Surrey Space Centre, the University of Bath, EADS Astrium Ltd., and the British Antarctic Survey, the University of Sussex developed the control system and simulation of a six-legged rover for exploring Mars. As part of a case study on the feasibility of biologically inspired technologies, our research involved the evolution of an artificial nervous system. This was achieved using evolutionary robotics techniques and a large heterogeneous grid of Apple G5 and G4 computers. By leveraging Apple's Xgrid technology we were able to parallelize the evaluation and testing of our control systems, greatly reducing the time needed to discover viable genotypes.
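The distribute-and-evaluate pattern behind this can be sketched in Python. Xgrid itself is not shown; a thread pool stands in for the grid of Macs, and the genome encoding and fitness function below are placeholders, not the ones used in the rover study:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def evaluate(genome):
    # Placeholder fitness -- the real system decoded each genome into a
    # neurocontroller and scored it inside the rover simulation.
    return -sum((w - 0.5) ** 2 for w in genome)

def evolve(pop_size=32, genome_len=10, generations=10, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1.0, 1.0) for _ in range(genome_len)]
           for _ in range(pop_size)]
    # Xgrid played this role, spreading evaluations over many machines;
    # a thread pool illustrates the same distribute/collect pattern.
    with ThreadPoolExecutor() as pool:
        for _ in range(generations):
            scores = list(pool.map(evaluate, pop))       # parallel step
            ranked = sorted(zip(scores, pop),
                            key=lambda t: t[0], reverse=True)
            parents = [g for _, g in ranked[:pop_size // 2]]
            # Refill the population with mutated copies of the parents.
            pop = parents + [[w + rng.gauss(0, 0.1)
                              for w in rng.choice(parents)]
                             for _ in range(pop_size // 2)]
    return max(pop, key=evaluate)

best = evolve()
```

The key property is that each genome's evaluation is independent, so the slowest step of the search scales across however many machines the grid offers.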

Left: Rover walking on the Martian surface. Right: Evolved nervous system of the rover.

Left: Close-up of the eye and front-leg neurocontroller. Right: Dynamically generated terrain.

The agent was evolved over hundreds of generations to walk to a beacon placed randomly in dynamically generated terrain. To achieve its goal, the agent had to learn to follow a homing signal while walking over uneven ground and avoiding obstacles. To make this possible it was given an extensive array of senses, such as a set of laser-based proximity sensors to scan its environment.
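As an illustration of the sensing-to-steering problem the agent had to solve, here is a hand-written Braitenberg-style sketch. The angles, weights, and sensor format are invented for the example; the real rover's evolved nervous system discovered its own blend of homing and avoidance rather than using fixed rules like these:

```python
import math

def steer(beacon_bearing, proximity):
    """Blend a homing signal with laser-proximity repulsion.
    beacon_bearing: angle to the beacon in radians (0 = straight ahead,
    positive = to the right).  proximity: list of (angle, distance)
    readings from the laser sensors.  Returns a turn command in [-1, 1]."""
    # Attraction: turn toward the beacon.
    turn = 0.8 * beacon_bearing
    # Repulsion: each nearby reading pushes the heading away from it.
    for angle, dist in proximity:
        if dist < 1.0:                      # only close obstacles matter
            turn -= (1.0 - dist) * math.copysign(1.0, angle) * 0.5
    return max(-1.0, min(1.0, turn))        # clamp to steering range

# An obstacle close on the left (negative angle) pushes the rover right.
cmd = steer(beacon_bearing=0.0, proximity=[(-0.3, 0.4)])
```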

Click below to see movies of the rover walking:

walking 1
walking 2
avoiding an obstacle
walking in a crater 1
walking in a crater 2
walking along the edge of a crater

The rover was very resistant to damage of all kinds. When 30% damage was applied to every aspect of its nervous system it could still reach the beacon 85% of the time. When one of its legs was broken it could adapt its gait and still reach the beacon 75% of the time.

Movies of the rover walking after it has been damaged:

broken rear forward back
broken rear lifter
broken right front knee
broken right middle knee
broken right rear leg



Background and Previous work

Evolution of a humanoid robot

Humans demonstrate speed, efficiency, and adaptability when walking. At the University of Sussex we are combining genetic algorithms, passive dynamics, neural networks, and central pattern generators to design a humanoid robot. All our robots were constructed inside a physics simulator. The control system of the robot depicted in this video was not programmed or designed by hand: over hundreds of generations, evolution shaped a neural network that displayed the desired behavior. In our research we have been experimenting with balance, walking, and running.


In this experiment we tasked evolution with finding an artificial neural network (ANN) for a 10-degree-of-freedom robot that would allow it to balance on a moving platform. The result was a dynamic machine that could stay on the platform even under extreme circumstances.
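The shape of such a controller can be sketched as a small feedforward network mapping sensor readings to joint torques. The layer sizes and random weights below are purely illustrative; in the experiment the weights were found by evolution and the architecture differed:

```python
import math
import random

def ann_policy(weights, sensors):
    """One hidden layer maps joint/balance sensors to joint torques.
    Shapes are illustrative: 12 sensor inputs, 8 hidden units, 10 torque
    outputs (one per degree of freedom)."""
    n_in, n_hid, n_out = 12, 8, 10
    w1 = weights[:n_in * n_hid]               # input -> hidden weights
    w2 = weights[n_in * n_hid:]               # hidden -> output weights
    hidden = [math.tanh(sum(sensors[i] * w1[h * n_in + i]
                            for i in range(n_in)))
              for h in range(n_hid)]
    # tanh keeps each torque command bounded in [-1, 1]
    torques = [math.tanh(sum(hidden[h] * w2[o * n_hid + h]
                             for h in range(n_hid)))
               for o in range(n_out)]
    return torques

random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(12 * 8 + 8 * 10)]
torques = ann_policy(weights, [0.1] * 12)
```

A genetic algorithm searches over the flat weight vector, scoring each candidate by how long the simulated robot stays on the platform.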

Click on the links to see the movie:

An interesting property of this machine is that the control system will work even if the body is built incorrectly. If the legs are made shorter or the feet bigger, the machine will continue to stay balanced on the platform.


By simulating the central pattern generator found in the vertebrate spine and combining it with a passive dynamic walker (see the section below), we were able to make our machine walk. In the following video, evolution found an ANN that could direct our machine to walk on a flat surface.
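A standard minimal model of such a central pattern generator is Matsuoka's two-neuron oscillator, in which mutual inhibition plus a fatigue term produces alternating bursts, like flexor/extensor drive to a leg. The sketch below uses textbook parameter values, not those evolved in our experiments:

```python
def matsuoka_cpg(steps=3000, dt=0.005,
                 tau=0.25, T=0.5, beta=2.5, w=2.5, s=1.0):
    """Two mutually inhibiting neurons with self-fatigue.
    tau: membrane time constant, T: fatigue time constant,
    beta: fatigue strength, w: mutual inhibition, s: tonic drive."""
    u = [0.1, 0.0]   # membrane states (asymmetric start breaks symmetry)
    v = [0.0, 0.0]   # fatigue (self-inhibition) states
    out = []
    for _ in range(steps):
        y = [max(0.0, ui) for ui in u]        # rectified firing rates
        for i in range(2):
            j = 1 - i
            du = (-u[i] - beta * v[i] - w * y[j] + s) / tau
            dv = (-v[i] + y[i]) / T
            u[i] += du * dt                    # Euler integration step
            v[i] += dv * dt
        out.append(y[0] - y[1])                # antisymmetric drive signal
    return out

signal = matsuoka_cpg()
```

The alternating positive/negative output can drive a hip joint back and forth; coupling several such oscillators yields coordinated rhythms for the whole body.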


Click on the links to see the movie:

The interesting thing about this machine is that it was very robust to disturbances. If pushed, it could regain balance by changing its foot placement. As with the platform-balancing robot, if the body is built with minor errors this machine will still continue to walk.


We are now experimenting with a machine that can balance a torso while running.

Click on the links to see the movies:

To read the paper on these machines click here.

Passive Dynamic Walking

  A passive dynamic walker


People have been trying to make machines walk for a very long time. As early as 1888 people were building toys that could mechanically walk down an inclined slope with no motors: a toy soldier with straight legs that marches, or a penguin that waddles. In 1990 Tad McGeer began trying to make machines walk more like people, using knee joints. His philosophy is basically this: when the Wright brothers wanted to fly, they didn't copy a bird and all its muscles. They created an unpowered glider, and once they had perfected it they added power. Walking is a very similar problem. When we walk we use very little energy: we use the physics of our bodies to glide along and put in just enough energy to keep going. So he built a machine that, if placed on a sloped surface, could walk its length with no motors and no control system. This is similar to the way a Slinky will walk down a flight of stairs by itself. Most of McGeer's models were two-dimensional, but recently a 3D one was built at Cornell (Collins, Wisse, Ruina).


At Sussex we built a three-dimensional passive dynamic walker in a physics simulator that walks down a four-degree slope. The machine walks unpowered because the lengths and masses of its leg segments are carefully selected; to make these selections, a genetic algorithm was employed. Red boxes indicate the location and size of the masses.
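The parameter search can be sketched as a simple evolutionary loop over the segment lengths and masses. The fitness function below is a dummy stand-in; in the real experiment each candidate morphology was scored by simulating the walker on the slope in ODE and measuring how far it got before falling:

```python
import random

N_PARAMS = 8   # e.g. thigh/shank lengths and masses per leg (illustrative)

def walking_distance(params):
    # Stand-in for the physics simulation.  A dummy quadratic bowl
    # replaces the ODE rollout: some (arbitrary) parameter vector is
    # treated as the "walks furthest" morphology.
    target = [0.4, 0.45, 0.3, 0.35, 1.0, 1.2, 0.8, 0.9]
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def tune_morphology(generations=200, sigma=0.05, seed=3):
    rng = random.Random(seed)
    best = [rng.uniform(0.1, 2.0) for _ in range(N_PARAMS)]
    best_fit = walking_distance(best)
    for _ in range(generations):       # simple (1+1) evolutionary loop
        child = [max(0.05, p + rng.gauss(0, sigma)) for p in best]
        fit = walking_distance(child)
        if fit > best_fit:             # keep the child only if it walks
            best, best_fit = child, fit
    return best, best_fit

params, fitness = tune_morphology()
```

The same loop applies whether the genome encodes controller weights or, as here, the body itself; only the decoding and the simulator change.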

Click on the links to see the movie:

This machine was built in stages by adding individual features:
  1. Spherical feet and multiple knees
  2. Ankles
  3. Hips
To read the paper on these machines click here.

Evolution of Robotic Arm control

Bilaterally symmetric, segmented neural networks control the arm depicted in these videos. Each joint has an identical network that communicates locally with adjacent joints. The interesting thing here is that a collection of identical individual networks can control arms with various joint configurations; they end up working together, like a swarm mind, to move the hand to its target.
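The local-communication idea can be sketched as one update rule copied to every joint. The gains and message scheme here are invented for illustration; the real per-joint controllers were evolved neural networks:

```python
def segmented_controller(joint_errors, rounds=5, gain=0.5, coupling=0.3):
    """Every joint runs the SAME rule, seeing only its own angle error
    and the state of its immediate neighbours -- no global controller.
    Returns a per-joint corrective command after a few message rounds."""
    n = len(joint_errors)
    state = [0.0] * n
    for _ in range(rounds):
        new = []
        for i in range(n):
            left = state[i - 1] if i > 0 else 0.0       # neighbour input
            right = state[i + 1] if i < n - 1 else 0.0
            new.append(gain * joint_errors[i] + coupling * (left + right))
        state = new                                     # synchronous update
    return state

# The same rule scales from a 3-jointed arm to a 10-jointed tentacle:
arm = segmented_controller([0.2, -0.1, 0.05])
tentacle = segmented_controller([0.1] * 10)
```

Because no unit knows the arm's total length, adding or removing joints changes nothing in the controller itself, which is what lets one design drive both morphologies.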


3-jointed arm
10-jointed tentacle

Click on the links to see the movies:

To read the paper on these machines click here.

Other projects:

Open Dynamics Engine (ODE)
Port of ODE to OSX (with extra examples)

Introductory presentation on ODE given at the University of Sussex

Sony AIBO OPEN-R Development Kit for OSX

Open Computer Vision (OpenCV) for AIBO (now maintained by AiboPet)



A culmination of 10 years of research at Honda; Asimo can shake hands and walk up stairs.

AIBOs are Sony's initial venture into robotic pets. At first glance they seem like just a toy, but in reality they are a robotics platform. They are built on a technology called OPEN-R that allows developers to create new software for them.

Not wanting to be outdone by Honda's Asimo, Sony built its own humanoid, called the Sony Dream Robot (SDR). In keeping with Sony's tradition of miniaturization, the SDR is just under a foot tall and will be marketed as a toy. Like the AIBO, it is also built on the OPEN-R platform. Check out the movies:
movie1 movie2 movie3 movie4 movie5


Humanoid Robotics
This is a link to Rodney Brooks' Humanoid Research Group. Rodney began with insect robots and then moved directly to humanoids. His current robot, Cog, can recognize faces and maintain eye contact. Unlike most robots, which are carefully programmed to pick up objects, Cog learned to do this task much like a human child. See the video.



One day Mark Tilden took apart several Sony Walkmans and built a six-legged walking robot with them. His philosophy: to make an intelligent robot, give it an intelligent body. What most people would try to do with complex software, Tilden designs with simple electronics. He invented the "Nervous Net", a collection of electronics that strives to balance itself towards a goal (similar to an artificial neural net). See the video.


MIT Leg Laboratory
This is the place where MIT teaches robots to take their first steps. Their robots are very different from Asimo and the SDR. Inspired by biology, M2 uses passive dynamics and simulated tendons to walk. video

This company, founded by Rodney Brooks, builds robots based on his subsumption architecture. Traditional AI takes a top-down approach, often studying high-level brain functions such as logical reasoning to understand intelligence. Subsumption takes the opposite stance: intelligence emerges from the bottom up, with simple systems, including the body, all working together.


Other dedicated sites:
Android World
Created by Eric Vaughan