It’s been suggested that an advance party of robots will be needed if humans are ever to settle on other planets. Sent ahead to create conditions favourable for humankind, these robots will need to be tough, adaptable and recyclable if they’re to survive within the inhospitable cosmic climates that await them.
Collaborating with roboticists and computer scientists, my team and I have been working on just such a set of robots. Produced by 3D printers – and assembled autonomously – the robots we’re creating continually evolve, rapidly optimising themselves for the conditions they find themselves in.
Our work represents the latest progress towards the kind of autonomous robot ecosystems that could help build humanity’s future homes, far away from Earth and far away from human oversight.
Robots rising
Robots have come a long way since our first clumsy forays into artificial movement many decades ago. Today, companies such as Boston Dynamics produce ultra-efficient robots which load trucks, build pallets, and move boxes around factories, undertaking tasks you might think only humans could perform.
Despite these advances, designing robots to work in unknown or inhospitable environments – like exoplanets or deep ocean trenches – still poses a considerable challenge for scientists and engineers. Out in the cosmos, what shape and size should the ideal robot be? Should it crawl or walk? What tools will it need to manipulate its environment – and how will it survive extremes of pressure, temperature and chemical corrosion?
This may be an impossible brainteaser for humans, but nature has already solved the problem. Darwinian evolution has produced millions of species that are exquisitely adapted to their environments. And while biological evolution takes millions of years, artificial evolution – modelling evolutionary processes inside a computer – can take place in hours, or even minutes. Computer scientists have been harnessing its power for decades, producing designs ranging from gas nozzles to satellite antennas that are ideally suited to their function.
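To give a flavour of what this looks like in practice, here is a minimal, purely illustrative Python sketch of the mutate-evaluate-select loop at the heart of artificial evolution. The fitness function and numbers are toy placeholders, not anything from a real design problem.

```python
import random

# Toy fitness: how close a candidate "design parameter" gets to an unknown
# optimum. In real applications this would be a physics simulation or a
# hardware test, not a one-line formula.
OPTIMUM = 0.73
def fitness(x):
    return -abs(x - OPTIMUM)

# Start with a random population of candidate designs.
population = [random.uniform(0, 1) for _ in range(20)]

for generation in range(100):
    # Selection: keep the better half of the population.
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]
    # Variation: refill the population with mutated copies of the survivors.
    children = [min(1.0, max(0.0, p + random.gauss(0, 0.05))) for p in survivors]
    population = survivors + children

print(f"best design after evolution: {max(population, key=fitness):.3f}")
```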
But today’s artificial evolution of moving, physical objects still demands a great deal of human oversight, with a tight feedback loop between robot and human. If artificial evolution is to design a useful robot for exoplanetary exploration, we’ll need to remove the human from the loop. In essence, evolved robot designs must manufacture, assemble and test themselves autonomously – untethered from human oversight.
Unnatural selection
Any evolved robots will need to be capable of sensing their environment and of moving in diverse ways – for example using wheels, jointed legs, or even a mixture of the two. And to address the inevitable reality gap that arises when a design is transferred from software to hardware, it is also desirable for at least some evolution to take place in hardware – within an ecosystem of robots that evolve in real time and real space.
The Autonomous Robot Evolution (ARE) project addresses exactly this, bringing together scientists and engineers from four universities in an ambitious four-year project to develop this radical new technology.
Robots will be “born” through the use of 3D manufacturing. We use a new kind of hybrid hardware-software evolutionary architecture for design, which means that every physical robot has a digital clone. Physical robots are performance-tested in real-world environments, while their digital clones enter a software programme where they undergo rapid simulated evolution. This hybrid system introduces a novel type of evolution: new generations can be produced from a union of the most successful traits of a virtual “mother” and a physical “father”.
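The recombination step can be pictured with a small, invented Python sketch. The genome here is a made-up dictionary of traits; the real ARE genomes encode full body-plans and controllers, but the principle of mixing a virtual “mother” with a physical “father” is the same.

```python
import random

# Hypothetical genomes: each robot described by a handful of traits.
virtual_mother = {"wheels": 2, "legs": 0, "sensor_range": 1.8, "gait_speed": 0.9}
physical_father = {"wheels": 0, "legs": 4, "sensor_range": 1.2, "gait_speed": 0.4}

def crossover(mother, father):
    """Build a child genome by inheriting each trait from one parent at random."""
    return {trait: random.choice([mother[trait], father[trait]]) for trait in mother}

def mutate(genome, rate=0.1):
    """Occasionally perturb a numeric trait so genuinely new variations can appear."""
    return {t: (v + random.gauss(0, 0.1) if isinstance(v, float) and random.random() < rate else v)
            for t, v in genome.items()}

child = mutate(crossover(virtual_mother, physical_father))
print(child)  # the child is both simulated and 3D-printed for real-world testing
```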
As well as being rendered in our simulator, “child” robots produced via our hybrid evolution are also 3D-printed and introduced into a real-world, creche-like environment. The most successful individuals within this physical training centre make their “genetic code” available for reproduction and for the improvement of future generations, while less “fit” robots can simply be hoisted away and recycled into new ones as part of an ongoing evolutionary cycle.
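Conceptually, the creche works something like the hypothetical loop sketched below: robots whose measured fitness clears a threshold pass their genomes into the breeding pool, while the rest are stripped for parts. The field names and the threshold are assumptions made purely for illustration.

```python
# Hypothetical creche cycle: successful robots contribute their genomes to the
# gene pool; unsuccessful ones are disassembled and their organs reused.
def creche_cycle(robots, gene_pool, threshold=0.6):
    recycled_parts = []
    for robot in robots:
        score = robot["measured_fitness"]          # from real-world trials, not simulation
        if score >= threshold:
            gene_pool.append((score, robot["genome"]))   # eligible to reproduce
        else:
            recycled_parts.extend(robot["organs"])       # returned to the toolbox
    return gene_pool, recycled_parts

robots = [
    {"measured_fitness": 0.8, "genome": {"legs": 4}, "organs": ["camera", "hip_joint"]},
    {"measured_fitness": 0.3, "genome": {"wheels": 2}, "organs": ["wheel_motor"]},
]
gene_pool, spares = creche_cycle(robots, gene_pool=[])
print(gene_pool, spares)
```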
Two years into the project, significant advances have been made. From a scientific perspective, we have designed new artificial evolutionary algorithms that have produced a diverse set of robots that drive or crawl, and can learn to navigate through complex mazes. These algorithms evolve both the body-plan and brain of the robot.
The brain contains a controller that determines how the robot moves, interpreting sensory information from the environment and translating this into motor controls. Once the robot is built, a learning algorithm quickly refines the child brain to account for any potential mismatch between its new body and its inherited brain.
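A stripped-down sketch of that idea, with invented numbers and a simple weighted-sum controller standing in for the real thing, looks like this: the inherited weights come from the parents’ genomes, and a basic hill-climbing learner then nudges them to suit the newly built body.

```python
import random

def controller(sensors, weights):
    """Map sensor readings to motor commands with a simple weighted sum.
    The project's real controllers are far richer; the point here is only
    that the genome supplies the initial parameters."""
    return [sum(w * s for w, s in zip(row, sensors)) for row in weights]

def trial_performance(weights):
    """Stand-in for a short locomotion trial on the physical robot: here we
    just reward motor outputs close to an arbitrary target pattern."""
    sensors = [0.5, 0.2, 0.9]
    target = [1.0, -0.5]
    motors = controller(sensors, weights)
    return -sum((m - t) ** 2 for m, t in zip(motors, target))

# Inherited brain: weights passed down from the parents' genomes.
weights = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]

# Post-birth learning: hill-climb the weights to fit the new body.
best = trial_performance(weights)
for _ in range(500):
    candidate = [[w + random.gauss(0, 0.05) for w in row] for row in weights]
    score = trial_performance(candidate)
    if score > best:
        weights, best = candidate, score

print(f"refined controller performance: {best:.4f}")
```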
From an engineering perspective, we have designed the “RoboFab” to fully automate manufacturing. This robotic arm attaches wires, sensors and other “organs” chosen by evolution to the robot’s 3D-printed chassis. We designed these components to facilitate swift assembly, giving the RoboFab access to a big toolbox of robot limbs and organs.
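In software terms, the genome’s choice of organs can be thought of as a build plan for the assembly arm to execute – something like the hypothetical sketch below, where the part names and toolbox layout are invented for illustration rather than taken from the real RoboFab inventory.

```python
# Hypothetical build plan: translate the organs an evolved genome asks for
# into pick-and-place steps for a robotic assembly arm.
TOOLBOX = {
    "wheel_motor": "bay_A1",
    "leg_joint": "bay_A2",
    "camera": "bay_B1",
    "distance_sensor": "bay_B2",
}

def build_plan(genome_organs):
    steps = ["print_chassis"]                     # the chassis comes from the 3D printer
    for organ, mount_point in genome_organs:
        bay = TOOLBOX[organ]
        steps.append(f"pick {organ} from {bay}")
        steps.append(f"attach {organ} at {mount_point} and wire it")
    return steps

for step in build_plan([("wheel_motor", "rear_left"), ("camera", "front")]):
    print(step)
```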
Waste disposal
The first major use case we plan to address is deploying this technology to design robots that can clean up legacy waste inside a nuclear reactor – like the one seen in the TV miniseries Chernobyl. Using humans for this task is both dangerous and expensive, and suitable robotic solutions have yet to be developed.
Looking forward, the long-term vision is to develop the technology sufficiently to enable the evolution of entire autonomous robotic ecosystems that live and work for long periods in challenging and dynamic environments without the need for direct human oversight.
In this radical new paradigm, robots are conceived and born, rather than designed and manufactured. Such robots will fundamentally change the concept of machines, showcasing a new breed that can change their form and behaviour over time – just like us.
The ARE project is led from the University of York, partnering with Edinburgh Napier University, Bristol Robotics Laboratory and Vrije Universiteit Amsterdam. It is funded by the EPSRC under grant agreements EP/R03561X, EP/R035733, EP/R035679, and by the Vrije Universiteit Amsterdam.