6 December 2010
Imagine a robot in your home doing the dishes, while safely interacting with you and reacting naturally to your mood. Imagine a gardener robot that can charge itself by converting compost or waste water into electricity. Imagine touching and feeling a simulated liver for medical diagnosis. Sounds like science fiction? Well, according to scientists at the Bristol Robotics Laboratory (BRL), this is where we are heading.
Professor Chris Melhuish with robot Bert
This open-minded and interdisciplinary attitude is also reflected in the architecture of the lab, which is divided into open workspaces, separated by glass walls. In one corner a robot arm is moving up and down, not far away a robot rat is being put through its paces, and in another corner the robot Bert is practising speech. For Adam Spiers, one of Melhuish’s PhD students, the lab is a great place to work because ‘through its openness you can always get inspiration, in case you run out of ideas’.
One successful example of this interdisciplinary approach is the SCRATCHbot – a robot rat. Working with neuroscientists, BRL researchers have created a neural architecture in silicon based on a rat’s brain, in order to control this robot. Just like a rat, it seeks out and identifies objects using its whiskers. As a next step, researchers will develop a new sensor for an autonomous, shrew-like, whiskered robot, enabling it to track fast-moving objects; the idea behind this is to gain more insight into the brain. Researchers hope that by designing such innovative artificial touch technologies for robots they will be able to understand how the brain controls the movement of the sensory systems.
The robot rat SCRATCHbot identifies objects through its whiskers
Without having to think about it, humans naturally choose the most effective and energy-saving way of carrying out an action, so a research team at BRL has developed control mechanisms that give a robot’s arm human-like movements. ‘In industry, robots are programmed to go to a certain point. Our robot also has to do the same, but it will do so in a human-like way and without programming. It is able to think through the position of its hand,’ explains Adam Spiers.
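One classic way to model this kind of smooth, human-like reaching is the minimum-jerk trajectory, in which the hand accelerates and decelerates gradually rather than moving at constant speed. The sketch below is illustrative only – it is a standard textbook model, not necessarily the control mechanism BRL uses, and all numbers are assumptions:

```python
def minimum_jerk(x0, xf, t, T):
    """Position at time t of a point-to-point reach of duration T.

    The hand starts at rest at x0 and arrives at rest at xf, following
    the smooth 5th-order polynomial profile typical of human reaching.
    """
    tau = t / T  # normalised time in [0, 1]
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return x0 + (xf - x0) * s

# Sample a 2-second, 0.3 m reach at five equally spaced instants.
points = [minimum_jerk(0.0, 0.3, t * 0.5, 2.0) for t in range(5)]
```

The profile starts and ends with zero velocity, and the hand covers most of the distance in the middle of the movement – a bell-shaped speed curve that looks natural to a human observer, unlike the abrupt starts and stops of a conventionally programmed industrial arm.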
The link between humans and robots is another important aspect of BRL’s research. How can humans and robots work together in an easy, comfortable and, importantly, safe way? Here the emphasis is on cognitive models providing ‘behavioural safety’. This means that if a robot is aware of its environment and can learn from it, the interaction can take place at a safer level, because the robot platform is capable of controlled intelligent movements and its actions are largely predictable and understandable by humans.
Safe human–robot interaction is also the starting-point of the project CHRIS (Cooperative Human Robot Interactive Systems), which is funded by the European Commission. If a human and a robot perform co-operative tasks in a shared space, such as a kitchen, how can this be made safer in terms of verbal and non-verbal communication, perception and understanding of intention? In other words, how can a common goal, such as a robot and a human cooking something together, be reached?
‘When humans interact with others a lot of non-verbal as well as verbal communication is constantly happening: facial expression, body position, gestures, tone of voice and goal-sharing, as well as understanding and following instructions. We interpret these constantly, but unless they are missing, you never even think about them,’ explains Melhuish. Future robots will thus need a higher level of sophistication to meet these demands. This can be achieved through engineering the robot and its ‘thinking’ (cognition) so that it can perform physical tasks which involve real-world interactions.
‘Although humans find these interactions very simple to achieve, getting a machine to do the same is proving very difficult. But if we can accomplish this one day, service robots could become a part of our society,’ says Melhuish. To help achieve this, researchers at BRL also use robot heads, Jules and Eve, to explore facial expressions. These robots can copy human facial expressions. The overall goal is to build a new robot that combines many elements: gesture, gaze, facial expression, non-verbal utterances, body language and, of course, speech.
EcoBot II was able to prove its self-sustainability by running continuously for 12 days on dead flies
‘Swarm robotics’ is a new approach to co-ordinating the behaviours of a large number of relatively simple robots in a decentralised manner. As the robots in the swarm have only local perception and very limited local communication abilities, one of the challenges in designing swarm robotic systems with desired collective behaviour is to understand the effect of individual behaviour on group performance. BRL researchers are exploring techniques to design and optimise the interaction rules for a group of foraging robots that try to achieve energy efficiency collectively.
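The idea of purely local interaction rules can be sketched as a toy simulation. Everything below – the threshold rule, the energy costs, the probabilities – is an illustrative assumption for the sake of the example, not BRL’s actual algorithm:

```python
import random

random.seed(42)

class Robot:
    """One swarm member: decides locally whether to forage or rest."""

    def __init__(self):
        self.threshold = random.uniform(0.0, 1.0)  # activation threshold
        self.energy = 10.0

    def step(self, local_cue):
        # Forage only when the locally sensed food cue exceeds this
        # robot's own threshold -- no central controller is involved.
        if local_cue > self.threshold:
            if random.random() < 0.6:      # foraging trip finds food?
                self.energy += 1.0
            self.energy -= 0.5             # foraging itself costs energy
            # Lower the threshold after foraging: forage more readily.
            self.threshold = max(0.0, self.threshold - 0.05)
        else:
            self.energy -= 0.1             # resting costs little
            self.threshold = min(1.0, self.threshold + 0.01)

# Run a small swarm where each robot sees only a noisy local cue.
swarm = [Robot() for _ in range(20)]
for _ in range(100):
    cue = random.uniform(0.0, 1.0)
    for r in swarm:
        noisy = min(1.0, max(0.0, cue + random.gauss(0.0, 0.1)))
        r.step(noisy)

total = sum(r.energy for r in swarm)
print(f"swarm energy after 100 steps: {total:.1f}")
```

The point of the sketch is the design question the paragraph raises: group-level energy efficiency emerges (or fails to emerge) from these individual threshold rules, so tuning the per-robot parameters is how the collective behaviour is shaped.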
Research at BRL is forging ahead into a new, exciting and still unknown future. How long it will take to develop a robot that can do the dishes is hard to say, but the team around Professor Melhuish has made enormous progress. This impressed a recent visitor, Iain Gray, CEO of the Technology Strategy Board, a government-funded organisation that promotes innovation: ‘This is “real” science fiction happening on our doorstep in Bristol,’ he said, ‘for here we have world-leading research with world-leading researchers.’
For more information on the BRL, please visit www.brl.ac.uk or contact Professor Chris Melhuish on +44 (0)117 328 6334.