By Ellie Zolfagharifard
Source: The Engineer
Date: 26 October 2009

Humanoid robots that can assist injured people in hazardous environments are moving closer to reality as a result of research being carried out at Edinburgh University.

The three-year project is intended to combine the latest advances in mathematics and engineering to develop robotic systems that can safely handle flexible objects in extreme conditions.

At the moment, this ability is beyond the reach of motion-synthesis techniques because of the complexity of the calculations required for handling objects and avoiding collisions. These methods, which plan movements in XYZ coordinates, are sensitive to even minor deformations in the environment: a flexible object can invalidate the calculations.

The Edinburgh team has developed an alternative technique using a topological model that takes into account the position of objects in relation to one another. The system is based on Gauss’s theory of linking numbers, which quantifies how two closed curves, or threads, wind around each other.

Dr Sethu Vijayakumar, co-investigator, said: ‘By considering the topological space using this theory, we are able to capture the invariances in the environment.

‘Topology-based motion synthesis is a fairly radical change in concept for programming robots. Our hope is that it will lead to robots that act more like humans,’ he added.

Along with the Honda Research Institute Europe, the team hopes to have a prototype humanoid robot that is able to dress itself by 2013.

Vijayakumar believes that the research could have wider implications, leading to robots that could help casualties out of burning buildings or perform complex tasks in uncontrolled environments, such as nuclear clean-up operations.

Dr Taku Komura, principal investigator, said that, while the theory has progressed significantly, the engineering challenges could prove an obstacle.
‘One of the biggest difficulties we foresee is in recording movements and feeding back the information to the robots,’ he said. ‘Currently, we’re using a range of methods including a camera-based motion system and inertial techniques that record movements using accelerometers and gyroscopes.’

Vijayakumar said that further research in this field would require advances in sensing technology, particularly visual sensing systems, which he believes are currently not robust enough to relay the required information to the robot.
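To give a flavour of the mathematics involved: the Gauss linking number mentioned above can be approximated numerically by discretising Gauss’s double integral over two closed curves. The sketch below is purely illustrative and is not the Edinburgh team’s code; the curve construction and function name are assumptions for the example, which uses two interlocked circles (a Hopf link) whose linking number has magnitude 1.

```python
import numpy as np

def linking_number(curve_a, curve_b):
    """Approximate the Gauss linking number of two closed curves,
    each given as an (N, 3) array of points along the curve."""
    # Segment vectors; the curves are closed, so the last point
    # connects back to the first.
    da = np.roll(curve_a, -1, axis=0) - curve_a
    db = np.roll(curve_b, -1, axis=0) - curve_b
    # Midpoints of each segment (midpoint rule for the double integral).
    ma = curve_a + 0.5 * da
    mb = curve_b + 0.5 * db
    total = 0.0
    for i in range(len(curve_a)):
        r = ma[i] - mb                       # separation vectors, shape (M, 3)
        cross = np.cross(da[i], db)          # shape (M, 3)
        dist3 = np.linalg.norm(r, axis=1) ** 3
        total += np.sum(np.einsum('ij,ij->i', r, cross) / dist3)
    return total / (4.0 * np.pi)

# Two linked unit circles: one in the xy-plane at the origin,
# one in the xz-plane centred at (1, 0, 0).
t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
circle_a = np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
circle_b = np.stack([1.0 + np.cos(t), np.zeros_like(t), np.sin(t)], axis=1)

print(round(abs(linking_number(circle_a, circle_b))))  # → 1 (linked once)
```

Because the linking number depends only on how the curves are entangled, not on their exact coordinates, it stays constant under the small deformations that break XYZ-based planners; this invariance is what the topological approach exploits.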