Leeds researchers have been awarded £300K by EPSRC to build robots that can reason about physics like humans.

How do you grasp a bottle of milk nestling behind some yoghurt pots in a cluttered fridge? Whilst humans can use visual information to plan and select such skilled actions with external objects with great ease and rapidity – a facility acquired over the history of the species and honed as each child develops – *robots struggle*. Indeed, whilst artificial intelligence has made great leaps in beating the best of humanity at tasks such as chess and Go, the planning and execution abilities of today's robotic technology are trumped by the average toddler.

Given the complex and unpredictable world in which we find ourselves, these apparently trivial tasks are the product of highly sophisticated neural computations that generalise and adapt to changing situations, continually selecting between multiple goals and action options. Our aim is to investigate how such computations could be transferred to robots, enabling them to manipulate objects more efficiently and in a more human-like way than at present, and to perform manipulations that are currently beyond the state of the art.
The investigators on this project are Anthony Cohn (Principal Investigator) and Co-Investigators He Wang, Mark Mon-Williams, Faisal Mushtaq, Matteo Leonetti and Mehmet R Dogar.