Know your motive: new research considers the next wave of robots
It is becoming crucial for robots to understand the motive behind a task in the same way humans do, as this will enable machines to work more quickly and effectively. It also marks a significant shift in the world of robotics.
This consideration is at the heart of a new article by the National Centre for Nuclear Robotics, based at the University of Birmingham. The paper, published in Nature Machine Intelligence, looks at how robots pick up and handle objects.
Lead author Dr Valerio Ortenzi, of the University of Birmingham, argues the shift in thinking will be necessary as more organisations embrace automation, connectivity and digitisation (Industry 4.0), and as levels of human-robot interaction, whether in factories or homes, increase dramatically.
Compared with the cognitive abilities of humans, most factory machines designed to pick up objects are straightforward: their only task is to grasp familiar objects that appear in predetermined places at a set time.
However, enabling a machine to pick up unfamiliar objects, randomly presented, requires the interaction of multiple, complex technologies.
These include vision systems and advanced AI, so the machine can see the target, determine its properties and work out how to pick the object up without damaging it.
Even when all this is accomplished, researchers at the National Centre for Nuclear Robotics highlight a fundamental issue: what has traditionally counted as a successful grasp for a robot might be a real-world failure, because the machine does not take into account what the goal is, or why it is picking the object up.
The Nature Machine Intelligence paper cites the example of a robot in a factory picking up an object for delivery to a customer.
It successfully executes the task, holding the package securely without causing damage. Even so, the robot's gripper obscures a crucial barcode, which means the object can't be tracked and the firm has no information about its location. As a result, the whole delivery system is impaired.
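To make that failure mode concrete, here is a minimal sketch of a grasp planner that treats barcode visibility as part of the goal rather than settling for the most stable grip. The `Grasp` class, the rectangle-based occlusion test and all the scores are illustrative assumptions, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Grasp:
    stability: float       # 0.0-1.0: how securely the object is held
    covered_region: tuple  # (x0, y0, x1, y1) patch of the surface the gripper hides

def overlaps(a, b):
    """Axis-aligned rectangle overlap test."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

def best_grasp(candidates, barcode_region):
    """Most stable grasp that still leaves the barcode visible.

    A stability-only planner would ignore barcode_region and could
    'succeed' while making the package untrackable.
    """
    visible = [g for g in candidates if not overlaps(g.covered_region, barcode_region)]
    return max(visible, key=lambda g: g.stability) if visible else None

barcode = (0.4, 0.0, 0.6, 0.1)
grasps = [
    Grasp(0.95, (0.35, 0.0, 0.65, 0.2)),  # strongest grip, but it hides the barcode
    Grasp(0.80, (0.00, 0.4, 0.20, 0.6)),  # slightly weaker grip, barcode stays visible
]
print(best_grasp(grasps, barcode))  # picks the 0.80 grasp
```

A planner optimising stability alone would choose the first grasp and "succeed" while silently breaking the tracking system, which is exactly the mismatch the paper describes.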
Dr Ortenzi and his co-authors give other examples involving robots working alongside people.
One example involves a person asking a robot to pass them a screwdriver in a workshop. Based on current conventions, the best way for a robot to pick up the tool is by the handle. However, this could mean the machine, which is much stronger than the person, thrusts a potentially lethal blade towards them at speed.
If the robot instead understood its objective, for instance to pass the screwdriver safely to its human colleague, it could take a safer course of action, such as gripping the blade so that the handle is offered to the person.
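Purely as an illustration of that idea (the grasp candidates, task labels and weights below are hypothetical, not the paper's method), a task-aware planner can fold the purpose of the grasp into its score, so a handover is won by the grasp that offers the handle even though it grips less securely:

```python
# Hypothetical grasp candidates for a screwdriver, scored against the task.
CANDIDATES = {
    "by_handle": {"stability": 0.9, "blade_enclosed": False, "handle_offered": False},
    "by_blade":  {"stability": 0.6, "blade_enclosed": True,  "handle_offered": True},
}

def score(grasp, task):
    s = grasp["stability"]
    if task == "use_tool":
        return s  # driving a screw: grip strength on the handle is all that matters
    if task == "handover":
        if not grasp["blade_enclosed"]:
            s -= 0.5  # exposed blade pointed at the receiver: heavy safety penalty
        if grasp["handle_offered"]:
            s += 0.3  # receiver can take hold of the handle comfortably
        return s
    raise ValueError(f"unknown task: {task}")

def choose(task):
    return max(CANDIDATES, key=lambda name: score(CANDIDATES[name], task))

print(choose("use_tool"))  # -> 'by_handle'
print(choose("handover"))  # -> 'by_blade': handle presented to the person
```

The design point is that the same candidate grasps are ranked differently once the goal is part of the metric, which is the shift in thinking the authors call for.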
A final example involves a robot asked to pass a glass of water to a resident in a care home. This seemingly simple task requires the robot to ensure that it doesn't drop the glass, that water doesn't spill on the recipient, and that the glass is presented in such a way that the person can take hold of it.
What is obvious to humans has to be programmed into a machine, and this requires a very different approach to simple automation, the researchers say.
The traditional metrics that researchers have used over the past twenty years to assess robotic manipulation are not sufficient.
QUT robotics researcher and director of the Australian Centre for Robotic Vision, Professor Peter Corke, says the ability of robots to interact physically with people, handing them things they want in a way that is comfortable and efficient, is an important step forward.
Future robots will be expected to work with humans in a natural and human-like way, Corke says.
The research was carried out in collaboration with QUT, the University of Birmingham, Scuola Superiore Sant'Anna, Italy, the German Aerospace Center (DLR), Germany, and the University of Pisa, Italy.