News in Brief

These robots can follow how-to diagrams

The AI can figure out what action is needed by analyzing before-and-after images

2:00pm, January 16, 2019

STEP-BY-STEP  A new robot operating system helps bots understand picture-based instructions, such as the diagram shown here for how the red toy boxer can defeat the blue one, to perform various tasks.


Robots imbued with a certain kind of common sense may soon be able to follow instructional diagrams to build things.

When studying pictures for assembling IKEA furniture or LEGO villages, humans are naturally good at inferring how to get from A to B. Robots, on the other hand, normally have to be painstakingly programmed with exact instructions for how to move. “Even when you try to teach robots by demonstration, they’re just repeating the exact same motions you show them, not the concept underlying them,” says Dileep George, an artificial intelligence and neuroscience researcher at the San Francisco company Vicarious AI.

George and colleagues have now designed a robot operating system that can understand the basic ideas conveyed in schematic instructions and translate those ideas into action. These common-sense robots, described online January 16 in Science Robotics, could work on a wider variety of tasks under different conditions than machines restricted to explicitly coded instructions or physical demonstrations.

The new robotic system learned more than 500 general concepts, such as “stack green objects on the right” and “arrange objects in a circle,” by studying before-and-after images for each type of action. When given a new set of instructions with a before-and-after diagram, the fully trained system considers all the concepts it has learned, then chooses and executes the maneuvers that will help it reach its goal.
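The selection step described above can be sketched in miniature. This is a hypothetical illustration, not Vicarious AI's actual system (the paper's method learns "cognitive programs" from images): here each learned concept is a simple function on a tabletop state, and the robot infers which concept explains a before-and-after pair by testing each concept's effect. All names (`move_limes_right`, `infer_concept`, and so on) are invented for this sketch.

```python
# Toy sketch of concept inference from a before-and-after pair.
# A tabletop state maps each object name to an (x, y) grid cell.
BEFORE = {"lemon": (0, 0), "lime": (1, 0)}
AFTER = {"lemon": (0, 0), "lime": (3, 0)}


def move_limes_right(state):
    """Concept: shift every lime two cells to the right."""
    return {name: ((x + 2, y) if "lime" in name else (x, y))
            for name, (x, y) in state.items()}


def stack_on_origin(state):
    """Concept: pile every object onto cell (0, 0)."""
    return {name: (0, 0) for name in state}


CONCEPTS = {
    "move limes right": move_limes_right,
    "stack on origin": stack_on_origin,
}


def infer_concept(before, after):
    """Return the name of the learned concept whose outcome matches `after`."""
    for name, apply_concept in CONCEPTS.items():
        if apply_concept(before) == after:
            return name
    return None


print(infer_concept(BEFORE, AFTER))  # -> move limes right
```

A real system searches a far larger space of concepts and works from raw images rather than symbolic states, but the core loop is the same: hypothesize a concept, simulate its effect, and keep the one consistent with the goal diagram.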

George’s team tested the operating system in two gripper-arm robots that moved objects across a tabletop. The robots examined picture instructions and then completed tasks such as separating lemons from limes and arranging different-colored cans in a row. The machines also worked well when researchers changed conditions like the type of objects to be moved or the color of the tabletop.


M. Lázaro-Gredilla et al. Beyond imitation: Zero-shot task transfer on robots by learning concepts as cognitive programs. Science Robotics. Published online January 16, 2019. doi: 10.1126/scirobotics.aav3150.

