
Viterbi School Robot Hand Picks Up a Prestigious Best Paper Prize

Three young researchers from the Computational Learning and Motor Control Lab demonstrate a superior grasp of programming gentle and accurate touch
Eric Mankin
November 22, 2011 —

A hand reaches down to a table and picks up a thin pencil or a fragile cup or a heavy box. Human hands, guided by experience, easily make the necessary adjustments. And recently a group of young Viterbi researchers won a best paper prize at a top international robotics conference for teaching robots to do the same.

Handlers and hand: (from left) Ludovic Righetti, Peter Pastor and Mrinal Kalakrishnan.

Peter Pastor and Mrinal Kalakrishnan are Ph.D. students in Professor Stefan Schaal’s Computational Learning and Motor Control Laboratory. Together with Schaal and postdoctoral researcher Ludovic Righetti, they co-authored a paper entitled “Online movement adaptation based on previous sensor experience,” one of some 790 papers accepted, out of 2,541 submitted, at the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2011) in San Francisco. The paper went on to win the conference’s best paper prize.

Pastor explained that the grasps programmed in their robots are vastly different from the grasps robots now routinely perform in, for example, automobile assembly, where the properties of the materials the robots handle are known. In those circumstances, every movement and the force behind it can be precisely programmed in advance.

Righetti said that for robots working in an everyday environment, the parameters are open and unknown. “There will be a table and random objects, and the robot has to go grasp them and put them in a particular location.”

“The robot looks where objects could be,” said Pastor. “It scans the table and finds a cluster of data.” Then it uses 3D visual processing to try to decode what it sees.

Still from video showing the hand in action.
One cluster of data decodes as a cup, Pastor continued. “So assuming it is a cup, it has to decide on a grasp pose: how to grasp the cup – from the side, from the top, on the handles; and then how much force. And then we have motion planning algorithms. We have to compute a collision-free trajectory, so the hand doesn’t smash the table.”
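The article does not include the lab’s code, but the pipeline Pastor describes can be sketched in a few lines of Python. Everything below is illustrative: the function names, the 5 cm clustering radius, and the top-down grasp heuristic are assumptions for the sketch, not the team’s implementation.

```python
import numpy as np

def segment_objects(points, table_height=0.0, cluster_radius=0.05):
    """Group 3D scan points above the table plane into candidate objects.

    A toy stand-in for the "find a cluster of data" step: points more
    than 5 mm above the table are grouped by single-linkage clustering.
    """
    above = points[points[:, 2] > table_height + 0.005]
    clusters, unassigned = [], list(range(len(above)))
    while unassigned:
        seed = unassigned.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in unassigned
                    if np.linalg.norm(above[i] - above[j]) < cluster_radius]
            for j in near:
                unassigned.remove(j)
            cluster.extend(near)
            frontier.extend(near)
        clusters.append(above[cluster])
    return clusters

def top_grasp_pose(cluster):
    """Propose a simple top-down pre-grasp pose above the cluster's centroid."""
    centroid = cluster.mean(axis=0)
    approach = np.array([0.0, 0.0, -1.0])   # approach straight down
    pregrasp = centroid - 0.10 * approach   # hover 10 cm above the object
    return pregrasp, approach

# A fake scan: a flat table plus a small blob of points standing in for a cup.
rng = np.random.default_rng(0)
table = np.column_stack([rng.uniform(0, 1, 200),
                         rng.uniform(0, 1, 200),
                         np.zeros(200)])
cup = rng.normal([0.5, 0.5, 0.06], 0.01, size=(50, 3))
for obj in segment_objects(np.vstack([table, cup])):
    print("object at", obj.mean(axis=0), "pre-grasp at", top_grasp_pose(obj)[0])
```

A real system would then hand the pre-grasp pose to a motion planner to compute the collision-free trajectory Pastor mentions, so the hand doesn’t smash the table.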

Once robotic hands can negotiate the world, the possibilities are wide, from caring for the elderly to caring for pets to performing all kinds of routine tasks. But first the robot hands have to be safe, and to be safe, the paper states, they need to be able to instantly adapt their behavior to unforeseen events: to things that may look like a cup but are something quite different.

The program has the robots forming expectations about the objects they pick up based on past experience, but they need to be able to adjust their behavior instantly if those expectations don't match what their fingers feel.

The prize-winning paper describes how the team first gave the robotic system, an off-the-shelf piece of equipment called a Barrett WAM robot arm, the ability to learn from experience on specific everyday tasks, such as opening a door with a lever handle or picking up a pen sitting on a desk without knocking over an adjacent flashlight.

Professor Stefan Schaal.

Kalakrishnan explained, “The method was to guide the robot arm through the task while its senses took in and internalized the data accompanying the task at every step of the way.”

Visual, kinesthetic, force and touch data were stored with learned movement representations called “Dynamic Movement Primitives” (DMPs), with the aim of having the robot anticipate the basic parameters of the action: how much force to apply and how to move.
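In the standard formulation from Schaal’s group (Ijspeert et al.), a DMP is a damped spring system pulled toward a goal plus a learned forcing term. The minimal one-dimensional Python sketch below uses illustrative gains and basis-function choices, not the paper’s values; it shows both halves of the idea: fit learns a movement from one guided demonstration, and rollout reproduces it toward a possibly new goal.

```python
import numpy as np

class DMP:
    """Minimal one-dimensional Dynamic Movement Primitive (illustrative)."""

    def __init__(self, n_basis=20, alpha=25.0, tau=1.0):
        self.alpha, self.beta = alpha, alpha / 4.0   # critically damped spring
        self.alpha_x = alpha / 3.0                   # canonical-system decay
        self.tau = tau
        self.centers = np.exp(-self.alpha_x * np.linspace(0, 1, n_basis))
        self.widths = n_basis ** 1.5 / self.centers / self.alpha_x
        self.weights = np.zeros(n_basis)

    def forcing(self, x):
        """Learned forcing term, gated by the phase variable x."""
        psi = np.exp(-self.widths * (x - self.centers) ** 2)
        return (psi @ self.weights) * x / (psi.sum() + 1e-10)

    def fit(self, demo, dt=0.001):
        """Learn weights from one demonstrated trajectory.

        This is the guide-the-arm-through-the-task step: the demonstration
        supplies target forcing values, fitted per basis function.
        """
        y0, goal = demo[0], demo[-1]
        yd = np.gradient(demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(len(demo)) * dt / self.tau)
        f_target = (self.tau ** 2 * ydd
                    - self.alpha * (self.beta * (goal - demo) - self.tau * yd))
        f_target /= (goal - y0) + 1e-10
        psi = np.exp(-self.widths * (x[:, None] - self.centers) ** 2)
        for i in range(len(self.weights)):
            self.weights[i] = (np.sum(psi[:, i] * x * f_target)
                               / (np.sum(psi[:, i] * x ** 2) + 1e-10))

    def rollout(self, y0, goal, dt=0.001, steps=1000):
        """Integrate the DMP forward to produce a trajectory."""
        y, z, x, traj = y0, 0.0, 1.0, []
        for _ in range(steps):
            f = self.forcing(x) * (goal - y0)
            z += dt / self.tau * (self.alpha * (self.beta * (goal - y) - z) + f)
            y += dt / self.tau * z
            x += dt / self.tau * (-self.alpha_x * x)
            traj.append(y)
        return np.array(traj)

demo = np.sin(np.linspace(0, np.pi / 2, 1000))   # a toy demonstrated reach
dmp = DMP()
dmp.fit(demo)
print(dmp.rollout(0.0, 2.0)[-1])                 # ends near the new goal, 2.0
```

Because the forcing term scales with the goal, one demonstration generalizes to reaches of different lengths, which is what makes a library of such primitives reusable.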

But these expectations also had to be testable by the robot in real time, so that if reality differed from expectation it could react immediately instead of blindly following the planned path.

In the words of the paper, the idea was to create “a sensory feedback law to allow DMPs to realize online movement adaptation.”
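In code, such a feedback law can be as small as one extra coupling term in the DMP’s integration step. The sketch below is a deliberately simplified stand-in for the paper’s law: the real method builds its expectations from previously recorded sensor traces along the whole movement, while here a single assumed gain, k_feedback, turns the force-prediction error into a correction.

```python
def dmp_step_with_feedback(y, z, goal, expected_force, measured_force,
                           dt=0.001, tau=1.0, alpha=25.0, beta=6.25,
                           k_feedback=0.5):
    """One DMP integration step with a simple sensory coupling term.

    If the fingers feel more force than past experience predicts
    (unexpected contact), the coupling pushes the motion back; if they
    feel less (the object isn't where it should be), the motion presses on.
    """
    coupling = -k_feedback * (measured_force - expected_force)
    z_dot = (alpha * (beta * (goal - y) - z) + coupling) / tau
    y_dot = z / tau
    return y + dt * y_dot, z + dt * z_dot

# Toy run: partway through a reach, the hand meets an unexpected obstacle.
y, z = 0.0, 0.0
for _ in range(1000):
    measured = 100.0 if 0.4 < y < 0.6 else 0.0   # phantom contact force
    y, z = dmp_step_with_feedback(y, z, goal=1.0,
                                  expected_force=0.0, measured_force=measured)
print(round(y, 3))  # contact delays progress through the 0.4-0.6 band
```

In the paper itself the expected force is not a constant but a profile recorded along the movement from past executions, so the robot can tell normal contact, like fingers closing on a cup, from an anomaly worth reacting to.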

The hope is to build on this success by expanding the repertory and flexibility of DMP knowledge and adaptation. The hand may not be able to gently pick up and stroke a kitten tomorrow – but give it time.

The National Science Foundation and the Defense Advanced Research Projects Agency program on Autonomous Robotic Manipulation supported the research.

Listen to an interview with Peter Pastor about the research on KPFK Digital Culture.