
Robots Get a Feel for the World at USC Viterbi

Robots equipped with a tactile sensor can identify materials through touch better than humans can, enabling more lifelike prosthetics

June 18, 2012 —

What does a robot feel when it touches something? Little or nothing until now. But with the right sensors, actuators and software, robots can be given the sense of feel, or at least the ability to identify materials by touch.

Researchers at the University of Southern California’s Viterbi School of Engineering published a study today in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.

[Image: A robot hand equipped with SynTouch's BioTac sensors.]

The robot was equipped with a new type of tactile sensor built to mimic the human fingertip. It also used a newly designed algorithm that decides how to explore the outside world by imitating human exploratory strategies. Beyond texture, the sensor can tell where and in which direction forces are applied to the fingertip, and even sense the thermal properties of an object being touched.

Like the human finger, the group’s BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the BioTac is even more sensitive.
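
As a rough illustration of how such vibration signals can be turned into texture cues, the sketch below (Python, not code from the study) assumes the hydrophone output is available as a plain array of samples and reduces it to two simple spectral features, overall vibration power and a dominant-frequency estimate; coarse and fine surfaces tend to differ on both.

```python
import numpy as np

def vibration_features(samples, sample_rate):
    """Summarize one sliding-movement vibration trace with two spectral features.

    `samples` is assumed to be a 1-D array of hydrophone readings recorded
    while the fingertip slides over a surface (hypothetical data format).
    """
    windowed = samples * np.hanning(len(samples))           # reduce edge artifacts
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

    power = float(np.sum(spectrum ** 2))                             # vibration intensity
    centroid = float(np.sum(freqs * spectrum) / np.sum(spectrum))    # "dominant" frequency

    return power, centroid

# Toy check: a strong 40 Hz ridged texture versus a quieter, finer one.
t = np.linspace(0, 1, 2000, endpoint=False)
coarse = np.sin(2 * np.pi * 40 * t)
fine = 0.2 * np.sin(2 * np.pi * 300 * t)
print(vibration_features(coarse, 2000))   # high power, low centroid
print(vibration_features(fine, 2000))     # low power, high centroid
```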

When humans try to identify an object by touch, they use a wide range of exploratory movements based on their prior experience with similar objects. A famous theorem by the 18th-century mathematician Thomas Bayes describes how decisions might be made from the information obtained during these movements. Until now, however, there was no systematic way to decide which exploratory movement to make next. The article, authored by Professor of Biomedical Engineering Gerald Loeb and recently graduated doctoral student Jeremy Fishel, describes their new approach to this general problem, which they call “Bayesian Exploration.”
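
The core of that idea can be sketched in a few lines. The code below is a simplified illustration, not the authors' implementation: it keeps a probability over candidate materials, scores each possible exploratory movement by how much it is expected to shrink the uncertainty of that belief, and updates the belief with Bayes' rule after each observation. The movement names and likelihood tables are invented for the example.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -float(np.sum(p * np.log(p)))

def bayes_update(belief, likelihoods, outcome):
    """likelihoods[m, o] = P(outcome o | material m, this movement)."""
    posterior = belief * likelihoods[:, outcome]
    return posterior / posterior.sum()

def expected_entropy(belief, likelihoods):
    """Average uncertainty left after this movement, over its possible outcomes."""
    p_outcome = belief @ likelihoods            # predictive distribution over outcomes
    return sum(p_outcome[o] * entropy(bayes_update(belief, likelihoods, o))
               for o in range(likelihoods.shape[1]) if p_outcome[o] > 0)

def choose_movement(belief, movements):
    """Pick the exploratory movement expected to leave the least uncertainty."""
    return min(movements, key=lambda name: expected_entropy(belief, movements[name]))

# Toy example: three candidate materials, two exploratory movements,
# each movement producing one of three discretized sensor outcomes.
belief = np.ones(3) / 3
movements = {
    "light slide": np.array([[0.8, 0.1, 0.1],
                             [0.1, 0.8, 0.1],
                             [0.1, 0.1, 0.8]]),
    "firm press":  np.array([[0.5, 0.5, 0.0],
                             [0.5, 0.5, 0.0],
                             [0.0, 0.5, 0.5]]),
}
print(choose_movement(belief, movements))   # "light slide" is the more informative choice here
```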

Built by Fishel, the specialized robot was trained on 117 common materials gathered from fabric, stationery and hardware stores. When confronted with one material at random, the robot could correctly identify the material 95% of the time, after intelligently selecting and making an average of five exploratory movements. It was only rarely confused by a pair of similar textures that human subjects making their own exploratory movements could not distinguish at all.
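
In the same spirit, the identification procedure can be pictured as a loop that keeps choosing informative movements until one material stands out. The sketch below reuses the helpers (and import) from the previous example and is only an illustration of the general scheme described in the article; `perform_movement` is a hypothetical callback standing in for the robot actually making the movement and reporting what it sensed.

```python
def identify_material(belief, movements, perform_movement,
                      max_movements=10, confidence=0.99):
    """Explore until one material's probability exceeds `confidence` or the budget runs out.

    `perform_movement(name)` is a hypothetical hook that executes the named
    exploratory movement on the robot and returns the discretized observation.
    """
    moves_made = 0
    while belief.max() < confidence and moves_made < max_movements:
        name = choose_movement(belief, movements)      # most informative next movement
        outcome = perform_movement(name)               # e.g. index of the observed vibration class
        belief = bayes_update(belief, movements[name], outcome)
        moves_made += 1
    return int(np.argmax(belief)), belief, moves_made
```

With well-separated likelihoods, a handful of movements is typically enough for the belief to concentrate on a single material, which mirrors the average of about five movements reported above.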

So, is touch another task that humans will outsource to robots? Fishel and Loeb point out that while their robot is very good at identifying which textures are similar to each other, it has no way to tell which textures people will prefer. Instead, they say this robot touch technology could be used in human prostheses or to assist companies that employ experts to judge the feel of consumer products and even human skin.

[Video: “Robots Get a Feel for the World” from USC Viterbi on Vimeo]

Loeb and Fishel are partners in SynTouch LLC, which develops and manufactures tactile sensors for mechatronic systems that mimic the human hand. Founded in 2008 by researchers from USC’s Medical Device Development Facility, the start-up now sells its BioTac sensors to other researchers and to manufacturers of industrial robots and prosthetic hands.

Another paper from this research group in the same issue of Frontiers in Neurorobotics describes the use of their BioTac sensor to identify the hardness of materials like rubber.

Original funding for development of the sensor was provided by the Keck Futures Initiative of the National Academy of Sciences to develop a better prosthetic hand for amputees. SynTouch also received a grant from the National Institutes of Health to integrate BioTac sensors with such prostheses. The texture discrimination project was funded by the U.S. Defense Advanced Research Projects Agency (DARPA) and the material hardness study by the National Science Foundation.

Fishel just completed his doctoral dissertation in biomedical engineering based on the texture research. Loeb, also Director of the USC Medical Device Development Facility, holds 54 U.S. patents and has published over 200 journal articles on topics ranging from cochlear implants for the deaf to fundamental studies of muscles and nerves.