Juan Fasola and the PR2 robot
You may tell your future domestic assistive robot to “go to the kitchen and get a bottle of water from the table,” but how will it decide what that means and how best to execute this request?
The robot needs to understand the grammar of such a sentence—that “go” means robot movement, that “to” is a preposition indicating a path somewhere, and that “the kitchen” is a noun phrase referring to a specific space.
How does it figure all that out, and then correctly execute the requested action? Because people like Juan Fasola are programming it.
Fasola, 29, is a graduate student in the lab of USC Viterbi School of Engineering computer science Professor Maja Matarić, an internationally recognized socially assistive robotics expert. Matarić is the Soon-Shiong professor of computer science, neuroscience and pediatrics, founding director of the USC Robotics and Autonomous Systems Center, co-director of the USC Robotics Research Lab, as well as the vice dean for research at the USC Viterbi School of Engineering.
Fasola recently won the CoTeSys Cognitive Robotics Best Paper Award at the IEEE/RSJ International Conference on Intelligent Robots and Systems. The paper, entitled “Using Semantic Fields to Model Dynamic Spatial Relations in a Robot Architecture for Natural Language Instruction of Service Robots,” outlines Fasola’s biology-inspired framework for interpreting commands that a robot can then execute.
The simulation interface Fasola used to test his framework.
Among the most notable aspects of his paper is the way his framework allows for constraints, such as “go to the kitchen but walk along the wall.”
Bandit, a socially-assistive robot
To successfully fulfill such a constraint, the robot needs to know precisely what “along” means.
“I had to look at different literature in linguistics and cognitive psychology to get at the root of what ‘along’ refers to,” said Fasola. For his program, he specified that “along” means proximity and a path that travels parallel to the reference object. This translates to numerical calculations the robot conducts in order to achieve the task of going to the kitchen while satisfying the user’s along-the-wall constraint.
"The key to the framework," said Fasola, "is that the meaning of spatial prepositions can be represented within the robot by computational models called spatial semantic fields, which can then be used by the robot to follow user commands appropriately."
“The robot can combine the execution of the path of getting to the goal but also with the constraint, which is how you execute the task. The robot can combine them very easily using my framework. That was one of the really cool things about the paper and probably the most novel.”
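One way to picture that combination, under assumed names and an assumed combination rule (the product of a goal-proximity score and the mean constraint-field value along the path), is to score candidate paths by both how well they reach the goal and how well every step satisfies the constraint:

```python
import math

def goal_score(end, goal, sigma=1.0):
    """Gaussian score for how close a path's endpoint is to the goal."""
    d = math.dist(end, goal)
    return math.exp(-d ** 2 / (2 * sigma ** 2))

def path_score(waypoints, goal, constraint_field):
    """Combine goal attainment with a per-step constraint field (product rule)."""
    mean_constraint = sum(constraint_field(p) for p in waypoints) / len(waypoints)
    return goal_score(waypoints[-1], goal) * mean_constraint

# Hypothetical constraint field peaking at a wall lying on the line y = 0.
near_wall = lambda p: math.exp(-p[1] ** 2 / 0.5)

direct  = [(0, 3), (2, 2), (4, 1), (5, 0)]          # cuts across the room
hugging = [(0, 0.2), (2, 0.2), (4, 0.2), (5, 0)]    # stays along the wall
# Both reach the goal (5, 0), but the wall-hugging path scores higher.
```

Because the constraint enters as a separate multiplicative field, swapping "along the wall" for a different preposition only changes `constraint_field`, not the goal-seeking logic, which is the kind of modular combination the quote describes.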
The paper contains results from simulation tests, but Fasola has since started running experiments with robots like the PR2 and Bandit. In the 3-D simulation, the program would plan and execute paths, but with full knowledge of the placement of obstacles in the environment. In the lab setting, a robot sometimes comes across an unforeseen obstacle that isn’t on its internal map of the space. In the following video, a misplaced trashcan prompts the PR2 robot to perform obstacle-clearance maneuvers to check whether it could fit through the hallway. To check for clearance, the PR2 spins in place twice. “He did a few little twirls and then left,” Fasola said, chuckling.
Fasola used Bandit in the past for his work in socially assistive robots for the elderly, and he plans to use Bandit again to test this framework with the same target users. He plans to run a pilot study in spring 2014 at a senior living facility. “Hopefully they can give it natural language speech commands and the robot will understand it.”