
Rehab Haptics


May 24, 2005 —
A 50-year-old female stroke survivor, wearing stereoscopic goggles, grips a stylus like a pen and twists her wrist to move a small ball through a maze-like tube on her computer screen. She feels resistance in her hand and sees the ball change color when she bumps against the side of the tube.

A researcher holds a robotic force-feedback device, called a PHANToM, to move balls around a virtual environment without bumping into booby traps on the screen.
To avoid another collision, she tightens her grasp on the force-feedback stylus, carefully rotates her right wrist, and pushes the ball through the tube. A menu bar at the top of the screen displays her scores, revealing how much time elapsed and how much force she exerted to complete the task. 
 
Another stroke survivor, an 80-year-old man, holds a ball the size of an apricot in his impaired left hand, which he rotates in this virtual environment in order to place one set of blocks on top of another set.  He doesn’t “feel” the blocks touch each other, but his wrist movements are being tracked and precisely measured. The measurements will help his physical therapist gauge the difficulty level of his next therapy session.
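The kind of wrist-movement measurement described here can be sketched with a few lines of code. This is a minimal illustration, not the USC system's actual data pipeline: it assumes the tracker reports wrist rotation as a series of angle samples in degrees, and it derives two summary measures a therapist might use when setting the next session's difficulty.

```python
def wrist_rotation_metrics(angles_deg):
    """Summarize a session of tracked wrist-rotation samples (degrees).

    Returns the active range of motion and the peak change between
    consecutive samples (a rough rotation-speed measure). Field names
    and units are illustrative assumptions, not the lab's real format.
    """
    rom = max(angles_deg) - min(angles_deg)  # active range of motion
    peak_step = max(abs(b - a) for a, b in zip(angles_deg, angles_deg[1:]))
    return {"range_of_motion_deg": rom, "peak_step_deg": peak_step}

# Example: one sweep of wrist rotation, out and back
session = [0.0, 5.0, 12.0, 20.0, 26.0, 30.0, 27.0, 18.0, 8.0, 1.0]
print(wrist_rotation_metrics(session))
# -> {'range_of_motion_deg': 30.0, 'peak_step_deg': 10.0}
```

A real system would sample full 3D orientation at a fixed rate, but even this toy summary shows how raw tracking data becomes a number a therapist can compare across sessions.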
 

Stroke patients like these, who face months of tedious rehabilitation to recover function in affected limbs, are benefiting from the rise of haptics technologies – interfaces that add the sense of touch to virtual computer environments. Like computer games, virtual therapeutic environments can be vastly more entertaining than traditional rehabilitation and custom-designed to target specific motor skills in each patient, like grasping, squeezing, pushing and rotating the wrist.


Younbo Jung, left, reaches out to touch objects in a virtual environment he sees in his goggles, while fellow researcher Shih-Ching Yeh, right, steadies the cable connection. 


Pinch, Grasp, Stroke and Squeeze   

You’ll find most of the fun stuff going on in a high-tech laboratory inside the USC Viterbi School of Engineering’s Integrated Media Systems Center (IMSC).  There, an interdisciplinary team of researchers from engineering and the USC Annenberg School for Communication is collaborating with researchers at the Keck School of Medicine of USC to develop a variety of new haptics devices to let stroke patients get downright pushy with their rehab.  
 
In fact, the patients, who are all in neurorehabilitation at USC University Hospital, will soon be able to push, grasp, pinch, squeeze and throw objects in some very novel virtual reality tasks. 
 
“Haptics, which adds the sense of touch to 3D computing, lets stroke patients interact with virtual worlds by feel,” says Margaret McLaughlin, professor of communication at the USC Annenberg School for Communication and a co-editor of Touch in Virtual Environments.
 
Commercial gaming was one of the first industries to debut inexpensive, non-immersive versions of the technology, using force-feedback joysticks and steering wheels that vibrated as the driver sped along a video racetrack.  But in university laboratories, the availability of more sensitive, high-end devices that could render touch sensations in three dimensions quickly led to applications in more serious pursuits.
 
“The applications for the blind and visually impaired were readily apparent, and soon we saw haptics technology in medical and surgical training programs, flight school, teleoperations and scientific visualization,” McLaughlin says.
 
McLaughlin’s colleague, Albert “Skip” Rizzo, a research scientist at USC’s Institute for Creative Technologies, equates the touchy-feely virtual environments that are possible now to sophisticated aircraft simulators. 
 

“It’s very much like creating an aircraft simulator to test and train pilots,” he says.  “But now we’ve created simulations that can assess and rehabilitate a stroke patient under a range of stimulus conditions.  These are conditions that aren’t easily deliverable or controllable in the real world.”

NIH Grant

In 2004, IMSC researchers and their counterparts in the psychology department at the University of Texas, Austin, were awarded a $1.8 million grant from the National Institutes of Health to begin collaborative work on a variety of new haptics interfaces with researchers at the Keck School of Medicine of USC. 

This therapeutic environment utilizes a “cyber grasp” exoskeleton, which fits over an instrumented data glove that measures the position and orientation of the hand in a 3-D space.
 
NIH awarded the grant “to explore new directions in neurorehabilitation,” according to principal investigator Thomas McNeill, professor of cell and neurobiology, neurology and neurogerontology at the Keck School.
 
“The need was there,” he said.  “More than 700,000 people suffer a stroke each year and nearly 450,000 survive with some form of neurologic impairment or disability.” 
 
Those numbers will grow, as the population ages and obesity and heart disease increase, making innovative rehabilitation programs “a national priority” in the next 50 years, McNeill predicts.  
 
So a group of talented faculty and Ph.D. students working in IMSC’s Haptics and Virtual Environments Lab — including McLaughlin, Rizzo, Younbo Jung, Wei Peng, Shih-Ching Yeh and Weirong Zhu — dove into new applications.

They came up with an interesting assortment, including the “pincher.”  This device is designed for two-fingertip contact with virtual objects, say Zhu and Yeh, who write computer programs for the systems.
 

The game (or cyber task) works like this:  The user dons a pair of stereoscopic goggles and puts a thimble on the forefinger; the thimble is connected to a robotic force-feedback device, called a PHANToM.  The stylus of a second PHANToM is affixed to the thumb. The two PHANToMs provide the sensation of force to the user’s fingertips as he or she tries to pick up a 3D cube and squeeze it small enough to fit through a narrow hole on the computer screen.
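The resistance the user feels in a task like this is commonly rendered with a spring, or “penalty,” model: when the tracked fingertip penetrates the virtual cube’s surface, the device pushes back with a force proportional to the penetration depth. The one-axis sketch below illustrates the general technique only; the stiffness value and geometry are made-up numbers, not parameters from the IMSC system.

```python
def contact_force(finger_pos_m, surface_pos_m, stiffness=500.0):
    """Spring ("penalty") model for haptic contact along one axis.

    Positions are in meters along the squeeze axis. Returns the force
    in newtons the device should apply back toward the surface; zero
    while the fingertip is outside the object. The stiffness constant
    is an illustrative assumption, not a real PHANToM setting.
    """
    penetration = finger_pos_m - surface_pos_m  # > 0 means inside the cube
    if penetration <= 0:
        return 0.0                  # free motion: no resistance
    return stiffness * penetration  # push back proportionally

# As the fingertip squeezes past the cube face, resistance ramps up:
for depth_mm in (0, 1, 2, 4):
    print(depth_mm, "mm ->", contact_force(depth_mm / 1000.0, 0.0), "N")
# 0 mm -> 0.0 N, 1 mm -> 0.5 N, 2 mm -> 1.0 N, 4 mm -> 2.0 N
```

In a real device this calculation runs at around a kilohertz per fingertip so the contact feels solid rather than spongy; the principle, though, is just this spring law.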


A stroke survivor in rehabilitation holds a ball in his impaired hand and rotates his wrist to move objects around on the screen. His wrist movements are tracked and precisely measured.

Mutual Touch
Another interface is the “mutual touch” task for hand-reaching and grasping exercises. This therapeutic environment utilizes a “cyber grasp” exoskeleton, which fits over an instrumented data glove that measures the position and orientation of the hand in a three-dimensional space.

 
“The glove allows patients to feel the sensation of a solid object in their palms,” says Yeh, who concentrates more on the computer graphics.  “Among the tasks they might be able to perform are picking up a glass and inverting it to pour the liquid out or picking up books and stacking them on appropriate shelves.”
 
The interfaces give physical therapists precise control over a stroke patient’s exercise program, which is key to recovery, McLaughlin adds.
 
“We can tailor rehabilitative tasks, like pouring milk out of a glass, to each patient, depending on what level of impairment they have sustained,” she says.  “We also get information on their performance instantly, which helps the therapist to design a rehabilitative program of increasing difficulty.”
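The instant performance feedback McLaughlin describes lends itself to a simple progression rule: advance the patient when a task is completed quickly and cleanly, ease off when it is not. The sketch below is a hypothetical illustration of such a rule; the thresholds and level bounds are invented, not values from the USC pilot tests.

```python
def next_difficulty(level, completion_time_s, collisions,
                    time_goal_s=60.0, max_collisions=3,
                    min_level=1, max_level=10):
    """Step the task difficulty up or down from one session's results.

    A patient who beats the time goal with few collisions advances one
    level; one who misses both targets drops back one; mixed results
    hold the level. All thresholds here are hypothetical.
    """
    if completion_time_s <= time_goal_s and collisions <= max_collisions:
        return min(level + 1, max_level)   # clean run: advance
    if completion_time_s > time_goal_s and collisions > max_collisions:
        return max(level - 1, min_level)   # struggled: ease off
    return level                           # mixed results: hold steady

print(next_difficulty(4, completion_time_s=45.0, collisions=1))  # -> 5
print(next_difficulty(4, completion_time_s=90.0, collisions=6))  # -> 3
```

The point is not the specific numbers but the loop: measured performance flows straight into the next session’s settings, with the therapist supervising rather than hand-timing every trial.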
 

In clinical pilot tests at the USC Keck School — led by Carolee Winstein, professor of biokinesiology and physical therapy and co-principal investigator on the NIH grant, and Ph.D. student Jill Stewart — stroke patients have reported their “overall satisfaction” with the new computer tasks.  In one instance, a volunteer was “extremely enthusiastic” about the space tube task and said she wanted to use the system at home.  

 
That is critical to post-stroke recovery.  “It’s not easy to keep patients motivated and engaged in daily, repetitive exercises,” McLaughlin says, “so if they are enjoying the tasks, they’re likely to do better during rehabilitation.” 
 

The space tube challenges users to move the ball into the entrance of the tube, indicated by the arrow, and push it through without banging into the walls.  

Telerehabilitation
“Telerehabilitation” is another interface currently in development.  This interface will be web-based, allowing both the therapist and the patient to log in to a private site, where goals for the patient’s recovery phase can be set.
 
Patients will be able to select goals based on their individual lifestyles and send them to the therapist for review, the researchers say.  The therapist will be able to discuss those goals with the patient via a peer-to-peer audio conferencing feature embedded in the web page. 
 

“The interface will allow patients to take the initiative in setting their own goals for recovery, and then allow for further discussion and negotiation of those goals later,” McLaughlin says.  “It’s a way of keeping the patient connected to the therapist and of encouraging patients to set their own goals and become more responsible for their own recovery.”

 
--Diane Ainsworth