June 06, 2006 —
Immersidata pioneer: Cyrus Shahabi
USC engineers are perfecting a game user-testing tool that captures and analyzes play experience to automatically detect weaknesses and flaws, and it may soon even gauge players' emotional involvement.
User testing is a critical element of any new game; entire books have been written about it. But it remains a highly subjective and largely unstructured exercise. "Traditionally," says Tim Marsh, a post-doctoral researcher at the USC Viterbi School of Engineering's Integrated Media Systems Center, "game companies hire teenagers, and turn them loose trying to find flaws and gaps in the game, which they report either verbally or in writing, along with their impressions."
This is neither systematic nor scientific, says Marsh, who is co-author of a conference presentation entitled "Continuous and Unobtrusive Capture of User-Player Behavior and Experience to Assess and Inform Game Design and Development," to be given at the Fun 'n Games 2006 Conference in England on June 26, 2006.
Marsh's method analyzes "immersidata." USC Viterbi School computer scientist Cyrus Shahabi, one of the researchers on the project, coined the term several years ago to refer to the machine-readable record of commands sent to the computer by keyboards, joysticks and other controls, collected in parallel with a videotape recording of the player during the game session.
What's wrong with this picture? A user is unsure what to do next — and immersidata analysis software can allow game developers to see what happened.
An IMSC-developed tool called "ISIS" (Immersidata AnalySIS) can identify data of interest and index events within the videotape. For the game development application, ISIS can return indexed examples of six different kinds of occurrences, or "points," in the immersidata/video record:
- Activity completion points, when the player has finished a final task associated with a mission.
- Task completion points, a subset of activity completions, allowing a researcher to go back over the performance of a single task.
- Break points, times when nothing seems to be happening: the player isn't moving and no events occur. This can indicate distraction or a deliberate pause, but "break is a very important concept … because it provides clues to what interrupts players."
- Wandering points, similar periods when the user-player is moving but doesn't select any objects.
- Critical events. Some elements of the game are the most difficult, and these can be pre-selected so that the action leading up to accomplishment or non-accomplishment can be studied.
- Navigation errors. Collisions with a wall or object potentially point to inadequate design causing user disorientation.
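The paper does not publish ISIS's implementation, but two of these detectors are easy to picture. The following is a minimal sketch, assuming a hypothetical event log of timestamped controller commands (the `Event` structure, event kind names, and the five-second gap threshold are all illustrative assumptions, not ISIS's actual format):

```python
from dataclasses import dataclass

@dataclass
class Event:
    time: float  # seconds into the play session
    kind: str    # assumed kinds: "move", "select", "collision", ...

def find_break_points(events, min_gap=5.0):
    """Flag spans where no events occur for at least `min_gap` seconds,
    a rough stand-in for ISIS's 'break points'."""
    return [(prev.time, curr.time)
            for prev, curr in zip(events, events[1:])
            if curr.time - prev.time >= min_gap]

def find_navigation_errors(events):
    """Collisions with walls or objects may signal disorienting design."""
    return [e.time for e in events if e.kind == "collision"]

# A toy session log.
log = [Event(0.0, "move"), Event(1.2, "collision"),
       Event(1.5, "move"), Event(9.0, "select")]
print(find_break_points(log))       # -> [(1.5, 9.0)]
print(find_navigation_errors(log))  # -> [1.2]
```

Each detector scans the same stream once; indexing the returned timestamps against the video is what lets a researcher jump straight to the corresponding footage.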
Zeroing in: the red dots (see enlarged version) point to identified and specifically characterized "points" in game play.
By backtracking from these points, investigators can see how each point developed. Similar patterns preceding parallel points can be clear indications of a problem in the game.
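The backtracking step amounts to slicing out the events that precede a detected point, so sessions can be compared for recurring pre-point patterns. A minimal sketch, again assuming a hypothetical `(seconds, kind)` log format of my own invention:

```python
# Hypothetical immersidata log: (seconds, event kind) pairs.
log = [(0.0, "move"), (3.0, "collision"), (4.0, "move"),
       (6.0, "move"), (9.0, "break_point")]

def backtrack(events, point_time, window=10.0):
    """Collect the event kinds in the `window` seconds leading up
    to a detected point, for cross-session comparison."""
    return [kind for t, kind in events
            if point_time - window <= t < point_time]

print(backtrack(log, 9.0, window=6.0))  # -> ['collision', 'move', 'move']
```

If several players show the same pre-point sequence (say, a collision followed by aimless movement before every break point), that shared pattern is the design clue the paragraph above describes.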
For their tests, Marsh and Shahabi used a "serious" (i.e., teaching) game designed to instruct students in human anatomy and physiology. The study analyzed sessions by 16 undergraduate students.
Though Marsh and the group tested the technique on a serious game, “the techniques are for use testing all game genres, entertainment and non-entertainment,” Marsh said.
The system already works effectively at finding problems in the areas it is set to look for, Marsh reports. Improvements in the works will add functionality to find and identify other potential problem areas: recognizing repetition patterns by players, for example, and replacing or supplementing the video capture with a replay of the game from the player's point of view.
Marsh is also working on ways to use immersidata to capture more aspects of the game experience, particularly its emotional and empathetic elements. Marsh recently wrote a chapter on this topic, entitled "Vicarious Experience: Staying There Connected With and Through Our Own and Other Characters," in the new book Gaming as Culture (McFarland Press, 2006).
In addition to Marsh and Shahabi, USC computer science doctoral candidate Kiyoung Yang played a key role on the project, Marsh said. Shamus Smith of the University of Durham also participated and will present the paper at the conference.
The research was funded by the National Science Foundation and by Shahabi's Presidential Early Career Award for Scientists and Engineers (PECASE) grant.