Eye-tracking the next big thing for computer games?
Posted on Saturday, May 10 2008 @ 17:32 CEST by Thomas De Maesschalck

New Scientist reports that scientists are working on an eye-tracking interface for computer games:

Users typically guide a cursor with their eyes, staring at objects for a time to emulate a mouse click. But that is too laborious to let users match the speed and accuracy demanded by real-time 3D games, says the project's lead researcher, Stephen Vickers of De Montfort University in Leicester, UK.
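The dwell-click approach itself is straightforward to implement: the software watches the stream of gaze coordinates and fires a click once the gaze has lingered inside a small region for long enough. Here is a minimal sketch of that idea; the radius and dwell threshold are illustrative assumptions, not values from the COGAIN software:

```python
import math
import time

DWELL_RADIUS_PX = 40   # assumed tolerance for natural gaze jitter
DWELL_TIME_S = 1.0     # assumed time the gaze must linger to "click"

class DwellClicker:
    """Emulates a mouse click when the gaze lingers inside a small region."""

    def __init__(self, radius=DWELL_RADIUS_PX, dwell=DWELL_TIME_S):
        self.radius = radius
        self.dwell = dwell
        self.anchor = None   # (x, y) where the current dwell started
        self.start = None    # timestamp when the current dwell started

    def update(self, x, y, now=None):
        """Feed one gaze sample; returns the (x, y) of a click, or None."""
        now = time.monotonic() if now is None else now
        if self.anchor is None or math.dist(self.anchor, (x, y)) > self.radius:
            # Gaze moved away: restart the dwell timer at the new point.
            self.anchor, self.start = (x, y), now
            return None
        if now - self.start >= self.dwell:
            click_at = self.anchor
            self.anchor = self.start = None   # reset so we don't re-click
            return click_at
        return None
```

Fed with a few dozen gaze samples per second, something like this works well for menu-style interfaces, but spending a second of dwell time on every single click is exactly the bottleneck Vickers describes for fast-moving 3D games.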
His team is developing the software as part of the EU-funded project Communication by Gaze Interaction (COGAIN).
"Even though a user in, say, Second Life might look as if they are able-bodied, if they can't operate and communicate as fast as everyone else, they could be perceived as having a disability," he told New Scientist, adding that there is a privacy issue for players who may prefer not to reveal their disability in the virtual world.
In virtual worlds, gamers need to perform a whole suite of commands, including moving their character or avatar, altering their viewpoint on the scene, manipulating objects, and communicating with other players.
Eye-gaze systems shine infrared light from LEDs at the bottom of a computer monitor onto the user's eyes and track the eye movements with stereo infrared cameras. This setup can calculate where on the screen the user is looking with an accuracy of about 5 mm.
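The "where on the screen" step typically comes down to a calibration: the user looks at a handful of known targets, and the software fits a mapping from the measured eye features (for example, pupil-to-corneal-glint vectors) to screen coordinates. A least-squares fit of a low-order polynomial is a common choice; the sketch below assumes that kind of setup rather than the specific tracker used in the project:

```python
import numpy as np

def fit_gaze_mapping(eye_xy, screen_xy):
    """Fit a 2nd-order polynomial mapping from eye features to screen coords.

    eye_xy:    (N, 2) eye measurements (e.g. pupil-glint vectors), N >= 6
    screen_xy: (N, 2) known on-screen calibration targets
    Returns a function mapping a new (x, y) eye sample to a screen point.
    """
    x, y = eye_xy[:, 0], eye_xy[:, 1]
    # Design matrix: 1, x, y, xy, x^2, y^2 -- a common low-order gaze model.
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)

    def gaze_to_screen(ex, ey):
        row = np.array([1.0, ex, ey, ex * ey, ex**2, ey**2])
        return row @ coeffs   # -> (screen_x, screen_y)

    return gaze_to_screen
```

On a typical 96 dpi monitor, the quoted 5 mm accuracy works out to roughly 19 pixels, which is one reason a dwell region like the one sketched earlier needs to be a few tens of pixels wide.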
Vickers' software retains the traditional point-and-click interface but adds extra functions to speed up certain commands.
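One common way to build such extra functions (an illustrative assumption here, not a description of Vickers' actual implementation) is to let the same dwell mechanism drive different input modes, so a single gaze action can mean "click this", "walk here" or "look there" depending on which mode is active. The avatar object and its click/walk_towards/look_at methods below are stand-ins for whatever the game or virtual world actually exposes:

```python
from enum import Enum

class Mode(Enum):
    CLICK = "click"   # dwell emulates a mouse click
    WALK = "walk"     # dwell steers the avatar toward the gaze point
    LOOK = "look"     # dwell pans the camera toward the gaze point

class GazeCommandRouter:
    """Routes dwell events to different avatar commands based on the mode."""

    def __init__(self, avatar):
        self.avatar = avatar
        self.mode = Mode.CLICK

    def set_mode(self, mode):
        # In a real interface this could be triggered by a glance at a
        # screen-edge hotspot, so the hands never need to touch a keyboard.
        self.mode = mode

    def on_dwell(self, x, y):
        if self.mode is Mode.CLICK:
            self.avatar.click(x, y)
        elif self.mode is Mode.WALK:
            self.avatar.walk_towards(x, y)
        elif self.mode is Mode.LOOK:
            self.avatar.look_at(x, y)
```

Separating the modes means a single dwell can stand in for the whole suite of commands listed above without the user having to aim at tiny on-screen buttons for each one.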