MIT researchers have shown off an impressive gestural control system that lets you manipulate on-screen images using hand gestures; you can find more info over here.
The iPhone’s familiar touch screen uses capacitive sensing, in which the proximity of a finger alters the electric field between sensors embedded in the screen. A competing approach, which uses embedded optical sensors to track the movement of the user’s fingers, is only now coming to market. But researchers at MIT’s Media Lab have already figured out how to use such sensors to turn displays into giant lensless cameras. On Dec. 19 at Siggraph Asia — a recent spinoff of Siggraph, the premier graphics research conference — the MIT team is presenting the first application of its work: a display that lets users manipulate on-screen images using hand gestures.