Microsoft's research arm, Microsoft Research, frequently makes the news with projects that may or may not become retail products. One of its newer projects is the Actuated 3D Display with Haptic Feedback: in essence, a 3D touchscreen with force feedback, allowing the user to feel the different weights and textures of objects on the screen. The project is the creation of Mike Sinclair, Michel Pahud and Hrvoje Benko of Microsoft's Natural Interaction Research Group, and was presented at Microsoft's TechFest 2013 event in Redmond.
The system works by mounting the display - a commercial, off-the-shelf 3D monitor with multi-touch capabilities - onto a robot arm connected to the same computer. As the user pushes their finger against the screen to interact with the display, the robot arm pushes back - and can alter the strength of that feedback as required. One example used during the demonstration was an array of virtual blocks constructed from different materials: stone, wood and sponge. Each was designed to behave as realistically as possible in terms of weight and friction, with the robot arm making the user work harder to push the stone block backwards than the wood or sponge blocks.
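The behaviour described above can be approximated with a very simple control model: the arm's resisting force grows with how far the finger pushes into a virtual object, scaled by that material's properties. The sketch below is purely illustrative; the material names come from the demonstration, but the spring-plus-friction model and all numeric values are assumptions, not Microsoft's actual control law.

```python
# Illustrative sketch of a one-dimensional force-feedback rule: the arm's
# resistance depends on the virtual material under the finger. All parameter
# values are made up for the example.

from dataclasses import dataclass

@dataclass
class Material:
    name: str
    stiffness: float   # newtons per metre of finger displacement
    friction: float    # constant resistance while moving, in newtons

# The three block materials from the demo, with hypothetical parameters.
MATERIALS = {
    "stone":  Material("stone",  800.0, 6.0),
    "wood":   Material("wood",   400.0, 3.0),
    "sponge": Material("sponge",  50.0, 0.5),
}

def feedback_force(material: Material, displacement_m: float) -> float:
    """Force (N) the robot arm pushes back with when the finger has
    pressed displacement_m metres into the virtual object (a simple
    spring-plus-friction model)."""
    if displacement_m <= 0:
        return 0.0
    return material.stiffness * displacement_m + material.friction

# Pushing 1 cm into each block: stone resists far more than sponge,
# so the user has to work harder to move it.
for m in MATERIALS.values():
    print(f"{m.name}: {feedback_force(m, 0.01):.1f} N")
```

In the real system the arm would run a loop like this at a high update rate, reading the finger's position from the touchscreen and commanding the arm's motor accordingly.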
'I had been interested in the notion of putting a robot behind something you could touch,' says Sinclair in a blog post on the project. 'Originally, I had wanted a robot arm with many degrees of freedom but complexity, costs, and safety issues narrowed down the options to one dimension of movement. At that point, I was sure that others must have already looked into this scenario, but after looking at the literature, it turned out no one had done this.'