'Minority Report' interface shown at CES

10.01.2010

A prototype system is being shown behind closed doors to reporters and industry partners at CES this week. The technology sounds futuristic, but in fact variations on it have been in the works for years, and are also being developed by competitors including Canesta of Sunnyvale, California, Optrima of Belgium, PMDTechnologies of Germany, and Mesa Imaging of Switzerland.

Canesta said in October that it had secured new funding, from companies including laptop giant Quanta Computer, to further develop its own 3D sensor technology. Last year Canesta demonstrated a prototype gesture-controlled device, and it has worked with Honda in the past on vehicle safety systems.

Most companies in the market are using a "time of flight" technology, which works by emitting an infrared pulse from a camera above the screen and measuring the time it takes to bounce back from objects in the room. This allows the systems to calculate the distance of each surface and create a virtual 3D model of the room. Any changes, like hand movements, are then translated onto the screen.
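The arithmetic behind this is simple: light travels at a known speed, so half the round-trip time of the pulse gives the distance to whatever it bounced off. The sketch below (illustrative only, not any vendor's actual code; the grid of timings is hypothetical) shows how per-pixel round-trip times become a depth map.

```python
# A minimal sketch of the time-of-flight idea: each pixel records how long
# an infrared pulse takes to return, and half the round trip multiplied by
# the speed of light gives that pixel's depth.

import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_from_round_trip(round_trip_seconds: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip times into a depth map in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Hypothetical 3x3 grid of measured round-trip times: a hand about 2 m away
# returns the pulse in roughly 13.3 nanoseconds; the 3 m background in 20 ns.
times = np.array([
    [20.0, 20.0, 13.3],
    [20.0, 13.3, 13.3],
    [20.0, 20.0, 20.0],
]) * 1e-9  # nanoseconds -> seconds

depths = depth_from_round_trip(times)
print(np.round(depths, 2))  # ~3 m background, ~2 m where the hand is

# Comparing successive depth maps reveals changes such as hand movements,
# which the system can then translate into on-screen actions.
```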

PrimeSense uses a variation of this. Instead of calculating the time it takes light to bounce off objects, it encodes patterns in the light and builds a 3D image by examining the distortion those patterns undergo when they hit objects in the room, Berenson said.
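In broad terms this is a structured-light approach: a known pattern is projected, the camera measures how far each pattern feature shifts compared with a reference image, and that shift is triangulated into depth much as a stereo camera pair would do. The sketch below is a simplified illustration of that principle, not PrimeSense's implementation; the focal length and projector-camera baseline are assumed values.

```python
# A minimal structured-light sketch: depth is recovered from the pixel shift
# (disparity) of a projected pattern feature, using the same triangulation
# relation as stereo vision: depth = focal_length * baseline / disparity.

def depth_from_disparity(disparity_px: float,
                         focal_length_px: float = 580.0,    # assumed optics
                         baseline_m: float = 0.075) -> float:  # assumed projector-camera gap
    """Depth in meters for one pattern feature, given its pixel shift."""
    if disparity_px <= 0:
        raise ValueError("feature not matched against the reference pattern")
    return focal_length_px * baseline_m / disparity_px

# Features that shift less are farther away (hypothetical disparities).
for shift in (40.0, 20.0, 10.0):
    print(f"shift {shift:>4.1f} px -> depth {depth_from_disparity(shift):.2f} m")
# shift 40 px -> ~1.09 m, 20 px -> ~2.18 m, 10 px -> ~4.35 m
```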

He claimed this system is faster and more accurate than time-of-flight systems, and can operate in near darkness. The technology can map out objects up to roughly 18 feet (about six meters) away, though six to seven feet is best for applications where the user is standing up, and 10 to 12 feet is the "sweet spot" for using hand gestures from the couch, he said.