Researchers at Fujitsu Laboratories have created a next-generation user interface that can detect a user's finger as well as the object the finger is touching. The system then turns real-world objects into an interactive, touchscreen-like surface.

We showcased a demo last year on Seishin Blog, and now the news has gone mainstream.

As quoted from the video:

"We think paper and many other objects could be manipulated by touching them, as with a touchscreen. This system doesn't use any special hardware; it consists of a simple device, like an ordinary webcam, and a commercial projector. Its capabilities are achieved through image processing technology."

Importing documents has never been easier: you simply select the parts of the document you want to capture using the touchscreen interface.

Basically, the touchscreen interface measures the shape of any real-world object and then adjusts the coordinate systems of the projector, the camera, and the real world to match. On top of that, even objects with curved surfaces can be calibrated just by touching them.
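Fujitsu hasn't published the details of its calibration, but for a flat object the usual way to align camera and projector coordinates is a planar homography estimated from a few known correspondences. The sketch below is a minimal, hypothetical illustration of that idea in plain NumPy: the corner coordinates and the 800x600 projector resolution are made-up example values, not figures from Fujitsu.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src points to dst points
    (direct linear transform; needs at least 4 point correspondences)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A, found via SVD.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def map_point(H, pt):
    """Apply homography H to a 2D point (homogeneous normalization)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical calibration: four corners of a sheet of paper as seen by
# the camera, paired with the projector pixels that should land on them.
camera_corners = [(102, 87), (538, 95), (545, 410), (98, 402)]
projector_corners = [(0, 0), (800, 0), (800, 600), (0, 600)]

H = estimate_homography(camera_corners, projector_corners)

# A fingertip detected at a camera pixel now maps to a projector pixel;
# the first camera corner maps back to approximately (0, 0).
print(map_point(H, (102, 87)))
```

A curved surface would need a denser model than a single homography (e.g. a per-patch or mesh-based warp), which is presumably why the system lets you calibrate curved objects by touching them.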

For the interface to register touches accurately, the system must also measure the fingertip's height accurately: if the fingertip detection is off by even a single pixel, the estimated height changes by 1 cm. Precise fingertip-detection technology is therefore essential.
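To make that sensitivity concrete, here is a minimal sketch assuming the 1 cm-per-pixel figure quoted above. The function, variable names, and row coordinates are all hypothetical illustrations, not part of Fujitsu's system.

```python
# Assumed from the article: a one-pixel detection error shifts the
# estimated fingertip height by 1 cm.
CM_PER_PIXEL = 1.0

def fingertip_height_cm(fingertip_row, surface_row):
    """Estimate fingertip height above the surface from image rows.
    Image rows grow downward, so a raised fingertip has a smaller row."""
    return (surface_row - fingertip_row) * CM_PER_PIXEL

# A fingertip detected 3 rows above the surface reads as 3 cm high;
# a one-pixel detection error turns that into 4 cm.
true_height = fingertip_height_cm(197, 200)   # 3.0 cm
noisy_height = fingertip_height_cm(196, 200)  # 4.0 cm
print(true_height, noisy_height)
```

The point is that pixel-level noise translates directly into centimeter-level height error, which is why the system needs sub-pixel-accurate fingertip detection to tell a touch apart from a hover.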

A few screenshots below:
Watch the video here:

I can't wait to read a book and extract the best lines of my novels directly onto my laptop. Nice work, Fujitsu Labs!