Fujitsu Laboratories has developed a next-generation user interface that can accurately detect the user's finger and what it is touching, creating an interactive, touchscreen-like system out of objects in the real world. In other words, toss out that ancient mouse and clunky iPad and grab one of these! (Just kidding, these aren't on store shelves yet… but these are!)

“We think paper and many other objects could be manipulated by touching them, as with a touchscreen. This system doesn’t use any special hardware; it consists of just a device like an ordinary webcam, plus a commercial projector. Its capabilities are achieved by image processing technology.”
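Fujitsu hasn't published the details of its image processing, but the basic idea of finding a fingertip in a webcam feed can be sketched with generic computer-vision techniques. Here's a minimal, hypothetical example in Python with OpenCV, assuming simple skin-color segmentation and contour analysis (real systems would calibrate for lighting, use the projector for depth cues, and be far more robust):

```python
# Hypothetical fingertip-tracking sketch, NOT Fujitsu's actual algorithm.
# Assumes OpenCV and a standard webcam; skin-color thresholds are rough guesses.
import cv2
import numpy as np

def find_fingertip(frame_bgr):
    """Return (x, y) of the topmost point of the largest skin-colored blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough skin-tone range in HSV; a real system calibrates per user/lighting.
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 255, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 2000:  # ignore small noise blobs
        return None
    # Approximate the fingertip as the contour point closest to the top of the frame.
    x, y = min(hand.reshape(-1, 2), key=lambda p: p[1])
    return int(x), int(y)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    tip = find_fingertip(frame)
    if tip:
        cv2.circle(frame, tip, 8, (0, 255, 0), 2)  # mark the detected fingertip
    cv2.imshow("fingertip", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

Even this toy version hints at why no special hardware is needed: everything happens in software on ordinary camera frames.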


Using this technology, information can be imported from a document as data by selecting the relevant parts with your finger. As you can see, the device at this point isn't the smallest piece of tech in your arsenal, but it sure packs a wallop! In the video, found here, you can see that it accepts not only touch but also mid-air gestures.
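That "select with your finger, import as data" step could plausibly work by cropping the tapped region from the camera frame and running OCR on it. Here's a hedged sketch of that idea; the corner taps would come from a fingertip tracker like the one above, and pytesseract is just an assumed OCR backend, not anything Fujitsu has confirmed using:

```python
# Hypothetical document-capture sketch: crop a finger-selected region and OCR it.
# pytesseract is an assumption for illustration, not Fujitsu's stated stack.
import cv2
import pytesseract

def extract_region(frame_bgr, corner_a, corner_b):
    """Crop the rectangle spanned by two tapped corners and return its text."""
    (x1, y1), (x2, y2) = corner_a, corner_b
    x_lo, x_hi = sorted((x1, x2))
    y_lo, y_hi = sorted((y1, y2))
    crop = frame_bgr[y_lo:y_hi, x_lo:x_hi]
    gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)
    # Binarize to help OCR cope with unevenly lit paper on a desk.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    return pytesseract.image_to_string(binary)

# Example usage, with made-up tap coordinates from the fingertip tracker:
# frame = cv2.imread("desk_snapshot.png")
# print(extract_region(frame, (120, 80), (520, 310)))
```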


With an interface that looks like it's straight out of Iron Man or Minority Report, I could really get behind this. Imagine arriving at the airport and viewing destinations and buying tickets by swiping through maps and zooming into cityscapes, or doctors reviewing your medical record and surgery details on a polished version of this prototype.

[Clip: how the computer reads your hand to make decisions.]

It's the future, folks! Let's hope the nice geeks in the basements over at Apple and Google are working on something like this. Just shrink it down, make it a couple hundred bucks, and you'll have me reaching for my wallet.

Via prosthetic knowledge