According to an article in New Scientist, researchers at Carnegie Mellon University (Chris Harrison) and Microsoft Research (Dan Morris and Desney Tan) claim to be able to turn your skin into a touch-screen. Aptly named 'Skinput' (a blend of 'skin' and 'input'), it uses a bio-acoustic sensing array in combination with a wrist-mounted pico-projector to turn your arm into a display and input device, without any implantation.
How does it work? The pico-projector (which can be mounted in places other than your wrist) projects images onto the skin of your arm (or onto any part of your body in the projector's line of sight). When you press these image buttons, distinctive vibrations ripple through your skin, muscles and bones; these are picked up by the bio-acoustic sensing array and interpreted as signals by special software. Specific locations, images or buttons can then be mapped to specific functions.
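The last step described above, mapping a classified tap location to a function, can be sketched as a simple lookup. This is an illustrative sketch only; the actual Skinput software is not public, and all location and action names below are hypothetical.

```python
# Hypothetical mapping step for a Skinput-style system: after the
# bio-acoustic classifier decides which skin location was tapped,
# that location is looked up in a user-configured table of actions.

# Map of classified tap locations (e.g. projected "buttons" on the
# forearm) to the functions they should trigger.
ACTION_MAP = {
    "forearm_button_1": "play_music",
    "forearm_button_2": "pause_music",
    "palm_center": "answer_call",
}

def handle_tap(classified_location: str) -> str:
    """Return the action mapped to a classified tap location."""
    return ACTION_MAP.get(classified_location, "no_action")

print(handle_tap("forearm_button_1"))  # play_music
print(handle_tap("elbow"))             # no_action
```

Because only the classifier's output matters at this stage, the table works identically whether or not the projector is actually drawing the buttons, which is the point made below.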
What is interesting is that the projector itself is not critical to the process; only the sensing array and software are. This lets you 'tap' the chosen area without any image projected on it and still trigger an effect. From an observer's perspective, though, this might look quite ridiculous, with a person touching himself for no apparent reason. It also gives new meaning to the term logging off, when a touch could trigger anything from a song to a search.
This application joins the ranks of real-world digital interfaces, along with Microsoft Surface, Project Natal, and Pranav Mistry's SixthSense. Recent work on Microsoft's Surface will also soon make the device a portable one, similar to SixthSense. Pranav Mistry, whose device certainly seems to have the most potential applications as well as portability, warns that the Skinput device will have to be positioned very precisely each time, so that the images and sounds stay nearly identical, which limits the device's functionality.
You can just check out the video for more details.
Another news article regarding this:
*Sixth Sense technology is coming, Microsoft demos Surface-based NUI
Aiming to bring the world to a level of ubiquitous, integrated mobile computing, the Microsoft Research team has demonstrated a shrunken Mobile Surface application in which a motion-touch interface can be projected onto almost any flat object, turning it into a responsive, active display screen.
The system uses a small webcam and a digital projector to project a seamlessly integrated interface that the user manipulates with gestures. Similar to the concept popularized by the film Minority Report, and to the system shown by PrimeSense at CES this past January, this technology could be the next wave in computer interaction.
Gesture-based interfaces have been imagined by many great minds, as in the 'SixthSense' technology shown by Pranav Mistry. His video demonstration at TED India shows the technology creating a mobile interface that integrates into many parts of your life, giving you access to information for making optimal decisions throughout your day.
Microsoft's demo uses similar technologies, if less exciting in its presentation. A video by TechFlash shows researchers playing a simulated drum set projected onto a table, with the user tapping fingers or drumsticks to trigger the drum sounds.
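The core of a projected drum set like the one in the demo is a hit test: the camera reports where a fingertip or drumstick lands in the projection plane, and each drum pad is a region that fires a sound when struck inside it. The sketch below is purely illustrative (not Microsoft's code); the pad names and coordinates are made up.

```python
from typing import Optional

# Hypothetical drum pads as axis-aligned rectangles in the projection
# plane: (x_min, y_min, x_max, y_max), in camera pixel coordinates.
PADS = {
    "snare": (0, 0, 100, 100),
    "hi_hat": (120, 0, 220, 100),
}

def pad_hit(x: float, y: float) -> Optional[str]:
    """Return the name of the pad containing (x, y), or None if the
    tap landed outside every pad."""
    for name, (x0, y0, x1, y1) in PADS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name  # this tap would trigger the pad's sound
    return None

print(pad_hit(50, 50))    # snare
print(pad_hit(300, 300))  # None
```

A real system would also need to calibrate camera coordinates against the projected image and debounce rapid successive taps, but the lookup itself stays this simple.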
Plans to integrate this technology into future games for the Project Natal control system for Xbox 360 have driven advancements in natural user interfaces (NUI); the system is, in essence, the Wii of the future.
Check out the video for more details…