Tec F.I.R – Pre-Touch Sensing for Mobile Interaction
Ok, so there is a lot of AI (Artificial Intelligence) floating around, and as laypeople it is difficult for us to understand what it does and how it works. Or should I say that doesn't matter, since as end users we should still know how it works?
So, Microsoft's precognitive touch screen research is all about the sense of touch. The moment you move your hand or finger towards the screen, the device or this program will bring up the options that can be performed on that screen.
Below are extracts from the Microsoft blog.
A touch of virtual reality – Andy Wilson
Advances in virtual reality have thus far mainly been in the realm of optics, rendering and audio technologies. But improving haptics — the sense of touch — remained elusive until now.
The key objective in virtual reality is establishing a sense of presence. It’s easy to suspend disbelief if the environment looks real, but when you reach out to touch a virtual object and your hand goes through it the illusion is shattered. And the interaction itself changes the virtual scene, requiring the whole environment to be redrawn or rebuilt.
A new framework called haptic retargeting essentially “hacks” human perception and leverages the dominance of vision when our senses conflict. This allows for the development of much more complex virtual environments that can have many more virtual objects with which to interact.
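The idea of "hacking" perception can be sketched in code. A common way to realize haptic retargeting is body warping: as your physical hand reaches for one physical prop, the rendered hand is gradually offset so it arrives at a different virtual object, and vision wins over proprioception. The function below is a minimal illustrative sketch of that blend, with hypothetical names and a simple linear interpolation — not the paper's exact formulation.

```python
import numpy as np

def retarget_hand(real_hand, real_prop, virtual_target, start):
    """Body-warping sketch (illustrative): shift the rendered hand so
    that when the physical hand reaches the physical prop, the virtual
    hand reaches the virtual target."""
    total = np.linalg.norm(real_prop - start)
    traveled = np.linalg.norm(real_hand - start)
    # Blend factor grows from 0 (reach begins) to 1 (prop is touched)
    alpha = np.clip(traveled / total, 0.0, 1.0) if total > 0 else 1.0
    # The visual offset the user never consciously notices
    offset = virtual_target - real_prop
    return real_hand + alpha * offset
```

At the start of the reach the virtual hand matches the real one; by the time the fingers touch the physical prop, the virtual hand has drifted onto the virtual object, so one prop can stand in for many objects in the scene.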
Andy Wilson, a principal researcher at Microsoft, said the long-term implications of this research will be limited only by the imagination of designers and developers.
The research is detailed in Haptic Retargeting: Dynamic Repurposing of Passive Haptics for Enhanced Virtual Reality Experiences and was developed in collaboration with the University of Southern California and the University of Waterloo.
Precognitive touch screens
Imagine a mobile device that intelligently anticipates your intended action even before you touch the screen.
That’s what’s being presented in Pre-Touch Sensing for Mobile Interaction. Ken Hinckley, a principal researcher at Microsoft who led the project, said the research is based on a whole different philosophy of interaction design.
The research uses the phone’s ability to sense how you are gripping the device as well as when and where the fingers are approaching it.
“It uses the hands as a window to the mind,” Hinckley said.
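To make the idea concrete, here is a minimal sketch of how an app might react to grip and an approaching finger before any contact happens. The grip labels, the hover-height threshold, and the control names are all illustrative assumptions, not the actual design from the research.

```python
def anticipate_ui(grip, finger_xy, finger_height_mm):
    """Pre-touch sketch (illustrative): decide which controls to fade in
    before the finger lands, given grip posture and a hovering finger
    position sensed above the screen."""
    if finger_height_mm > 30:        # finger still far away: keep the UI clean
        return []
    controls = ["play/pause"]        # surface a basic control once a finger nears
    if grip == "one-handed":
        # thumb reach is limited, so favor controls on the thumb side
        controls.append("thumb-side seek bar")
    elif grip == "two-handed":
        controls.append("full toolbar")
    x, y = finger_xy                 # normalized 0..1 screen coordinates
    if x > 0.8:                      # finger drifting toward the right edge
        controls.append("scrollbar")
    return controls
```

With nothing hovering, the interface stays uncluttered; the controls appear only as the hand approaches, which is the "window to the mind" Hinckley describes.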