The second-generation iPad, speculated to launch next year, may not feature anything quite as revolutionary as this, but from what we see, a gesture-based navigation system – like Kinect – could definitely be coming somewhere down the line. Earlier this month, Apple acquired a number of patents in the area of gesture control, signalling that such a feature could eventually make its way to iDevices.
Now, Elliptic Labs, a Norway-based company, is reportedly set to demo such a technology for the iPad at the Consumer Electronics Show early next month. It must be noted that Elliptic Labs is organizing this demo independently, without any apparent partnership with Apple. However, considering that Apple has already expressed interest in the area, the demo will give us an idea of where the iPad is headed in the future.
Here is a video preview of the technology developed by Elliptic Labs.
If you need further confirmation of the upcoming Microsoft Pink phones, it is probably this. In a patent application filed by Redmond this week, the company elaborates on a gesture recognition module that will enable users to control parameters using gestures instead of the conventional taps on a touchscreen phone. The inventors explain:
“One drawback with such devices (touchscreen) is that they are difficult to interact with when the user cannot, or prefers not to, visually examine the screen. For example, when a user is exercising, riding a subway train, etc., the user may find it inconvenient or undesirable to look at the screen for extended periods of time. This may result in input errors by the user, or cause the user to look at the screen during at an undesirable time, generally frustrating the user experience.”
The patent application describes how this problem can be solved by letting users toggle between a touchscreen mode and a gesture recognition mode:
“In the relative gesture recognition mode, the graphical user interface elements in at least a defined region of the graphical user interface are made to be unselectable. The computer program may further include a gesture-based control module configured, in the relative gesture recognition mode, to recognize a contact point on the touch screen sensor between a digit of a user and a surface of the touch screen sensor in the defined region in which the graphical user interface elements are unselectable, and to present in the defined region a gesture control proximate to the contact point. The gesture-based control module may further be configured to identify a detected gesture based on user touch input originating from the contact point, and to send a message to an application program to adjust an operation of the portable electronic device based on the detected gesture.”
It is said that this kind of input can be used to adjust parameters like volume without having to look at the screen.
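To make the quoted mechanism more concrete, here is a minimal, purely illustrative sketch of how such a mode toggle and relative gesture could work. This is not Microsoft's code, and all class and method names are hypothetical; it only mirrors the flow the application describes: enter gesture mode, anchor a contact point, interpret subsequent movement relative to it, and adjust a parameter such as volume.

```python
class GestureModeDevice:
    """Hypothetical device that toggles between a normal touch-screen mode
    and a 'relative gesture recognition' mode, as the patent describes."""

    TOUCH_MODE = "touch"
    GESTURE_MODE = "gesture"

    def __init__(self):
        self.mode = self.TOUCH_MODE
        self.contact_point = None   # anchor for relative gestures
        self.volume = 50            # example parameter to adjust (0-100)

    def toggle_mode(self):
        """Switch between touchscreen mode and gesture recognition mode."""
        self.mode = (self.GESTURE_MODE if self.mode == self.TOUCH_MODE
                     else self.TOUCH_MODE)
        self.contact_point = None

    def touch_down(self, x, y):
        if self.mode == self.GESTURE_MODE:
            # In gesture mode, UI elements in the region are unselectable;
            # the first contact simply becomes the gesture's origin.
            self.contact_point = (x, y)

    def touch_move(self, x, y):
        if self.mode == self.GESTURE_MODE and self.contact_point:
            # Interpret vertical movement relative to the contact point
            # as a volume adjustment -- no need to look at the screen.
            dy = self.contact_point[1] - y          # upward drag = positive
            self.volume = max(0, min(100, self.volume + dy // 10))
            self.contact_point = (x, y)             # re-anchor for a continuous drag
```

The key idea is that the gesture is measured relative to wherever the finger lands, so the user never has to hit a specific on-screen target.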