The latest chatter, just ahead of Apple's (AAPL) launch, is that the iPad 3 will have haptic touch.
What does that mean? It means when you touch the screen, you can feel the icon.
A company called Senseg was reportedly commissioned to bring a special kind of "textured" touchscreen technology to the iPad 3. It uses haptic feedback to make the screen feel like a variety of different surfaces.
For example, if an app displays sand, the screen will feel rough. If it displays silk, it’ll feel smooth.
Senseg's technology was on display at the Mobile World Congress in Barcelona, and many were impressed.
Here is more detail on how haptic touch works, drawn from Apple patent applications compiled by MacRumors.
Haptic Tactile Feedback
Perhaps most interesting amongst the patent applications is the acknowledgement by Apple that despite the many advantages of the iPhone’s multi-touch screen, a lack of tactile feedback remains its biggest disadvantage:
However, one of a touchscreen’s biggest advantages (i.e., the ability to utilize the same physical space for different functions) is also one of a touchscreen’s biggest disadvantages. When the user is unable to view the display (because the user is occupied with other tasks), the user can only feel the smooth hard surface of the touchscreen, regardless of the shape, size and location of the virtual buttons and/or other display elements. This makes it difficult for users to find icons, hyperlinks, textboxes or other user-selectable input elements that are being displayed, if any are even being displayed, without looking at the display.
Unless touch input components are improved, users that, for example, drive a motor vehicle, may avoid devices that have a touch input component and favor those that have a plurality of physical input components (e.g., buttons, wheels, etc.).
The proposed solution is the adoption of "haptic" display technologies, which allow for some tactile feedback from touchscreen displays. Apple proposes including a grid of piezoelectric actuators that can be activated on command. By fluctuating the frequency of these actuators, the user will "feel" different surfaces as their finger moves across the screen. As an example, a display could include a virtual click wheel whose ring vibrates at a different frequency than its center. Users could easily sense the difference and use the click wheel without having to look at it.
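The click-wheel idea can be sketched in a few lines. This is purely illustrative, assuming a lookup from touch coordinates to a per-region drive frequency; the region geometry, frequencies, and function names are invented for the example and are not from Apple's patent.

```python
# Illustrative sketch of the patent's idea: each on-screen region drives
# the piezoelectric actuators under it at a different frequency, so a
# user can tell regions apart by feel alone. All values are hypothetical.

RING_FREQ_HZ = 200    # assumed frequency for the scroll ring
CENTER_FREQ_HZ = 75   # assumed frequency for the center button

WHEEL_CENTER = (160, 240)   # screen coordinates of the wheel's center
CENTER_RADIUS = 40          # inner button radius, in pixels
RING_RADIUS = 120           # outer edge of the ring, in pixels

def actuator_frequency(x, y):
    """Return the drive frequency for the actuator under a touch at (x, y)."""
    dx, dy = x - WHEEL_CENTER[0], y - WHEEL_CENTER[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= CENTER_RADIUS:
        return CENTER_FREQ_HZ   # finger is on the center button
    if dist <= RING_RADIUS:
        return RING_FREQ_HZ     # finger is on the scroll ring
    return 0                    # off the wheel: no vibration

print(actuator_frequency(160, 240))  # center press -> 75
print(actuator_frequency(160, 340))  # ring press, 100 px below center -> 200
print(actuator_frequency(10, 10))    # off the wheel -> 0
```

Because the two frequencies differ, a finger sliding from the ring onto the center button would feel the vibration change without the user ever glancing at the display.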
Haptic technology has started gaining adoption in other mobile phones, and there has been some talk that Apple might be looking to adopt it.
Fingerprint Identification as an Input Method
A second very intriguing patent application suggests the detection of a user’s individual fingerprints as an input method. Fingerprints have already been used in computers for security purposes, but Apple’s research involves the use of fingerprint patterns to actually identify distinct fingers. This could then be used to produce specific functions depending on which finger is being used. As shown in the table below, an index finger press might perform one action (PLAY/STOP) while a middle finger press could fast forward.
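The finger-to-command mapping described above amounts to a simple dispatch table. A minimal sketch, assuming the device can already report which finger touched the screen; the finger labels and the two actions come from the article's example, while the function and dictionary names are invented:

```python
# Hypothetical sketch of finger-specific commands: the same touch region
# triggers a different action depending on which finger is recognized.

FINGER_COMMANDS = {
    "index": "PLAY/STOP",       # index finger press toggles playback
    "middle": "FAST_FORWARD",   # middle finger press fast-forwards
}

def handle_press(finger):
    """Dispatch a press based on the identified finger; ignore unknown fingers."""
    return FINGER_COMMANDS.get(finger, "IGNORE")

print(handle_press("index"))   # -> PLAY/STOP
print(handle_press("middle"))  # -> FAST_FORWARD
print(handle_press("thumb"))   # -> IGNORE (no mapping defined)
```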
The reason for such a distinction again falls back on non-visual usage. Instead of requiring the user to find a button on the touchscreen, the use of different fingers alone could trigger different commands.
Finally, the last notable application covers the dual use of a touchscreen as an RFID reader. RFID tags are small circuits that can be embedded in objects for identification using a special reader. Apple suggests that an RFID antenna can be placed in the touch sensor panel itself, allowing it to also be used as an RFID reader. As RFID tags become more prevalent, this could add a very useful function to future touchscreen devices.
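From the software side, a panel that reads RFID tags would presumably hand each detected tag ID to the application layer for a tag-specific action. A minimal sketch under that assumption; the tag IDs, action names, and functions below are all invented for illustration and appear nowhere in the patent:

```python
# Hypothetical dispatch for tag reads from a touchscreen's RFID antenna.
# Tag IDs and actions are made up for the example.

TAG_ACTIONS = {
    "04:A3:1B:2C": "open_boarding_pass",
    "04:77:5E:90": "pair_accessory",
}

def on_tag_detected(tag_id):
    """Look up the action for a tag read by the panel's antenna."""
    return TAG_ACTIONS.get(tag_id, "unknown_tag")

print(on_tag_detected("04:A3:1B:2C"))  # -> open_boarding_pass
print(on_tag_detected("FF:FF:FF:FF"))  # -> unknown_tag
```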