The proposed technology would also allow cameras or other visual sensors to intelligently recognize the user's intentions, such as through the user's gaze at a particular object onscreen, the profile of their hands relative to the display, or the mood expressed by the user's face. A look of frustration, Apple says, could tell a computer that it is misinterpreting a command.
Additional input could come from the user's grip: the system could modify a touched object based on the angle and motion of the device itself, the size of the user's fingers and hands, and biological data such as body temperature or heart rate.
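The behaviors described above can be imagined as a simple sensor-fusion step. The sketch below is purely illustrative and not from the patent itself: the `SensorFrame` type, thresholds, and the `infer_intent` function are all invented names, showing how gaze, mood, and grip signals might map to a coarse intent label.

```python
# Hypothetical sketch of fusing gaze, mood, and grip signals into an
# intent estimate, loosely modeled on the behaviors the patent describes.
# All names and thresholds here are assumptions for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorFrame:
    gaze_target: Optional[str]  # on-screen object the user is looking at
    mood: str                   # e.g. "neutral", "frustrated"
    tilt_degrees: float         # device angle from motion sensors
    heart_rate_bpm: float       # biological signal read through the grip

def infer_intent(frame: SensorFrame) -> str:
    """Return a coarse intent label from one frame of sensor data."""
    # A frustrated expression suggests the last command was misread.
    if frame.mood == "frustrated" or frame.heart_rate_bpm > 110:
        return "undo-last-interpretation"
    # A sustained gaze marks the looked-at object as the likely target.
    if frame.gaze_target is not None:
        return f"focus:{frame.gaze_target}"
    # A strong tilt could modify the touched object, as the patent suggests.
    if abs(frame.tilt_degrees) > 30:
        return "adjust-by-tilt"
    return "no-op"

print(infer_intent(SensorFrame("photo-thumbnail", "neutral", 5.0, 72.0)))
```

In a real implementation each of these signals would come from separate camera and sensor pipelines; the point of the sketch is only that disparate inputs reduce to a single interpreted intent.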
Apple describes the possibilities for these systems broadly, saying they could apply to cellphones, computers, and portable media players, among other devices. It specifically notes, however, that the camera needed for the visual aspects of the system could be the iSight camera in one of its MacBook computers, and it provides an illustration suggesting that an iMac or a stand-alone display with a camera could also be used.
The inventors are listed as Wayne Westerman and John Elias, founders of the pioneering multi-touch firm FingerWorks, which was ultimately acquired by Apple and has been the source of several Apple multi-touch patents.