A recent Microsoft patent application published by the US Patent & Trademark Office relates to an all-new sensory system that Microsoft calls the "Interpretation and Behavior Selection Module" (IBSM), which we first introduced in back-to-back patent reports in June 2012. Microsoft's latest patent introduces us to another dimension of this system: Multi-Touch-Movement (MTM) gestures. The gestures could come preset, or individual developers could assign them different actions to perform.
Microsoft Invents New Mobile Device Touch Gestures
In order to provide a robust user interface for mobile devices, Microsoft states that a developer may wish to expand the number of gestures that a handheld mobile device is able to recognize. Today, a developer may find that the design space of available gestures is too limited, and may therefore find it difficult to formulate a gesture that is suitably distinct from existing ones. The developer may create an idiosyncratic and complex gesture to distinguish it from existing gestures, but an end user may have trouble remembering and executing such a gesture.
Microsoft focuses on new 'Multi-Touch-Movement' Gestures
Microsoft's invention focuses on its new 'multi-touch-movement' (MTM) gesture, which is technically defined this way:
"Receiving a touch input event from at least one touch input mechanism in response to the user making contact with a surface of the computing device; receiving a movement input event from at least one movement input mechanism in response to movement of the computing device; determining whether the touch input event and the movement input event indicate that a user has made a multi-touch-movement (MTM) gesture.
"A user performs a MTM gesture by touching a surface of the touch input mechanism to establish two or more contacts, in conjunction with moving the computing device in a prescribed manner. The functionality defines an action space in response to the determining operation, where the two or more contacts demarcate the action space. The functionality may then perform an operation that affects the action space."
For example, a user may perform an MTM gesture by applying at least two fingers to the display surface of a touchscreen interface mechanism. The user may then tilt the computing device from a starting position in a telltale manner while maintaining their fingers on the display surface. Upon receiving input events that describe these actions, the functionality can conclude that the user has performed an MTM gesture.
For example, the functionality can define an action space that is demarcated by the user's two fingers on the display surface. The functionality can then perform any action associated with the MTM gesture, such as selecting an object encompassed by the action space that the user has demarcated with his or her fingers.
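The detection logic described above can be sketched in code. This is a minimal illustration under assumed names and thresholds (`TouchState`, `MovementState`, and the 25-degree tilt threshold are our inventions, not the patent's): two or more contacts plus a prescribed device movement yield an MTM gesture, and the contacts demarcate the action space.

```python
from dataclasses import dataclass

# Hypothetical event types standing in for the patent's touch input
# mechanism and movement input mechanism (names are illustrative).
@dataclass
class TouchState:
    contacts: list  # [(x, y), ...] current finger positions

@dataclass
class MovementState:
    tilt_degrees: float  # device tilt relative to its starting position

TILT_THRESHOLD = 25.0  # assumed minimum tilt to count as deliberate movement

def detect_mtm(touch: TouchState, movement: MovementState):
    """Return the demarcated action space if an MTM gesture is detected.

    Per the patent summary: two or more contacts, in conjunction with
    moving the device in a prescribed manner; the contacts demarcate
    the action space.
    """
    if len(touch.contacts) < 2:
        return None  # need at least two fingers on the surface
    if abs(movement.tilt_degrees) < TILT_THRESHOLD:
        return None  # device was not moved in the prescribed manner
    # Action space: the bounding box demarcated by the contacts.
    xs = [x for x, _ in touch.contacts]
    ys = [y for _, y in touch.contacts]
    return (min(xs), min(ys), max(xs), max(ys))
```

For instance, `detect_mtm(TouchState([(100, 200), (300, 400)]), MovementState(30.0))` would return the rectangle spanned by the two fingers, while a single contact or an insufficient tilt would return nothing.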
According to another illustrative aspect, the functionality can classify a user's gesture as an MTM gesture even though the user's fingers may have slipped on the display surface in the course of moving the computing device.
According to another illustrative aspect, the functionality can distinguish between MTM gestures and large movements performed by the user while handling the computing device for non-input-related purposes. For example, the functionality can distinguish between MTM gestures and movements produced when the user picks up and sets down the computing device.
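These two aspects, tolerating finger slip and rejecting incidental handling movement, might be approximated with simple thresholds. The tolerance and energy values below are assumptions for illustration; the patent does not specify how the distinction is made.

```python
import math

SLIP_TOLERANCE = 40.0    # px: fingers may drift this much and still count
MAX_MOTION_ENERGY = 5.0  # assumed cutoff separating a controlled tilt
                         # from picking the device up or setting it down

def contacts_held(start_contacts, end_contacts, tolerance=SLIP_TOLERANCE):
    """True if each finger stayed near where it started, allowing slip."""
    if len(start_contacts) != len(end_contacts):
        return False  # a finger was lifted or added mid-gesture
    return all(
        math.dist(s, e) <= tolerance
        for s, e in zip(start_contacts, end_contacts)
    )

def is_incidental(motion_energy):
    """Large, jerky movement suggests handling, not a gesture."""
    return motion_energy > MAX_MOTION_ENERGY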
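These two aspects, tolerating finger slip and rejecting incidental handling movement, might be approximated with simple thresholds. The tolerance and energy values below are assumptions for illustration; the patent does not specify how the distinction is made.

```python
import math

SLIP_TOLERANCE = 40.0    # px: fingers may drift this much and still count
MAX_MOTION_ENERGY = 5.0  # assumed cutoff separating a controlled tilt
                         # from picking the device up or setting it down

def contacts_held(start_contacts, end_contacts, tolerance=SLIP_TOLERANCE):
    """True if each finger stayed near where it started, allowing slip."""
    if len(start_contacts) != len(end_contacts):
        return False  # a finger was lifted or added mid-gesture
    return all(
        math.dist(s, e) <= tolerance
        for s, e in zip(start_contacts, end_contacts)
    )

def is_incidental(motion_energy):
    """Large, jerky movement suggests handling, not a gesture."""
    return motion_energy > MAX_MOTION_ENERGY
```

A fingertip drifting 15 pixels during the tilt would still pass `contacts_held`, while the violent motion profile of setting the device down would trip `is_incidental`.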
Microsoft states that the system is designed to work with mobile devices such as a smartphone, an electronic book reader, a portable digital assistant, a tablet-type or slate-type computing device, a portable game console, a laptop, a netbook-type computing device, and so on.
The Brains of the System: The IBSM Module
The brains behind Microsoft's new MTM gestures is the Interpretation and Behavior Selection Module (IBSM), as noted below in patent FIG. 2. The IBSM performs the task of interpreting input events. In particular, the IBSM receives touch input events from the touch input mechanism and movement input events from the movement input mechanism. Based on these input events, the IBSM determines whether the user has made a recognizable gesture. If a gesture is detected, the IBSM executes the behavior associated with that gesture.
Microsoft's patent FIG. 2 provides additional details regarding the IBSM. For instance, the gesture matching module 202 compares the input events with a collection of signatures that describe different telltale ways that a user may interact with their computing device. More specifically, a signature may provide any descriptive information which characterizes the touch input events and/or motion input events that are typically produced when a user makes a particular kind of gesture.
For example, a signature may indicate that a gesture X is characterized by a pattern of observations A, B, and C. You can see this more clearly in patent FIG. 12 below. The user can choose to create a "signature" gesture by holding down two points on the display while adding a motion or tilt such as the one illustrated in patent FIG. 5. The combination of two touch points and a movement will be interpreted by the system as a new user-specific command.
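The gesture matching module's comparison of input events against a collection of signatures can be pictured as a simple lookup. The signature names and observation labels below are made-up stand-ins for whatever descriptive information the real module would use.

```python
# Illustrative sketch of the gesture matching module: each signature
# lists the telltale observations (the patent's "pattern of
# observations A, B, and C") that characterize a gesture. The matcher
# returns the first gesture whose required observations all appear in
# the input events.

SIGNATURES = {
    "turn_page":   {"two_contacts", "tilt_right"},
    "delete_file": {"two_contacts", "tilt_forward"},
    "volume_up":   {"two_contacts", "tilt_left"},
}

def match_gesture(observations):
    """Return the name of the first gesture whose signature is satisfied."""
    for name, required in SIGNATURES.items():
        if required <= observations:  # all required observations present
            return name
    return None
```

Holding two fingers down while tilting the device forward would yield observations `{"two_contacts", "tilt_forward"}` and match the hypothetical "delete_file" signature.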
The new commands could be set to turn a page, delete a file, open an app, increase or decrease volume and more. According to Microsoft, the user is free to define an action space on any portion of any surface of their computing device. In other words, the IBSM implicitly enables the user to make MTM gestures and non-MTM gestures at any location of the display.
Microsoft's introduction places some emphasis on the developer by stating that "a developer may wish to expand the number of gestures that a handheld mobile device is able to recognize." With that in mind, think of an Xbox developer. They could program the buttons on an Xbox controller to represent whatever game movements they desire: the X-button, for example, could make a character jump or shoot. Developers could combine buttons, too. Applied to mobile devices, the developer of a specific app could assign a certain "2-point multi-touch + motion" gesture to deliver a certain function, such as deleting a file.
Below are a series of Microsoft patent figures to help you visualize some of the new gestures that you could find fully described in their patent.
To truly grasp and appreciate the vision behind Microsoft's Interpretation and Behavior Selection Module, the reader would have to review the entire series of reports that we've presented thus far on this subject, as this is a very detailed plan.
One can really appreciate the depth of planning that has gone into this project at Microsoft. Yet with Windows 8 still in a slow uptake phase, these advances will likely have to trickle in or be put on hold until the time is right. Microsoft can't afford to shock its user base with yet more confusing interface elements and/or revolutionary gestures. In the big picture, though, this system is definitely going to advance gestures in creative ways.
A Note for Tech Sites covering our Report: We ask tech sites covering our report to kindly limit the use of our graphics to one image. Thanking you in advance for your cooperation.
Patent Bolt presents a detailed summary of patent applications with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trade Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details. Revelations found in patent applications shouldn't be interpreted as rumor or fast-tracked according to rumor timetables. About Comments: Patent Bolt reserves the right to post, dismiss or edit comments.
Another related patent IBSM report: Microsoft Bringing Fingerprint Detection to "Surface" Tablets