Context is More Than Gestures

Gestures and context have been getting a lot of attention in the sensor industry recently, and although the two are related, there are important distinctions.

  • Gestures are a form of non-verbal communication based on an action or movement. They are instantaneous and self-contained, for example, a hand-wave.
  • Context is the set of circumstances surrounding a particular event or situation. It takes advantage of historical information not always captured by gestures, for example, recognizing that the hand-wave is someone waving goodbye at a train station.

Although motion sensors can be used to identify both gestures and contexts, the techniques needed are different. Gesture algorithms often use sensor fusion to match a 3-D trajectory or a deterministic pattern. Gestures such as "shake to undo" on the iPhone can lead to a poor user experience: learning these artificial gestures is ad hoc, and false positives are frustrating to the user.
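To make the distinction concrete, here is a minimal sketch of the deterministic-pattern style of gesture matching described above, applied to shake detection. All thresholds and the sliding-window approach are illustrative assumptions, not Apple's actual algorithm; real gesture engines fuse gyroscope data and match full 3-D trajectories.

```python
from collections import deque

GRAVITY = 9.81                    # m/s^2
SHAKE_THRESHOLD = 2.5 * GRAVITY   # spike magnitude that counts as a jerk (assumed)
MIN_JERKS = 3                     # direction reversals needed within the window
WINDOW = 20                       # samples per window (e.g. ~0.4 s at 50 Hz)

def detect_shake(samples):
    """Deterministic pattern matcher: returns True once enough
    high-magnitude acceleration spikes with alternating sign occur
    inside a short sliding window of (ax, ay, az) samples."""
    window = deque(maxlen=WINDOW)
    for sample in samples:
        window.append(sample)
        jerks = 0
        last_sign = 0
        for x, y, z in window:
            magnitude = (x * x + y * y + z * z) ** 0.5
            if magnitude > SHAKE_THRESHOLD:
                sign = 1 if x >= 0 else -1  # sign along the dominant axis
                if sign != last_sign:       # count only direction reversals
                    jerks += 1
                    last_sign = sign
        if jerks >= MIN_JERKS:
            return True
    return False
```

Note how brittle this is: the fixed thresholds are exactly why artificial gestures produce the false positives mentioned above, since any vigorous but unintended motion can match the pattern.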

A context-aware platform takes in more of the situation to understand user motion in a natural way. The foundation is a good mechanism for encompassing the multitude of variations in signals, one that does not rely on the user learning prescribed gestures. Still, gestures can help indicate a change of context. For example, standing up from a chair is a natural gesture and could point to the Posture context of standing or walking. Taking a phone out of a pocket is another natural gesture and could point to the Carry context of being held in hand or placed on a table.
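In contrast to the one-shot pattern match of a gesture detector, a context classifier looks at the statistics of a longer history of motion. The toy Carry-context classifier below illustrates the idea; the thresholds and labels are assumptions for illustration only, not values from the FreeMotion Library.

```python
import statistics

def classify_carry(accel_magnitudes):
    """Toy Carry-context classifier over a few seconds of
    acceleration magnitudes (m/s^2). Rather than matching a
    trajectory, it summarizes the history: a device at rest on a
    table shows almost no variation, a hand-held device shows
    slight tremor, and anything else is treated as in motion."""
    spread = statistics.pstdev(accel_magnitudes)
    if spread < 0.05:
        return "on table"      # essentially no vibration
    if spread < 1.0:
        return "held in hand"  # small hand tremor
    return "in motion"         # walking, in a pocket, etc.
```

A production system would of course fuse several low-power sensors and use richer features than a single standard deviation, but the structure (summarize history, then classify) is what distinguishes context from gesture recognition.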

Our FreeMotion Library incorporates a very power-efficient architecture to determine the underlying context. By utilizing low-power sensors and efficient algorithms, we are enabling always-on mobile platforms which will better understand the user’s intent.

On July 11, 2014, Audience completed the acquisition of Sensor Platforms. Audience believes the combination of its Advanced Voice and Multisensory Processing with Sensor Platforms' technology places the combined company in a unique position to deliver compelling solutions based on the fusion of voice and motion.

Multisensory user interfaces with context awareness will bring fresh differentiation and end-user value to the market for smartphones, wearables, and other mobile devices. Sensor Platforms has developed key technologies and software infrastructure in this space, and the combination of our engineering teams will enable us to rapidly scale our capabilities in context-aware user interfaces.

Audience welcomes the Sensor Platforms team and thanks all of its partners for their continued support during this transition.