Context Improves Activity Monitoring

A major part of the quantified self movement is the use of activity monitors such as the Fitbit, Nike FuelBand, or Jawbone UP. These devices use a low-power accelerometer to determine activity level and calorie count from detected motion. Sometimes a barometer is also used to detect vertical movement, such as walking up or down stairs.

There are several ways these monitors can be fooled into producing inaccurate results. Some examples include:

  • False steps: swinging the device in hand, for example, can register steps that were never taken.
  • Non-user motion: bumps during a car ride, for example, are often registered as user activity.
  • Anomalous pressure changes: pressure often changes for reasons other than vertical displacement, for example when entering an air-conditioned building from outside.

These misidentifications happen when algorithms treat motion only instantaneously. Activity, however, is not an instantaneous event: no one goes from a swim stroke to a tennis swing from one second to the next. A contextual understanding of the activity is therefore more appropriate.

Our FreeMotion Library draws on multiple aspects of the sensor data to build a consistent context history. One way this can be used is to improve activity identification. First, user context can be checked against the instantaneous signal: if the user is sitting or riding in a car when a step is detected, that step can be discarded as false. Second, knowing where the device is located on the user (in hand, in pocket, or on arm) allows a tailored activity-monitoring algorithm to be built for each case. Overall, including user context allows a more accurate determination of user activity, which will be welcomed by those striving to better quantify themselves.
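To illustrate the first idea, here is a minimal sketch of context-gated step counting. All names here (the `Posture` states and `count_steps` function) are hypothetical and do not reflect the FreeMotion Library's actual API; the point is simply that a step candidate is accepted only when the user's context is consistent with walking.

```python
from enum import Enum, auto

class Posture(Enum):
    """Hypothetical user-context states inferred from the sensor history."""
    SITTING = auto()
    WALKING = auto()
    IN_VEHICLE = auto()

def count_steps(step_candidates, postures):
    """Accept a detected step only when the concurrent user context
    is consistent with taking a step.

    step_candidates: per-sample booleans from an instantaneous step detector.
    postures: per-sample Posture values from the context history.
    """
    steps = 0
    for is_step, posture in zip(step_candidates, postures):
        # Discard candidates while sitting or riding in a vehicle:
        # these are likely hand swings or road bumps, not real steps.
        if is_step and posture not in (Posture.SITTING, Posture.IN_VEHICLE):
            steps += 1
    return steps

# Four raw detections, but only two occur while the user could be walking.
candidates = [True, True, True, True]
contexts = [Posture.WALKING, Posture.SITTING,
            Posture.IN_VEHICLE, Posture.WALKING]
print(count_steps(candidates, contexts))  # → 2
```

A naive counter would report four steps here; gating on context cuts the count to the two detections that occurred while the user was plausibly walking.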

On July 11, 2014, Audience completed its acquisition of Sensor Platforms. Audience believes the combination of its Advanced Voice and Multisensory Processing with Sensor Platforms' technology places the combined company in a unique position to deliver compelling solutions based on the fusion of voice and motion.

A multisensory user interface with context awareness will bring fresh differentiation and end-user value to the market for smartphones, wearables, and other mobile devices. Sensor Platforms has developed key technologies and software infrastructure in this space, and the combination of our engineering teams will enable us to rapidly scale our capabilities in context-aware user interfaces.

Audience welcomes the Sensor Platforms team and thanks all of its partners for their continued support during this transition.