So You Think You Can Walk

Indoor positioning using pedestrian dead reckoning (PDR) has received much academic and commercial interest over the years. Existing sensor-based solutions track position by summing the distances travelled in individual steps, each rotated by the direction of travel (the Bearing). Many methods expect users to keep the mobile sensing device stationary with respect to their body at all times, as if walking while balancing a piece of cake: the “cakewalk.” But indoor positioning on a smartphone needs to allow for natural movement, providing reasonable results regardless of how the phone is carried.
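
In its simplest form, the dead-reckoning update described above can be sketched as follows. The function and variable names are illustrative only, not part of any Sensor Platforms API, and a real implementation must also estimate step length and bearing from the raw sensor data.

```python
import math

def pdr_update(position, step_length, bearing_rad):
    """Advance a 2-D position by one detected step.

    position     -- (x, y) in metres, e.g. east/north from the last known fix
    step_length  -- estimated stride length for this step, in metres
    bearing_rad  -- direction of travel in radians (0 = +y, i.e. "north" here)
    """
    x, y = position
    return (x + step_length * math.sin(bearing_rad),
            y + step_length * math.cos(bearing_rad))

# Example: three 0.7 m steps heading roughly east (bearing ~ 90 degrees).
pos = (0.0, 0.0)
for _ in range(3):
    pos = pdr_update(pos, 0.7, math.radians(90))
print(pos)  # ~ (2.1, 0.0)
```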

Sensor Platforms’ advantage in PDR comes from combining traditional methods with the context results from our FreeMotion library. Context is derived from machine-learned classification of sensor signals and adds a new dimension to the solution. For example (illustrated in the sketch after this list):

  • Walking context is used to reject false steps caused by extraneous motion,
  • Carry context is used to provide constraints on the user-to-device attitude, as well as to determine the best step signatures for different Carry states, improving accuracy, and
  • Rotation context determines whether a detected rotation is user-only, user-to-device only, or a combination.
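
The list above can be made concrete with a toy gating function. Everything here — the threshold values, the carry-state names, and the function itself — is a hypothetical illustration of how context outputs might gate a step detector; it is not the FreeMotion API.

```python
# Illustrative carry-state-specific step "signatures": the minimum peak
# acceleration (m/s^2 above gravity) a candidate step must reach to count.
# The values are invented for this sketch; a real system would learn them.
STEP_THRESHOLDS = {"in_hand": 1.5, "in_pocket": 2.5, "in_bag": 2.0}

def accept_step(peak_accel, walking, carry_state, rotation_kind):
    """Decide whether a candidate step should update the PDR position.

    walking       -- True if the Walking context says the user is on foot
    carry_state   -- one of the keys in STEP_THRESHOLDS (Carry context)
    rotation_kind -- "user", "device", or "both" (Rotation context); a
                     device-only rotation should not change the user's bearing
    """
    if not walking:
        return False                      # extraneous motion, not a real step
    threshold = STEP_THRESHOLDS.get(carry_state, 2.0)
    if peak_accel < threshold:
        return False                      # too weak a signature for this carry state
    # The step is accepted; whether the bearing changes is a separate decision
    # driven by rotation_kind ("device" rotations leave the user bearing alone).
    return True
```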

Even with the most sophisticated algorithms, PDR is still dead reckoning and accumulates error over time. It therefore works best when combined with a beaconing system that periodically removes built-up drift. This adds the further requirement that the PDR algorithm maintain a reliable error model for communication with these other systems. The Sensor Platforms error model determines position uncertainty through a combination of standard Bayesian filtering techniques and the incorporation of unmodelled situations detected by our context algorithms.
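
As a rough illustration of why such an error model matters, the toy filter below grows a scalar position variance with every step and collapses it when a beacon measurement is fused in. It is a generic Kalman-style sketch with invented numbers, not the actual Sensor Platforms error model.

```python
class PdrErrorModel:
    """Toy 1-D error model: variance grows with each step and shrinks when
    an absolute fix (e.g. a beacon range) is fused in. A simplified
    Bayesian-filter sketch, not the Sensor Platforms model."""

    def __init__(self, position, variance):
        self.position = position      # 1-D position along a corridor, metres
        self.variance = variance      # position uncertainty, m^2

    def predict_step(self, step, step_variance):
        # Dead-reckoning prediction: position moves, uncertainty accumulates.
        self.position += step
        self.variance += step_variance

    def fuse_beacon(self, measured_position, measurement_variance):
        # Standard scalar Kalman update removes the built-up drift.
        gain = self.variance / (self.variance + measurement_variance)
        self.position += gain * (measured_position - self.position)
        self.variance *= (1.0 - gain)

# Drift builds over 50 steps, then one beacon fix collapses the uncertainty.
model = PdrErrorModel(position=0.0, variance=0.1)
for _ in range(50):
    model.predict_step(step=0.7, step_variance=0.01)
model.fuse_beacon(measured_position=34.0, measurement_variance=0.5)
print(model.position, model.variance)
```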

On July 11th 2014, Audience completed the acquisition of Sensor Platforms. Audience believes the combination of its Advanced Voice and Multisensory Processing with Sensor Platforms’ technology places the combined company in a unique position to deliver compelling solutions based on the fusion of voice and motion.

Multisensory user interfaces with context awareness will bring fresh differentiation and end-user value to the market for smartphones, wearables and other mobile devices. Sensor Platforms has developed key technologies and software infrastructure in this space, and the combination of our engineering teams will enable us to rapidly scale our capabilities in context-aware user interfaces.

Audience welcomes the Sensor Platforms team and thanks all of its partners for their continued support during this transition.