Blog

Context Awareness with Inertial Sensors

Gimbal™, a context-aware mobile platform and SDK recently introduced by Qualcomm, heralds more intuitive communication between consumers and their smart mobile devices. No longer must users tediously program in all their preferences and rules. Instead, mobile devices will be able to track and learn the habits of their users, understand their interests, and adapt to their actions.

Gimbal currently features location awareness through GPS-based geo-fencing (a virtual electronic perimeter), as well as electronic information such as purchasing history, to profile users’ interests. But that is just a starting point: a smartphone already contains many more sources of information that these devices can and will leverage.

For example, using geo-fencing, a store could push a discount coupon to targeted consumers whenever they are within a two-minute walk. And by looking at data from the inertial sensors on customers’ smartphones, which reveal walking and standing patterns, that store could also determine the average time customers had to wait in line at the cash register.
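To make the second idea concrete, here is a minimal sketch of how standing time might be tallied from raw accelerometer samples. The function names, the 1-second windows, and the variance threshold are all illustrative assumptions, not part of any real SDK; production activity classifiers are far more sophisticated.

```python
# Hypothetical sketch: estimate time spent standing (e.g., waiting in a
# checkout line) from accelerometer samples. A simple variance threshold
# on acceleration magnitude separates "standing still" from "walking".

def accel_magnitude(sample):
    """Magnitude of a 3-axis accelerometer sample (x, y, z) in m/s^2."""
    x, y, z = sample
    return (x * x + y * y + z * z) ** 0.5

def is_standing(window):
    """Low variance in acceleration magnitude suggests the user is still."""
    mags = [accel_magnitude(s) for s in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return var < 0.05  # illustrative threshold, in (m/s^2)^2

def standing_seconds(windows, window_seconds=1.0):
    """Total time classified as standing across fixed-length windows."""
    return sum(window_seconds for w in windows if is_standing(w))

still = [(0.0, 0.0, 9.81)] * 10                      # at rest: gravity only
walking = [(0.0, 0.0, 9.81), (1.5, 0.5, 10.5)] * 5   # fluctuating readings
print(standing_seconds([still, walking]))            # 1.0
```

Averaging this per-customer standing time near the register is one plausible route to the wait-time statistic described above.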

And not only could a consumer scan a coupon with a camera in order to rent a movie, but his smartphone could also know to mute ringtones when the consumer is sitting down to watch that movie. Furthermore, a truly intelligent smartphone would mute its ringtone not just based on the start time of the movie, but after confirming that the user indeed has his phone on his person so that he could feel its vibrations.

Until technical advances bring the nominal battery life of smartphones to significantly over 24 hours, deployment of context awareness needs to be very power-conscious. Inertial sensors like accelerometers, magnetometers and pressure sensors are power efficient and could be used to throttle higher-powered peripherals to reduce overall system power. For example, sensors can determine that a user has not moved beyond a 10-foot radius of an initial position, so there is no need to waste battery life getting a new GPS fix or refreshing the Wi-Fi profile for a mobile device.
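The throttling idea above can be sketched in a few lines. This is a hypothetical illustration, not Gimbal code: the function names, the 10-foot radius, and the stride-length heuristic are assumptions chosen to show the gating logic.

```python
# Hypothetical sketch: gate a high-power peripheral (the GPS receiver) on
# low-power inertial data. If a pedometer-style step count bounds the user's
# displacement inside a small radius, skip the expensive GPS refresh.

STATIONARY_RADIUS_FT = 10.0  # skip GPS refresh while inside this radius

def estimate_displacement_ft(step_count, stride_ft=2.5):
    """Crude upper bound on displacement from a step count and stride length."""
    return step_count * stride_ft

def should_refresh_gps(steps_since_last_fix):
    """Refresh the GPS fix only if the user may have left the radius."""
    return estimate_displacement_ft(steps_since_last_fix) > STATIONARY_RADIUS_FT

# 3 steps since the last fix bounds travel at ~7.5 ft, so the power-hungry
# GPS receiver can stay asleep; 10 steps (~25 ft) triggers a refresh.
print(should_refresh_gps(3))   # False
print(should_refresh_gps(10))  # True
```

The same pattern applies to Wi-Fi scans or cell-tower lookups: a few microamps of accelerometer sampling stands in for milliamps of radio activity whenever the user is effectively stationary.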

Sensor Platforms is contributing to these and other user contexts by working to interpret sensor data to understand users’ activities, their environments and even their intents. As the industry develops a rich set of user contexts, we can look forward to device interfaces that adapt to users instead of the other way around.

Hidden gems

On July 11th 2014, Audience completed the acquisition of Sensor Platforms. Audience believes the combination of its Advanced Voice and Multisensory Processing with Sensor Platforms’ technology places the combined company in a unique position to deliver compelling solutions based on the fusion of voice and motion.

A multisensory user interface with context awareness will bring fresh differentiation and end-user value to the market for smartphones, wearables and other mobile devices. Sensor Platforms has developed key technologies and software infrastructure in this space, and the combination of our engineering teams will enable us to rapidly scale our capabilities in context-aware user interfaces.

Audience welcomes the Sensor Platforms team and thanks all of its partners for their continued support during this transition.