Market analysts now project that five billion MEMS sensors will be shipped in 2016 to support applications like navigation, dead reckoning, image stabilization, and augmented reality in smart phones, e-readers, tablets, and gaming platforms. Although these applications are all extremely useful, we think they represent only a fraction of the functions sensors will perform. After all, most consumers don’t navigate, take pictures, or play games for more than a few hours a day. But sensors and intelligent algorithms will be working all the time to help applications understand user context.
Today, smart phones and tablets use sensors to understand user context in a few primitive ways. Turn a tablet from portrait to landscape and the content reorganizes to fit the new orientation. Bring a smart phone to your ear and the touch screen turns off (OK, many phones still need to work on that one). But with more sophisticated algorithms and heuristics, the sensors can do much more.
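As a minimal sketch of how such a heuristic might look, the Python snippet below classifies orientation from a single three-axis accelerometer sample by checking which axis gravity dominates. The function, sample values, and thresholds are illustrative assumptions, not any platform’s actual API.

```python
import math

GRAVITY = 9.81  # m/s^2, nominal

def classify_orientation(ax: float, ay: float, az: float) -> str:
    """Classify device orientation from one accelerometer sample (m/s^2).

    A hypothetical heuristic: whichever axis carries most of gravity
    tells us how the device is being held.
    """
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if abs(magnitude - GRAVITY) > 2.0:
        return "moving"          # device is accelerating; defer the decision
    if abs(az) > 0.8 * GRAVITY:
        return "flat"            # lying face up or face down
    # Gravity mostly in the screen plane: compare the two in-plane axes.
    return "portrait" if abs(ay) > abs(ax) else "landscape"

# Example: gravity pulls along +y, so the device is upright in portrait.
print(classify_orientation(0.3, 9.7, 0.5))  # -> "portrait"
```

Real devices low-pass filter many samples and add hysteresis so the screen doesn’t flip back and forth at boundary angles, but the core decision is this simple comparison.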
How about a smart sensor system that knows when you are getting in or out of a car? For starters, users could send all incoming calls, except those from their families, to voicemail while they are driving. Then there are the “car finder” apps that can lead drivers back to their cars, provided they start the app after parking. We have been suspicious of the utility of such apps: if we had the presence of mind to start an app when we left the car, we could probably remember where we had left it without any navigation aid. It would be more useful if smart sensors automatically triggered the navigation system to remember where we got out of the car, covering those absent-minded moments we all have.
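One plausible way to build that trigger is a small state machine over speed estimates (say, from GPS fixes): sustained driving speed means we are in the car, and a drop to walking speed marks the exit, at which point the last driving location is saved. The sketch below assumes hypothetical thresholds and (latitude, longitude) tuples; it is not a real navigation API.

```python
from typing import Optional, Tuple

# Thresholds are illustrative assumptions, not calibrated values.
DRIVING_SPEED = 5.0   # m/s (~18 km/h): above this we assume "in a car"
WALKING_SPEED = 2.0   # m/s: below this we assume "on foot"

class ParkingDetector:
    """Toy state machine: watch speed samples and flag the car-exit moment."""

    def __init__(self) -> None:
        self.in_car = False
        self.last_car_fix: Optional[Tuple[float, float]] = None

    def update(self, speed: float,
               location: Tuple[float, float]) -> Optional[Tuple[float, float]]:
        """Feed one (speed, location) sample; return the remembered parking
        spot the first time a driving-to-walking transition is seen."""
        if speed >= DRIVING_SPEED:
            self.in_car = True
            self.last_car_fix = location   # keep updating while driving
        elif self.in_car and speed <= WALKING_SPEED:
            self.in_car = False
            return self.last_car_fix       # this is where we left the car
        return None

detector = ParkingDetector()
samples = [(12.0, (37.7749, -122.4194)),   # cruising
           (8.0,  (37.7751, -122.4190)),   # slowing down
           (1.2,  (37.7752, -122.4189))]   # walking away
for speed, fix in samples:
    spot = detector.update(speed, fix)
    if spot:
        print("Parked near", spot)
```

A production version would debounce the transition (for example, require walking speed over several consecutive samples) and fuse accelerometer patterns with the GPS fixes, but the shape of the heuristic is the same.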
A new smart phone now contains two cameras, an accelerometer, a magnetometer, a gyroscope, a proximity sensor, a light sensor, and two or more microphones. These sensors capture a huge amount of data that can inform and entertain consumers. At the same time, the data also capture the reality surrounding the users. Applications running on the phone can process the data and mine it for information that helps them adjust their configurations automatically to better match where the users are and what they are doing.
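To illustrate the kind of mining we mean, here is a rule-based sketch that fuses three of those streams, light level, proximity, and recent accelerometer variance, into a coarse context label. The thresholds and labels are assumptions chosen for the example; a deployed system would learn them from data.

```python
from statistics import pvariance

def infer_context(lux: float, near_face: bool, accel_window: list) -> str:
    """Fuse light, proximity, and motion into a coarse context label.

    All thresholds are illustrative; a real system would learn them.
    """
    motion = pvariance(accel_window)  # variance of recent |accel| samples
    if near_face:
        return "on a call"
    if lux < 5 and motion < 0.1:
        return "in pocket or bag, still"
    if motion > 4.0:
        return "walking or in a vehicle"
    return "in hand, idle"

# Example: dark, far from the face, barely moving -> stowed away.
print(infer_context(lux=2.0, near_face=False,
                    accel_window=[9.80, 9.81, 9.79, 9.82]))
```

A learned classifier would replace the hand-written rules, but the inputs would be the same sensor streams the phone already carries.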
Concerns for privacy notwithstanding, consumers do look forward to smart phones that can become truly smart, mind-reading assistants. The first step toward that goal is a smart phone that can automatically infer user context. Kenneth Noland, the American abstract painter, said it well: “context is the key – from that comes the understanding of everything.”