Yearly Archives: 2011

Making Sense of Noises

As the saying goes, “garbage in, garbage out.” Although sensor fusion can mitigate many aspects of the sensor fragmentation found across smartphone platforms, a minimum level of sensor performance is required, commensurate with the use cases of interest. The two classes of use cases normally considered for inertial sensors on handheld devices are user interaction and pedestrian navigation. The former covers the normal user interface for phone, email, and browser functions, as well as motion-controlled and augmented reality games. The latter refers to using inertial sensors to augment other location services, such as Wi-Fi and cell tower trilateration, to determine indoor positions. The bandwidth of user movement ranges from near DC up to frequencies reaching 15Hz with … Continue reading
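The trilateration mentioned above can be sketched in a few lines. Assuming three anchors (for instance, Wi-Fi access points or cell towers) at known 2D positions with measured range estimates, subtracting the circle equation of the first anchor from the other two linearizes the problem into a 2x2 system. This is a simplified illustration with exact ranges; a real deployment would use least squares over many noisy measurements.

```python
import math

def trilaterate(a1, a2, a3):
    """Solve for (x, y) from three anchors, each given as (x_i, y_i, d_i),
    where d_i is the measured distance to the unknown position.

    Subtracting the circle equation of anchor 1 from anchors 2 and 3
    yields a linear 2x2 system, solved here with Cramer's rule.
    """
    (x1, y1, d1), (x2, y2, d2), (x3, y3, d3) = a1, a2, a3
    A = 2 * (x2 - x1)
    B = 2 * (y2 - y1)
    C = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    D = 2 * (x3 - x1)
    E = 2 * (y3 - y1)
    F = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = A * E - B * D
    return (C * E - F * B) / det, (A * F - D * C) / det

# Device at (3, 4); anchors at the origin, (10, 0), and (0, 10).
x, y = trilaterate((0, 0, 5.0),
                   (10, 0, math.sqrt(65)),
                   (0, 10, math.sqrt(45)))
```

With perfect ranges the solver recovers the position exactly; with noisy ranges the same linearization feeds a least-squares fit instead.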

Sensing Subsystem: Sensor Hubs, Smart Sensors and Application Processors

Modern mobile devices have up to 16 different ways to sense their environments. Count them: 4 motion and environmental sensors (accelerometer, magnetometer, gyroscope, barometer), 3 microphones, 2 cameras, 1 light sensor, 1 proximity sensor, 1 touch sensor, and 4 radios (GPS, WiFi, Bluetooth, NFC) that can be used to infer position. The number of featured sensors is continuously increasing, and the underlying architecture is still evolving. The simplest sensor subsystems connect sensors directly to the application processor. This arrangement has the benefit of being the lowest cost, but it requires designers to have tight control over the system software architecture. We have previously shown that sampling a sensor at 100 Hz from the Java layer of Android results in so much time … Continue reading

Getting More Reliable Sampling in Android using Native Methods

There are three methods to access sensors in Android: (1) a Java Activity using a SensorEventListener, the standard method to access sensors; (2) a NativeActivity using an ASensorEventQueue, which bypasses the Java overhead but requires the activity to be registered in the manifest and the use of android_native_app_glue; and (3) an ALooper with an ASensorEventQueue, a little-known method to add sensor sampling to any thread at the native level. Java sensor sampling is the easiest to implement but has large sampling-timestamp uncertainties, as covered in a previous blog entry. It can also consume over 50% of a 1GHz processor to sample 9 axes of sensors at 100Hz, which becomes prohibitive for heavy sensor usage. In contrast, native sensor sampling on the same device takes less than 10% CPU … Continue reading
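The timestamp uncertainty mentioned above is easy to quantify: collect event timestamps from each sampling path and compare interval statistics. A minimal sketch (the 100Hz stream below is synthetic, not a measurement from any particular device):

```python
import statistics

def jitter_stats(timestamps_ns):
    """Return (mean_interval_ms, jitter_ms) for a list of event timestamps.

    Jitter is reported as the standard deviation of successive intervals,
    a simple way to compare Java-layer vs. native sampling regularity.
    """
    intervals = [(b - a) / 1e6 for a, b in zip(timestamps_ns, timestamps_ns[1:])]
    return statistics.mean(intervals), statistics.pstdev(intervals)

# Ideal 100Hz stream: events exactly 10 ms apart -> zero jitter.
ideal = [i * 10_000_000 for i in range(100)]
mean_ms, jitter_ms = jitter_stats(ideal)
```

Running the same statistic over timestamps captured from a SensorEventListener versus a native ASensorEventQueue makes the difference in scheduling regularity directly visible.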

System Architecture for Sensors Needs Better Standards

In an earlier article, we discussed how sensors and sensor-subsystem architecture can be a major source of fragmentation that will continue to frustrate app developers for smartphones and mobile devices. As the industry embarks on creating a new class of situation-aware mobile devices, it is key to establish and improve sensor system standards. For example, the Khronos Group, through its StreamInput working group, has identified “system-wide sensor synchronization for advanced multi-sensor applications” as an area in which standards are lacking. Indeed, for an application to, say, use inertial sensors to track camera angles, the accelerometer, magnetometer, and image sensors must share the same timing reference. Today, each sensor runs on its own free-running clock … Continue reading
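One common way to put two free-running clocks on a shared timing reference is a linear mapping (constant offset plus skew) estimated from timestamp correspondences. A hypothetical two-point sketch, assuming the same event can be observed on both clocks:

```python
def make_clock_mapper(p0, p1):
    """Build a function mapping a sensor's local clock to a reference clock.

    p0 and p1 are (local_time, reference_time) correspondence pairs, e.g.
    captured when one event is visible on both clocks. The linear fit
    captures both the constant offset and the relative skew.
    """
    (l0, r0), (l1, r1) = p0, p1
    skew = (r1 - r0) / (l1 - l0)
    return lambda t_local: r0 + skew * (t_local - l0)

# Local clock starts 100 units behind the reference and runs 0.2% slow.
to_ref = make_clock_mapper((0, 100), (1000, 1098))
converted = to_ref(500)  # a mid-span local timestamp on the reference clock
```

Production implementations refine the fit continuously (and filter outliers), but the offset-plus-skew model is the core of putting multiple sensors on one timebase.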

Which Sensors in Android Get Direct Input? What Are Virtual Sensors?

We receive many recurring questions from Android developers; this series of articles aims to clarify them. Developers often ask about the list of sensor types in the Android documentation (see below). The list can be confusing because it includes both physical sensors and sensor types whose values are derived from physical sensors; the latter are sometimes called virtual sensors. Physical sensors include the accelerometer, gyroscope, light sensor, magnetic field sensor (often called a magnetometer), pressure sensor, proximity sensor, and temperature sensor. The values from these sensors are provided by hardware components directly measuring changes in a physical property of their environment. The quality of data from these sensor types depends fundamentally on the accuracy, resolution, inherent … Continue reading
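A simple example of a virtual sensor is device orientation derived from the accelerometer: when the device is static, the direction of gravity in the sensor frame yields pitch and roll. The sketch below uses one common tilt convention with the Android axis layout (x right, y up, z out of the screen); it is an illustration, not the framework's actual fusion code.

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Derive pitch and roll (degrees) from a static accelerometer reading.

    With the device at rest the accelerometer measures only gravity, so
    the tilt angles can be recovered from the gravity vector's direction.
    Sign conventions vary between frameworks; this is one common choice.
    """
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat, screen up: gravity is entirely along +z.
pitch, roll = pitch_roll_from_accel(0.0, 0.0, 9.81)
```

This also shows why virtual-sensor quality is bounded by the physical sensors underneath: any accelerometer noise or bias propagates directly into the derived angles.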

Tracking Position Indoors: Moving from Hype to Reality

While dead reckoning solutions have been deployed for many first responder applications, allowing pedestrians to find their location indoors remains the elusive Holy Grail of location-based mobile services. However, progress on many fronts suggests that a solution may be at hand in the near future. Pedestrian maps are becoming more useful. Not long ago, mobile mapping applications like Google Maps in pedestrian mode often gave the same directions as in vehicle mode, except that they would ignore one-way traffic restrictions. Now, pedestrian-mode directions will take the user to overpasses and underpasses to cross a street. Of course, indoor maps are still emerging, and a standard way to handle maps for multi-story buildings is still lacking. However, proprietary and open source … Continue reading
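At its core, pedestrian dead reckoning is a simple position update: each detected step advances the position by an estimated stride length along the current heading. A minimal sketch (the fixed stride length and pre-computed headings are assumptions; a real system estimates both from the inertial sensors):

```python
import math

def pdr_track(start, steps):
    """Accumulate a 2D track from (heading_deg, stride_m) step events.

    Heading is measured clockwise from north, so east is 90 degrees;
    x grows to the east and y grows to the north.
    """
    x, y = start
    track = [(x, y)]
    for heading_deg, stride_m in steps:
        h = math.radians(heading_deg)
        x += stride_m * math.sin(h)
        y += stride_m * math.cos(h)
        track.append((x, y))
    return track

# Two steps east, then two steps north, 0.7 m each.
track = pdr_track((0.0, 0.0), [(90, 0.7), (90, 0.7), (0, 0.7), (0, 0.7)])
```

The hard problems, of course, are upstream of this update: detecting steps reliably, estimating stride length per user, and keeping the heading honest despite magnetic disturbances.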

Using Sensors to Understand User Contexts

Market analysts now project that five billion MEMS sensors will ship in 2016 to support applications like navigation, dead reckoning, image stabilization, and augmented reality in smart phones, e-readers, tablets, and gaming platforms. Although these applications are all extremely useful, we think they represent only a fraction of the functions sensors will perform. After all, most consumers don’t need directions, take pictures, or play games for more than a few hours a day. But sensors, and intelligent algorithms, will be working all the time to help applications understand user contexts. Today, smart phones and tablets use sensors to understand user context in a few primitive ways. Turn the tablet from portrait to landscape orientation and the content of the display … Continue reading

Understanding Smart Phone Sensor Performance: Magnetometer

One of the most common questions we hear from mobile applications developers is, “how good are the sensors on my phone?” This article is part of a series that provides a framework to understand sensor performance. This series has previously touched upon the importance of system architecture and intelligent algorithms in providing optimal sensor performance in a smartphone or a tablet. To complete the discussion of the sensor system, platform designers also need to select good sensor components. This article uses the magnetometer to highlight the impact of these three factors. The magnetometer is commonly found on mobile devices such as smart phones and tablets, but it is one of the most difficult sensors to interpret. It is commonly called … Continue reading
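One reason the magnetometer is so difficult to interpret is hard-iron distortion: a constant magnetic offset from ferrous components near the sensor shifts every reading. A common first-order calibration estimates that offset as the midpoint of the extremes observed while the device is rotated through all orientations. A sketch with made-up field values:

```python
import math

def hard_iron_offset(samples):
    """Estimate the per-axis hard-iron offset from rotation samples.

    For each axis the offset is taken as the midpoint of the minimum and
    maximum readings -- a crude but common first pass before more
    sophisticated ellipsoid fitting.
    """
    axes = list(zip(*samples))
    return tuple((min(a) + max(a)) / 2 for a in axes)

# Synthetic rotation: a 50 uT field circle shifted by a (30, -10, 45) offset.
samples = [(30 + 50 * math.cos(math.radians(t)),
            -10 + 50 * math.sin(math.radians(t)),
            45.0)
           for t in range(0, 360, 10)]
offset = hard_iron_offset(samples)
```

Subtracting the recovered offset re-centers the readings on the true ambient field; soft-iron (scale and cross-axis) errors require the fuller ellipsoid fit.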

Beyond Sensors: Algorithms and Heuristics

Since the debut of HAL 9000 in 1968, there have been countless fictional computers in the movies that achieved sentience. An important component of being sentient is the ability to sense one’s surroundings, and in that regard, electronic devices have seen significant progress. Take the modern smart phone: it can listen through its microphone(s) and see through its camera(s). It can respond to motion, react to ambient light and temperature, and reply to a human’s touch. Some even have the super-human abilities to hear ultrasound, sense magnetic fields, measure atmospheric pressure, and translate one language to another. Yet, no one but a Hollywood writer would expect us to interact with our phone or computer as if it were an intelligent … Continue reading

Smartphone Performance Fundamentals: Sensor Sampling

One of the most common questions we hear from mobile applications developers is, “how good are the sensors on my phone?” This article is part of a series that provides a framework to understand sensor performance. In the last blog article, we discussed methods to interpret the physical state of a mobile device based on sensor measurements. Most algorithms fundamentally rely on regular, simultaneous sampling of the sensor data. For example, navigation-grade inertial measurement units often sample all 10 axes at 100Hz. This creates a real-time data-flow requirement for sampled sensor data. Smartphone system architectures that do not sufficiently account for sensor data flow can degrade performance. Common problems are: Sensor interrupts are handled by the application … Continue reading
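The scale of this data-flow requirement is easy to quantify, and a bounded ring buffer is the usual way to decouple an interrupt-rate producer from a slower consumer. A sketch (the buffer depth and drop-oldest policy are illustrative choices, not a description of any particular phone's implementation):

```python
from collections import deque

# 10 axes sampled at 100 Hz is a modest but strictly real-time stream.
AXES, RATE_HZ = 10, 100
samples_per_second = AXES * RATE_HZ  # 1000 scalar samples every second

class SensorRingBuffer:
    """Fixed-depth buffer between the sampling path and the algorithm.

    When the consumer falls behind, the oldest samples are dropped so
    the producer (e.g. an interrupt handler) never blocks.
    """
    def __init__(self, depth):
        self._buf = deque(maxlen=depth)

    def push(self, sample):
        self._buf.append(sample)  # silently evicts the oldest when full

    def drain(self):
        out = list(self._buf)
        self._buf.clear()
        return out

rb = SensorRingBuffer(depth=4)
for i in range(6):       # producer outruns the consumer by 2 samples
    rb.push(i)
latest = rb.drain()      # the two oldest samples were dropped
```

Dropping oldest-first keeps latency bounded for real-time consumers; an architecture that instead blocks or drops silently at the driver level is exactly the kind of problem the article goes on to list.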