Getting More Reliable Sampling in Android using Native Methods

There are three methods to access sensors in Android:

  • Java Activity using a SensorEventListener: the standard method to access sensors.
  • NativeActivity using an ASensorEventQueue: bypasses the Java overhead, but requires an activity registered in the manifest and the use of android_native_app_glue.
  • ALooper using an ASensorEventQueue: a little-known method to add sensor sampling to any thread at the native level.

Java sensor sampling is the easiest to implement, but it has large sampling timestamp uncertainties, as covered in a previous blog entry.  It can also take over 50% CPU usage on a 1GHz processor to sample 9 axes of sensors at 100Hz, which becomes prohibitive for heavy sensor usage.

In contrast, native sensor sampling on the same device takes less than 10% CPU usage to sample 9 axes of sensors at 100Hz.  Furthermore, the sampling is much more regular, as shown in the figure: compared to Java-level access, virtually all the large deviations from nominal sampling are removed.  Native-level access also provides finer control over the sensor sampling rate.  Rather than specifying a device-dependent rate such as SENSOR_DELAY_GAME, it is possible to request a specific sampling rate, such as 100Hz.  (Of course, whether the device can provide such sampling still depends on the underlying hardware.)

The Java method for sensor acquisition is well known, and the NativeActivity method is provided as an example with the Android NDK.  Therefore, only the ALooper method will be discussed in some detail here.  The ALooper method allows a shared library to access sensors without the Java side having to reference them at all; this is how the Android version of the FreeMotion™ Library accesses sensors.

The steps to native-level access without a NativeActivity are:

  1. Identify the looper associated with the calling thread, or create one if it does not exist.  A looper is a message loop for a thread and will handle the sensor event callbacks.

    ALooper* looper = ALooper_forThread();
    if (looper == NULL)
        looper = ALooper_prepare(ALOOPER_PREPARE_ALLOW_NON_CALLBACKS);
     
  2. As in a NativeActivity, get an instance of the sensor manager and a handle to each sensor.

    sensorManager = ASensorManager_getInstance();
    accelerometerSensor = ASensorManager_getDefaultSensor(sensorManager, ASENSOR_TYPE_ACCELEROMETER);
     
  3. Create a sensor event queue from the sensor manager and register it with the looper.  This takes a callback function (e.g. get_sensor_events) which is called when an event occurs.

    sensorEventQueue = ASensorManager_createEventQueue(sensorManager, looper,
        LOOPER_ID_USER, get_sensor_events, sensor_data);
     
  4. Implement the callback function with the logic to use the sensor data.  It must have the following signature (a sketch appears after this list):

    typedef int (*ALooper_callbackFunc)(int fd, int events, void* data);
     
  5. Enable the sensor and register the sampling frequency on the sensorEventQueue.  The rate is specified as the period between events in microseconds (10,000 µs for 100Hz).

    /* Events are only delivered once the sensor is enabled. */
    ASensorEventQueue_enableSensor(sensorEventQueue, accelerometerSensor);
    ASensorEventQueue_setEventRate(sensorEventQueue, accelerometerSensor,
        (1000L/SAMP_PER_SEC)*1000);
     

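The steps above leave two pieces unstated: the body of get_sensor_events and the loop that drives the looper.  The sketch below is illustrative rather than from the steps themselves; it assumes sensorEventQueue is stored at file scope, and the running flag and sensor_thread_loop are hypothetical names added for the example.

    #include <android/looper.h>
    #include <android/sensor.h>

    /* File-scope queue handle from step 3 (an assumption; the snippets
       above do not show where it is declared). */
    static ASensorEventQueue* sensorEventQueue;
    static volatile int running = 1;   /* hypothetical shutdown flag */

    /* Matches ALooper_callbackFunc; 'data' receives the sensor_data
       pointer passed to ASensorManager_createEventQueue in step 3. */
    static int get_sensor_events(int fd, int events, void* data)
    {
        ASensorEvent event;
        /* Drain every event currently pending in the queue. */
        while (ASensorEventQueue_getEvents(sensorEventQueue, &event, 1) > 0) {
            if (event.type == ASENSOR_TYPE_ACCELEROMETER) {
                /* event.acceleration.x/.y/.z hold the sample;
                   event.timestamp is in nanoseconds. */
            }
        }
        return 1;   /* 1 keeps the callback registered, 0 unregisters it */
    }

    /* The looper only dispatches callbacks while it is polled, so the
       sampling thread needs a service loop: */
    void sensor_thread_loop(void)
    {
        while (running) {
            /* Block until the looper has work, e.g. a sensor sample. */
            ALooper_pollOnce(-1, NULL, NULL, NULL);
        }
    }

Because step 1 used ALOOPER_PREPARE_ALLOW_NON_CALLBACKS, it is equally valid to pass NULL instead of a callback in step 3 and drain the queue directly whenever ALooper_pollOnce returns LOOPER_ID_USER; the callback form simply keeps the event-handling logic in one place.
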
Regular sampling and low computing overhead are the foundations of a sophisticated sensor library.  Although the ALooper method is a much better way to access sensors, there is still room for improvement.  Sensors are built to provide regular sampling of data, so it is natural to use an interrupt-based method.  While it may appear that Android does this, the callback is performed by the Sensor Manager (step 3), which itself polls the sensor at the requested rate (step 5).  To get the best sampling performance, our FreeMotion™ library replaces the libsensors.so sensor library and works with the sensor drivers directly in embedded Linux.
