At Google I/O 2016, Google announced the new Awareness API, which enables your apps to react intelligently to user context using snapshots and fences, with minimal impact on system resources. The Awareness API exposes seven different types of context, including location, weather, user activity, and nearby beacons, enabling your app to refine the user experience in ways that weren’t possible before. Your app can combine these context signals to make inferences about the user’s current situation, and use this information to provide customized experiences.
The Awareness API consists of two distinct APIs that your app can use to get context signals and determine the user’s current situation:
- Fence API lets your app react to the user’s current situation, and provides a notification when a combination of context conditions is met. For example, “tell me whenever the user is walking and their headphones are plugged in”. Once a fence is registered, the Fence API can send callbacks to your app even when it’s not running.
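As a sketch, registering the “walking with headphones” fence from the example above could look like the following (this assumes the `play-services-awareness` Gradle dependency and the activity-recognition permission; the fence key, `context`, and `pendingIntent` are placeholder names, and the `PendingIntent` would point at a `BroadcastReceiver` in your app):

```java
import com.google.android.gms.awareness.Awareness;
import com.google.android.gms.awareness.fence.AwarenessFence;
import com.google.android.gms.awareness.fence.DetectedActivityFence;
import com.google.android.gms.awareness.fence.FenceUpdateRequest;
import com.google.android.gms.awareness.fence.HeadphoneFence;
import com.google.android.gms.awareness.state.HeadphoneState;

// Individual context conditions.
AwarenessFence walkingFence =
        DetectedActivityFence.during(DetectedActivityFence.WALKING);
AwarenessFence headphoneFence =
        HeadphoneFence.during(HeadphoneState.PLUGGED_IN);

// Combine them: fire only when BOTH conditions hold.
AwarenessFence walkingWithHeadphones =
        AwarenessFence.and(walkingFence, headphoneFence);

// Register the fence; Play Services will fire pendingIntent
// when the combined condition changes, even if the app isn't running.
Awareness.getFenceClient(context).updateFences(
        new FenceUpdateRequest.Builder()
                .addFence("walkingWithHeadphonesKey",
                        walkingWithHeadphones, pendingIntent)
                .build());
```

The `BroadcastReceiver` behind `pendingIntent` would then extract a `FenceState` from the incoming intent to see whether the fence became true or false.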
- Snapshot API lets your app request information about the user’s current context. For example, “give me the user’s current location and the current weather conditions”.
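The location-and-weather example above could be sketched with the Snapshot API roughly as follows (assuming the same `play-services-awareness` dependency, the fine-location permission already granted, and a placeholder `context`; weather snapshots also require an API key configured in the app manifest):

```java
import android.location.Location;
import android.util.Log;
import com.google.android.gms.awareness.Awareness;
import com.google.android.gms.awareness.state.Weather;

// "Give me the user's current location..."
Awareness.getSnapshotClient(context).getLocation()
        .addOnSuccessListener(locationResponse -> {
            Location location = locationResponse.getLocation();
            Log.i("Snapshot", "Lat/Lng: "
                    + location.getLatitude() + ", " + location.getLongitude());
        });

// "...and the current weather conditions."
Awareness.getSnapshotClient(context).getWeather()
        .addOnSuccessListener(weatherResponse -> {
            Weather weather = weatherResponse.getWeather();
            Log.i("Snapshot", "Temperature: "
                    + weather.getTemperature(Weather.CELSIUS));
        });
```

Each snapshot call returns asynchronously via a `Task`, so the two requests above run independently; a failure listener could be added to each for permission or availability errors.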
The phone can tell applications where the user is, what they’re doing, and what’s around them — information that developers can use to build more assistive, context-aware applications that help users in their day-to-day lives.