Title: Always-On Context Inference on Mobile Devices Exploiting Heterogeneous Processors
Description: Smartphones and body-worn sensors enable continuous inference of many kinds of user context, such as activity, location, psychosocial status, and environment. As more applications exploit such context information, we claim that it should be provided by middleware in order to achieve better software engineering and better management of smartphones' constrained resources. We propose a flow-based programming architecture, FlowEngine, for efficient and flexible computation of context inferences. FlowEngine is efficient because duplicate computations are shared among different context inferences. It is also flexible because new computations can be added dynamically at runtime, and off-loading computations to microcontrollers or cloud services is well supported. Using FlowEngine, applications can simply subscribe to relevant services to receive updates on context changes, or poll the system for the most recent updates. As many applications subscribe to similar contexts, the system optimizes data processing so that no duplicate computation occurs. In addition, as the phone's battery level changes, the system can gracefully degrade the quality of context inferences to meet users' phone lifetime goals.
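The computation-sharing idea behind the description can be sketched as a small dataflow graph: sensor data flows through named nodes, applications subscribe to inference nodes, and a node shared by several inferences is evaluated only once per sensor sample. This is a minimal illustration in Python, not FlowEngine's actual API; all class, node, and function names here are assumptions made for the example.

```python
# Sketch of flow-based context inference with shared computation.
# Assumption: FlowEngine's real interface differs; names are illustrative.

class Node:
    """One computation step in the dataflow graph."""
    def __init__(self, name, fn, inputs=()):
        self.name, self.fn, self.inputs = name, fn, list(inputs)
        self.subscribers = []  # callbacks of apps interested in this node

class FlowEngine:
    def __init__(self):
        self.nodes = {}
        self.eval_count = {}  # how many times each node actually ran

    def add_node(self, name, fn, inputs=()):
        # Reuse an existing node so identical computations are shared.
        if name not in self.nodes:
            self.nodes[name] = Node(name, fn, inputs)
            self.eval_count[name] = 0
        return self.nodes[name]

    def subscribe(self, name, callback):
        self.nodes[name].subscribers.append(callback)

    def push(self, source, value):
        # One sensor sample arrives; memoise per push so that a node
        # shared by several inferences is evaluated only once.
        cache = {source: value}

        def evaluate(name):
            if name not in cache:
                node = self.nodes[name]
                args = [evaluate(i) for i in node.inputs]
                cache[name] = node.fn(*args)
                self.eval_count[name] += 1
            return cache[name]

        for name, node in self.nodes.items():
            if node.subscribers:
                result = evaluate(name)
                for cb in node.subscribers:
                    cb(result)

# Example: two context inferences share one feature-extraction node.
engine = FlowEngine()
engine.add_node("magnitude", lambda a: sum(x * x for x in a) ** 0.5,
                inputs=("accel",))
engine.add_node("activity", lambda m: "moving" if m > 1.0 else "still",
                inputs=("magnitude",))
engine.add_node("fall", lambda m: m > 3.0, inputs=("magnitude",))

updates = {}
engine.subscribe("activity", lambda r: updates.__setitem__("activity", r))
engine.subscribe("fall", lambda r: updates.__setitem__("fall", r))

engine.push("accel", (0.1, 0.2, 0.2))  # one accelerometer sample
# "magnitude" ran once even though both inferences depend on it.
```

Dynamically adding a node at runtime is just another `add_node` call, and off-loading would amount to replacing a node's `fn` with a stub that delegates to a microcontroller or cloud service.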
Status: Inactive Project
Main Research Area: Pervasive Computing