Adaptive Offloading for Pervasive Computing
Amin Saremi
7/6/2005
Introduction
- Pervasive computing challenge: run complex applications on resource-constrained mobile devices such as PDAs.
- Existing solutions rewrite applications according to the resource capacity of each mobile device, using application-based or system-based adaptations.
- This work's approach: adaptive offloading.
Decision-Making Problems for Adaptive Offloading
- The offloading inference engine should trigger offloading at the right time and offload the right program objects, so as to achieve low offloading overhead and efficient program execution.
- Two problems: adaptive offloading triggering and efficient application partitioning.
Solution Overview
Our assumptions:
- the application is written in an object-oriented language
- the user's environment contains powerful surrogates and plentiful wireless bandwidth
Offloading Inference Engine
- Requires no prior knowledge about an application's execution pattern or the runtime environment's resource status.
- Employs the Fuzzy Control model as the basis for the offloading triggering inference module.
- Selects an effective application partitioning from the many possible partition plans.
- Resources considered for triggering: memory constraint or CPU speed.
Distributed Offloading Platform
- Application execution monitoring builds an application execution graph.
- Each graph node represents a Java class, annotated with memory size, AccessFreq, Location, and IsNative.
- Each graph edge represents the interactions between the objects of two classes, annotated with InteractionFreq and BandwidthRequirement.
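As a rough sketch of how such an execution graph might be represented (the class and field names below are illustrative, derived from the attributes listed above, and not taken from the authors' implementation):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative representation of the application execution graph described above.
// ClassNode/ClassEdge and their field names are assumptions, not the authors' code.
class ClassNode {
    String className;              // the monitored Java class
    long memSize;                  // memory occupied by the class's objects, in bytes
    long accessFreq;               // AccessFreq: how often the class is accessed
    boolean onSurrogate;           // Location: false = mobile device, true = surrogate
    boolean isNative;              // IsNative: uses native code, so it must stay on the device
    List<ClassEdge> edges = new ArrayList<>();
}

class ClassEdge {
    ClassNode from, to;            // the two interacting classes
    long interactionFreq;          // InteractionFreq: number of cross-class invocations/accesses
    long bandwidthRequirement;     // bytes exchanged between objects of the two classes
}
```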
Resource Monitoring and Candidate Partition Plan Generation
- Resource monitoring covers the mobile device, the surrogate, and the wireless network: available memory in the Java heap, wireless bandwidth, and delay.
- Candidate partition plan generation produces the possible partition plans for the inference engine to choose among.
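A minimal device-side monitoring sketch, assuming the Java heap is the monitored memory resource; bandwidth and delay probing are platform-specific and only stubbed here:

```java
// Minimal sketch of device-side resource monitoring. Heap figures use the standard
// java.lang.Runtime API; the bandwidth estimate is a stub that the offloading
// platform would have to update from its own measurements (an assumption here).
public class ResourceMonitor {

    private volatile double lastMeasuredKbps = 0.0;

    /** Memory still available to the Java heap, in bytes. */
    public long availableHeapMemory() {
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory();   // heap currently in use
        return rt.maxMemory() - used;                     // headroom up to the heap limit
    }

    /** Latest wireless bandwidth estimate (e.g., derived from recent RPC transfers). */
    public double availableBandwidthKbps() {
        return lastMeasuredKbps;
    }

    public void updateBandwidth(double kbps) {
        lastMeasuredKbps = kbps;
    }
}
```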
The platform also provides surrogate discovery and a transparent RPC platform.
Adaptive Offloading Inference Engine
- Overhead of offloading: transferring objects between the mobile device and the surrogate, and performing remote data accesses and function invocations over a wireless network.
- Offloading triggering inference examines the current resource usage and the available resources, decides whether offloading should be triggered, and decides what level of resource utilization to target.
A simple threshold-based approach: "if the current amount of free memory on the mobile device is less than 20% of its total memory, then trigger offloading and offload enough program objects to free up at least 40% of the mobile device's memory."
The Fuzzy Control model instead uses linguistic decision-making rules provided by system or application developers, membership functions, and a generic fuzzy inference engine based on fuzzy logic theory.
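The quoted threshold rule is easy to encode directly; the 20% and 40% figures come from the rule above, while the surrounding interface is a hypothetical sketch:

```java
// Direct encoding of the threshold rule quoted above. The 20%/40% values come from
// the rule itself; the Offloader interface is a hypothetical placeholder.
public class ThresholdTrigger {

    public void check(long freeMem, long totalMem, Offloader offloader) {
        if (freeMem < 0.2 * totalMem) {
            // Target: at least 40% of total memory free after offloading.
            long bytesToFree = (long) (0.4 * totalMem) - freeMem;
            offloader.offloadAtLeast(bytesToFree);
        }
    }

    /** Hypothetical hook into the partitioning/offloading machinery. */
    public interface Offloader {
        void offloadAtLeast(long bytesToFree);
    }
}
```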
Offloading Memory Size
Offloading rules:
- if (AvailMem is low) and (AvailBW is high) then NewMemSize := low;
- if (AvailMem is low) and (AvailBW is moderate) then NewMemSize := average;
- if (AvailMem is low) and (AvailBW is low) then NewMemSize := high;
If any of these rules is matched by the current system conditions, the offloading inference engine triggers offloading.
Offloading memory size = current memory consumption - new target memory utilization (NewMemSize).
Membership functions define the mappings between numerical and linguistic values for each linguistic variable.
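Putting the rules and the mappings together, a compact sketch of the fuzzy trigger might look as follows; the membership functions, the representative output values, and the weighted-average defuzzification are simplifying assumptions, since the slides only give the rules and state that the mappings exist:

```java
// Sketch of the fuzzy offloading trigger. The three rules are the ones listed above;
// everything else (membership shapes, output values, defuzzification) is an assumption
// standing in for the developer-supplied mappings and the generic fuzzy inference engine.
public class FuzzyTrigger {

    // Assumed membership functions: map numeric readings (ratios in [0, 1]) to degrees in [0, 1].
    static double memLow(double freeMemRatio)  { return clamp((0.3 - freeMemRatio) / 0.3); }
    static double bwLow(double bwRatio)        { return clamp((0.3 - bwRatio) / 0.3); }
    static double bwModerate(double bwRatio)   { return clamp(1.0 - Math.abs(bwRatio - 0.5) / 0.25); }
    static double bwHigh(double bwRatio)       { return clamp((bwRatio - 0.7) / 0.3); }

    // Assumed representative values for the linguistic outputs low/average/high of NewMemSize,
    // expressed as a fraction of the device's total memory.
    static final double OUT_LOW = 0.3, OUT_AVG = 0.5, OUT_HIGH = 0.7;

    /** Returns the new target memory utilization ratio, or -1 if no rule fires (no offloading). */
    public static double newMemSize(double freeMemRatio, double bwRatio) {
        // Rule strength = min of the antecedent memberships (fuzzy AND).
        double r1 = Math.min(memLow(freeMemRatio), bwHigh(bwRatio));      // -> NewMemSize low
        double r2 = Math.min(memLow(freeMemRatio), bwModerate(bwRatio));  // -> NewMemSize average
        double r3 = Math.min(memLow(freeMemRatio), bwLow(bwRatio));       // -> NewMemSize high

        double sum = r1 + r2 + r3;
        if (sum == 0) return -1;   // no rule matched: do not trigger offloading
        // Weighted-average defuzzification of the three rule outputs.
        return (r1 * OUT_LOW + r2 * OUT_AVG + r3 * OUT_HIGH) / sum;
    }

    static double clamp(double v) { return Math.max(0.0, Math.min(1.0, v)); }
}
```

Following the slide above, the offloading memory size would then be the current memory consumption minus newMemSize(...) multiplied by the device's total memory.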
Application Partition Selection
- Considers the target memory utilization on the mobile device and multiple offloading requirements: minimizing wireless bandwidth overhead, minimizing average response time, and minimizing total execution time.
- For each neighbor node Vk of Vi: B_{i,k} denotes the total amount of data traffic transferred between Vi and Vk, F_{i,k} the total number of interactions between them, and MS_k the memory size of Vk.
For two cost metrics C_k and C_l: C_k >= C_l if and only if the corresponding comparison of (B_{i,k}, F_{i,k}, MS_k) holds (one possible ordering is sketched below).
Splitting Large Classes
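Returning to the cost-metric comparison above: the slides leave the exact condition implicit, so the comparator below is only one plausible reading, a lexicographic order preferring neighbors with more shared traffic, then more interactions, then smaller memory size. It is an assumption for illustration, not the paper's definition:

```java
import java.util.Comparator;

// Hypothetical comparator for the cost metrics on the Application Partition Selection slide.
// The lexicographic ordering used here is an assumed reading, not the paper's exact rule.
class CostMetric {
    long b;    // B_{i,k}: data traffic between Vi and neighbor Vk (bytes)
    long f;    // F_{i,k}: number of interactions between Vi and Vk
    long ms;   // MS_k: memory size of Vk (bytes)

    CostMetric(long b, long f, long ms) { this.b = b; this.f = f; this.ms = ms; }

    /** Orders candidates so that the preferred neighbor of Vi comes first. */
    static final Comparator<CostMetric> PREFERENCE =
            Comparator.<CostMetric>comparingLong(c -> c.b).reversed()                           // more traffic
                      .thenComparing(Comparator.<CostMetric>comparingLong(c -> c.f).reversed()) // more interactions
                      .thenComparingLong(c -> c.ms);                                            // smaller memory size
}
```

Sorting the neighbors of Vi with PREFERENCE would then rank the candidates that the partition-selection step considers first.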
Trace-Driven Simulation Experiments
- Algorithms compared: least recently used (LRU), Split Class, and Fuzzy Trigger (our approach).
References
1. Xiaohui Gu, Alan Messer, Ira Greenberg, Dejan Milojicic, and Klara Nahrstedt, "Adaptive Offloading for Pervasive Computing," IEEE Pervasive Computing, 2004.
2. X. Gu, K. Nahrstedt, A. Messer, I. Greenberg, and D. Milojicic, "Adaptive Offloading Inference for Delivering Applications in Pervasive Computing Environments," Proc. IEEE International Conference on Pervasive Computing and Communications (PerCom 2003), Dallas-Fort Worth, Texas, March 2003.