Loading a Cache with Query Results Laura Haas, IBM Almaden Donald Kossmann, Univ. Passau Ioana Ursu, IBM Almaden
2 Background & Motivation
Applications invoke queries and methods:
–queries select relevant objects
–methods work with the relevant objects
Example: find hotels and reserve rooms
Other examples: CAX, SAP R/3, Web

foreach h in (select oid from hotels h where city = Edinburgh)
    h.requestRoom(3, Sep-6, Sep-12);
3 Background and Motivation
Traditional client-server systems:
–methods are executed by clients, with caching
–queries are executed by clients and servers
–query processing is independent of caching
Problems:
–data must be fetched twice
–objects are faulted in individually
The result is terrible performance in many environments.
4 Traditional System
[Figure: the application loop foreach h in (select oid from ...) h.reserveRoom(); running against the client cache and query processor, with data fetched from the server.]
5 Goal & Solution
Load the cache as a by-product of queries:
–copy relevant objects while executing the query
Cache operators do the copying.
Extend the query optimizer:
–which collections should be cached?
–when to copy?
Assumption: caching at the granularity of objects
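The Cache operator described above can be pictured as a pass-through iterator in a query pipeline. Below is a minimal sketch in Python, assuming a dict-based client cache and rows that carry whole objects; all names here are illustrative, not from the paper.

```python
# Hypothetical sketch of a Cache operator: a pass-through iterator
# that copies objects into the client cache as tuples stream by.
# The dict-based cache, the row layout, and all names are assumptions.

class CacheOp:
    """Yields input rows unchanged, caching the objects it sees."""

    def __init__(self, child, client_cache, attr):
        self.child = child          # input operator (any iterable of rows)
        self.cache = client_cache   # dict mapping oid -> object copy
        self.attr = attr            # row attribute holding the object

    def __iter__(self):
        for row in self.child:
            obj = row[self.attr]
            if obj["oid"] not in self.cache:
                # copy only objects that are not cached yet
                self.cache[obj["oid"]] = dict(obj)
            yield row               # tuples flow through unchanged
```

Running a query result through such an operator leaves the result untouched but fills the cache, so a later method call like h.requestRoom(...) need not fault the hotel object in again.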
6 [Figure: the query plan joins Hotels and Cities with a Cache operator at the server, while the client runs foreach h in (select oid from ...) h.reserveRooms();]
7 Tradeoffs
What to cache?
–the cost of a Cache operator must be smaller than the savings obtained by this kind of pre-caching
When to cache?
–late, so that only relevant objects are cached
–early, so that other operators are not affected
N.B.: Cache operators affect the cost of other (lower) operators in the plan
8 Early vs. Late Cache Operators: Copying Irrelevant Objects
[Figure: server-side plan joining Hotels and Cities with an early Cache operator below the Join, which copies objects that the Join later discards.]
9 Early vs. Late Cache Operators: Late Projections
[Figure: two plans over Hotels and Cities. Early Cache (below the Join): cheap Join. Late Cache (above the Join): expensive Join.]
10 Alternative Approaches
Determine candidate collections for caching, i.e., what to cache:
–carry out data flow analysis
–analyze the select clause of the query; cache if the oid is returned
Determine when to cache candidate objects:
–heuristics
–cost-based approach
11 Caching at the Top Heuristics
Policy:
–cache all candidate collections
–cache no irrelevant objects (i.e., late caching)
Algorithm:
–generate a query plan for the select * query
–place the Cache operator at the top of the plan
–push the Cache operator down through non-reductive operations
N.B.: simulates an "external" approach
12 Cache Operator Push-Down
A Cache operator may be pushed down through non-reductive operations.
[Figure: initial plan: Cache(h,c) above a Sort above the Join of Hotels and Cities. Push-down 1: the Sort moves above Cache(h,c). Push-down 2: Cache(h) sits above the Join, with Cache(c) directly on Cities.]
Push-down reduces the cost of non-reductive operations without causing irrelevant objects to be copied.
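The push-down rule above can be sketched as a small plan rewrite. This is an illustrative model, not the paper's implementation: plans are nested tuples, leaves are collection names, and the set of non-reductive operators is an assumption of the sketch.

```python
# Illustrative plan rewrite for Cache operator push-down.
# Plans are nested tuples (operator, *children); leaves are collection
# names. The NON_REDUCTIVE set is an assumption of this sketch.

NON_REDUCTIVE = {"Sort"}  # operators that neither add nor drop rows

def push_down(plan):
    """Push Cache operators below non-reductive operators."""
    if isinstance(plan, str):
        return plan  # base collection, nothing to rewrite
    op, *children = plan
    if (op == "Cache" and isinstance(children[0], tuple)
            and children[0][0] in NON_REDUCTIVE):
        inner_op, *inner_children = children[0]
        # swap: the non-reductive operator moves above the Cache
        return (inner_op,
                *[push_down(("Cache", c)) for c in inner_children])
    return (op, *[push_down(c) for c in children])
```

For the slide's initial plan, Cache over Sort over Join, the rewrite yields Sort over Cache over Join; the Cache stops descending at the Join, since a reductive operator may discard objects that would otherwise be copied needlessly.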
13 Caching at the Bottom Heuristics
Policy:
–cache all candidate collections
–increase the cost of other operations as little as possible (i.e., early caching)
Algorithm:
–extend the optimizer to produce a plan with Cache operators as low as possible (details in the paper)
–pull Cache operators up through the pipeline
Pull-up reduces the number of irrelevant objects that are cached without increasing the cost of pipelined operators.
14 Cost-Based Cache Operator Placement
Try to find the best possible plan:
–insert Cache operators only if they are beneficial
–find the best place for Cache operators in the plan
–join order and site selection depend on caching
Extend the query optimizer:
–enumerate all possible caching plans
–estimate the cost and benefit of Cache operators
–extend the pruning condition for dynamic programming
15 Enumerating All Caching Plans
[Figure: six plans for the Join of Hotels and Cities. With the Join at the server: Cache(h,c) above the Join; Cache(h) and Cache(c) on the inputs; Cache(h) alone; no Cache operator. With the Join at the client: Cache(h) or Cache(c).]
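A naive version of this enumeration can be sketched as follows. It simply crosses the join site with every subset of candidate collections; the real optimizer also varies where each Cache operator sits in the plan and prunes with dynamic programming, so this is only a model of how the search space grows.

```python
from itertools import combinations

# Naive enumeration of caching alternatives: cross the join site with
# every subset of candidate collections. Where each Cache operator
# sits inside the plan is ignored here; all names are illustrative.

def caching_plans(candidates, sites=("server", "client")):
    """Yield (join_site, frozenset_of_cached_collections) pairs."""
    for site in sites:
        for r in range(len(candidates) + 1):
            for subset in combinations(candidates, r):
                yield (site, frozenset(subset))
```

Even for the two candidates h and c this yields 2 * 2^2 = 8 alternatives per join, which is why the paper extends the dynamic-programming pruning condition to keep optimization tractable.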
16 Costing of Cache Operators
Overhead of Cache operators:
–cost to probe the hash table for every object
–cost to copy objects that are not yet cached
Benefit of Cache operators:
–savings: relevant objects are not refetched
–savings depend on the cost to fault in an object and the current state of the cache
Cost = Overhead - Benefit
–only Cache operators with Cost < 0 are useful
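The Cost = Overhead - Benefit rule can be made concrete with a toy calculation. The linear model and every parameter name below are assumptions for illustration; the paper's cost formulas are more detailed.

```python
# Toy version of Cost = Overhead - Benefit for one Cache operator.
# The linear model and all parameter names are assumptions for
# illustration, not the paper's exact cost formulas.

def cache_op_net_cost(n_objects, frac_uncached, probe_cost,
                      copy_cost, reuse_prob, fault_in_cost):
    """Negative result means the Cache operator pays off."""
    # overhead: probe the hash table for every object,
    # copy only the objects that are not cached yet
    overhead = (n_objects * probe_cost
                + n_objects * frac_uncached * copy_cost)
    # benefit: newly cached objects need not be faulted in later
    benefit = n_objects * frac_uncached * reuse_prob * fault_in_cost
    return overhead - benefit
```

With cheap probes and copies but an expensive client-server fault-in, the net cost goes strongly negative, which is exactly the situation where this kind of pre-caching wins; if the objects are never reused, the operator is pure overhead and should be dropped.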
17 Summary of Approaches
Heuristics:
–simple to implement
–not much additional optimization overhead
–poor plans in certain situations
Cost-based:
–very good plans
–huge search space; slows down the query optimizer
18 Performance Experiments
Test environment:
–Garlic heterogeneous database system
–UDB, Lotus Notes, and WWW servers
Benchmark:
–relational BUCKY benchmark database
–queries from simple selections to multi-way cross-source joins
–simple accessor methods
19 Application Run Time (secs)
[Chart: single-table query + accessor method]
20 Application Run Time (secs)
[Chart: three-way joins + accessor method]
21 Query Optimization Times (secs)
[Chart: varying the number of candidate collections]
22 Conclusions
Loading the cache with query results can result in huge wins:
–for search & work applications
–if client-server interaction is expensive
Use the cost-based approach for simple queries (four or fewer candidate collections).
Use heuristics for complex queries.
The Caching at the Bottom heuristic is always at least as good as the traditional, do-nothing approach.
23 Future Work
Explore the full range of possible approaches:
–e.g., cost-based Cache operator pull-up and push-down
Consider the tradeoff between optimization time and application run time (meta-optimization):
–invest in optimization time only if high gains in application run time can be expected
–consider the state of the cache; dynamic optimization