
1 © 2006, Carla Ellis The Experimental Lifecycle: vague idea → "groping around" experiences → hypothesis → model → initial observations → experiment → data, analysis, interpretation → results & final presentation. Step 1: Understand the problem, frame the questions, articulate the goals. A problem well-stated is half-solved. Ask why, not just what.

2 © 2006, Carla Ellis What can go wrong at this stage? Never understanding the problem well enough to crisply articulate the goals / questions / hypothesis. Getting invested in some solution before making sure a real problem exists. Getting invested in any desired result. Not being unbiased enough to follow proper methodology. –Any biases should be working against yourself. Fishing expeditions (groping around forever). Having no goals but building the apparatus first. –A Swiss Army knife of simulators?

3 © 2006, Carla Ellis Strong Inference J. R. Platt (Science, 1964) Progress in science advances by excluding alternative hypotheses. Experiments should be designed to disprove a hypothesis. –A hypothesis that is not subject to being falsified doesn't lead anywhere meaningful. –Any conclusion that is not an exclusion is insecure.

4 © 2006, Carla Ellis Steps 1. Devise alternative hypotheses. 2. Devise experiments with alternative outcomes that will exclude hypotheses. 3. Carry out the experiment to get a clean result. 4. Repeat with subhypotheses.

5 © 2006, Carla Ellis Steps 0. Identify the problem and the observed phenomenon. 1. Devise alternative hypotheses. 2. Devise experiments with alternative outcomes that will exclude hypotheses. 3. Carry out the experiment to get a clean result. 4. Repeat with subhypotheses.

6 © 2006, Carla Ellis Steps 0. Identify the problem and the observed phenomenon. 1. Devise alternative hypotheses. 2. Devise experiments with alternative outcomes that will exclude hypotheses. 3. Carry out the experiment to get a clean result. 4. Repeat with subhypotheses. The intellectual challenge is to do this efficiently.

7 © 2006, Carla Ellis Logical Tree Our conclusion X might be invalid if alternative hypothesis 1, alternative hypothesis 2, …, or alternative hypothesis n holds. We design experiments to eliminate alternatives, then proceed along the branches not eliminated. The tree: Problem branches to Alt 1 … Alt n; Alt 1 branches to Alt 1a and Alt 1b.
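The pruning the slide describes can be sketched in a few lines of code. This is a minimal sketch using the slide's placeholder names (Alt 1, Alt 1a, …); the eliminate predicate stands in for real experiments and is made up for illustration.

```python
# Strong inference as tree pruning: each node is an alternative
# hypothesis; an experiment either excludes it or leaves it standing,
# and we descend only along surviving branches.

class Hypothesis:
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

def surviving(node, eliminated):
    """Names of hypotheses that remain after experiments exclude some
    branches; an excluded branch is never descended into."""
    if eliminated(node.name):
        return []                      # branch excluded by an experiment
    names = [node.name]
    for child in node.children:
        names += surviving(child, eliminated)
    return names

tree = Hypothesis("Problem", [
    Hypothesis("Alt 1", [Hypothesis("Alt 1a"), Hypothesis("Alt 1b")]),
    Hypothesis("Alt n"),
])

# Suppose experiments exclude "Alt 1b" and "Alt n":
print(surviving(tree, lambda name: name in {"Alt 1b", "Alt n"}))
# ['Problem', 'Alt 1', 'Alt 1a']
```

The efficiency challenge from the previous slide maps to choosing experiments that cut off the largest subtrees first.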

8 © 2006, Carla Ellis Multiple Hypotheses One can become emotionally "attached" to a single hypothesis. –Temptation to demonstrate it is right, to make the facts fit the theory. Multiple working hypotheses turn research into a competition among ideas rather than among personal agendas. –This gets at the issue of bias.

9 © 2006, Carla Ellis “Support Activities” in Science Surveys and taxonomies. Experimental infrastructure development. Measurements and tables (e.g., file system usage studies). Theoretical/abstract models. All useful, provided they contribute to the chain of discovery, but not as ends in themselves.

10 © 2006, Carla Ellis The Question Apply this to one's own thinking (but it is also useful in someone else's talk): What experiment could disprove your hypothesis? Or: what hypothesis does your experiment disprove?

11 © 2006, Carla Ellis Applying Strong Inference to Computer Systems Research This has not been our culture. –“Mine is better than theirs,” with experiments that show this affirmatively (not honest attempts to show otherwise). –Non-hypotheses: statements that really can't be shown to be false, e.g., “This system does what it was designed to do” (true by definition). –Negative results are a hard sell to publish. The issue is scientific effectiveness.

12 © 2006, Carla Ellis A Good Example Wolman et al., “On the scale and performance of cooperative web proxy caching,” SOSP '99. Question: Should multiple proxies cooperate in order to increase client populations, improve hit ratios, and reduce latency?

13 © 2006, Carla Ellis Logical tree Root: cooperative web caching works. Branches: increase hit ratio (ideal case); decrease object latency (ideal case); …; increase hit ratio (real case).

14 © 2006, Carla Ellis Experiments Web traces at UW and Microsoft. Simulation: –Infinite cache size (no capacity misses). –Single proxy (sees all information, no overhead). –Two cases: ideal caching (all documents cached regardless of cacheability) and respecting cacheability. Together these give an upper bound on performance.
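The idealized simulation described above can be sketched in a few lines. This is a minimal sketch, not the paper's simulator: the trace format (object, cacheable flag) and the sample trace are made up for illustration.

```python
# Upper-bound simulation: one shared proxy with an infinite cache, so
# the only misses are first references (no capacity or sharing misses).

def ideal_hit_ratio(trace, respect_cacheability=False):
    """Hit ratio for an infinite, single shared cache. Every request
    after the first reference to an object hits, unless we respect
    cacheability and the object is uncacheable."""
    seen, hits = set(), 0
    for obj, cacheable in trace:
        if respect_cacheability and not cacheable:
            continue                   # never cached, always a miss
        if obj in seen:
            hits += 1
        seen.add(obj)
    return hits / len(trace)

# Toy trace: (object, cacheable) pairs.
trace = [("a", True), ("a", True), ("b", False), ("b", False), ("a", True)]
print(ideal_hit_ratio(trace))                              # 0.6
print(ideal_hit_ratio(trace, respect_cacheability=True))   # 0.4
```

If even this ideal case shows little benefit beyond a modest client population, the real (cooperative, finite-cache) case is excluded without ever building it, which is exactly the strong-inference move.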

15 Beyond the knee, no significant improvement; a single proxy is enough here.

16 Little impact on latency beyond small populations

17 © 2006, Carla Ellis Discussion What do you think computer scientists are doing wrong? Why doesn't this approach seem natural to us? How can we improve? Would systems research look significantly different if strong inference were applied regularly?

18 © 2006, Carla Ellis Discussion Next Time: Exercise in Strong Inference Pick one paper that seems like an important scientific advance and recast its experimental evaluation in terms of hypotheses and experiments to exclude (as a logical tree).

19 © 2006, Carla Ellis The Experimental Lifecycle: vague idea → "groping around" experiences → hypothesis → model → initial observations → experiment → data, analysis, interpretation → results & final presentation. Step 1: Understand the problem, frame the questions, articulate the goals. A problem well-stated is half-solved. Ask why, not just what.

20 © 2006, Carla Ellis Example: PACS '03 Vague idea: there should be "interesting" interactions between DVS (dynamic voltage scaling of the CPU) and memory, especially PADRAM (power-aware memory). –DVS: in soft real-time applications, slow down the CPU and reduce the supply voltage so as to just meet the deadlines. –PADRAM: when there are no memory accesses pending, transition the memory chip into a lower power state. –Intuition: DVS will affect the length of memory idle gaps.

21 © 2006, Carla Ellis Back of the Envelope What information do you need to know? XScale range: 50 MHz at 0.65 V and 15 mW, up to 1 GHz at 1.75 V and 2.2 W. Fully active memory: 300 mW; nap: 30 mW with 60 ns extra latency. E = P * t.
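The arithmetic this slide sets up can be carried through with E = P * t and the numbers given. The workload size (10^8 CPU-bound cycles) is an assumption for illustration; the point is that running slower stretches the time the memory must stay powered.

```python
# Back-of-the-envelope: E = P * t with the XScale endpoints above.

def energy_joules(power_watts, freq_hz, cycles):
    """E = P * t, with t = cycles / frequency."""
    return power_watts * (cycles / freq_hz)

cycles = 100e6                                  # assumed CPU-bound workload

cpu_slow = energy_joules(0.015, 50e6, cycles)   # 50 MHz, 0.65 V, 15 mW
cpu_fast = energy_joules(2.2, 1e9, cycles)      # 1 GHz, 1.75 V, 2.2 W

# Memory held fully active (300 mW) for the whole run:
mem_slow = 0.300 * (cycles / 50e6)
mem_fast = 0.300 * (cycles / 1e9)

print(f"50 MHz: CPU {cpu_slow:.3f} J + mem {mem_slow:.3f} J = {cpu_slow + mem_slow:.3f} J")
print(f"1 GHz:  CPU {cpu_fast:.3f} J + mem {mem_fast:.3f} J = {cpu_fast + mem_fast:.3f} J")
# The slow CPU wins on its own (0.030 J vs 0.220 J) but loses once
# always-active memory is charged for the longer runtime (0.630 J vs
# 0.250 J): hence the interaction, unless memory can nap in the gaps.
```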

22 © 2006, Carla Ellis Power Aware Memory RDRAM Power States: –Active (servicing read/write transactions): 300 mW. –Standby: 180 mW, +6 ns to return to active. –Nap: 30 mW, +60 ns. –Power Down: 3 mW, +6000 ns.

23 © 2006, Carla Ellis Determining Thresholds in Power State Transitions If the gap length were known in advance: if (gap > benefit boundary) threshold = 0 (transition immediately) else threshold = ∞ (stay active). But the gap is unknown, which is what makes threshold selection a real policy question.

24 © 2006, Carla Ellis Arriving at a Hypothesis The best speed/voltage choice for DVS to minimize energy consumption, when idle memory can power down, is the lowest speed that still meets the deadline (i.e., the same conclusion reached by most DVS studies that ignore memory).

25 © 2006, Carla Ellis Back of the Envelope (SEESAW) What information do we need to know? Sending: s W; receiving: r W; listening: i W; sleeping: z W.
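A sketch of the back-of-the-envelope the slide asks for: average radio power from the per-state powers and duty-cycle fractions, and the lifetime that implies. The slide leaves s, r, i, z symbolic; the numeric values, duty cycle, and battery capacity below are illustrative assumptions only.

```python
# Average radio power and node lifetime from per-state duty cycles.

def avg_power(powers, fractions):
    """Weighted average power; powers and fractions are keyed by state
    and the fractions must sum to 1."""
    assert abs(sum(fractions.values()) - 1.0) < 1e-9
    return sum(powers[s] * fractions[s] for s in powers)

# Assumed per-state powers in watts (the slide's s, r, i, z):
powers = {"send": 0.060, "recv": 0.045, "listen": 0.040, "sleep": 0.00009}
# Assumed duty cycle: mostly asleep, listening is the main overhead.
fractions = {"send": 0.01, "recv": 0.01, "listen": 0.08, "sleep": 0.90}

p = avg_power(powers, fractions)
battery_joules = 10_000                # assumed battery capacity
print(f"avg power {p * 1000:.2f} mW")  # ~4.33 mW
print(f"lifetime ~{battery_joules / p / 86400:.0f} days")
```

Even this crude model shows why the hypothesis on the next slide centers on listening and control overhead: at these duty cycles the listen term dominates the average.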

26 © 2006, Carla Ellis Hypothesis (SEESAW) An asymmetric MAC protocol can extend network lifetime by balancing energy consumption (battery depletion). –An asymmetric protocol does not waste energy in control overhead or in message loss and retransmission. –An asymmetric protocol can be tuned: automatically, by hand, or off-line algorithmically. –An asymmetric protocol has acceptable performance: message latency and message throughput. –There is opportunity in balancing.

27 © 2006, Carla Ellis Preliminary Experiences (FaceOff) A low-fidelity image that allows only face detection is adequate for the prototype.

28 © 2006, Carla Ellis Preliminary Data (FaceOff) Kernel Compile – feasibility for saving energy

29 © 2006, Carla Ellis Back of the Envelope (CAFE) What information do we need to know? Capture, storage, dissemination, faceoff.

30 © 2006, Carla Ellis Hypothesis (CAFÉ) Controlling the fidelity of sensor or context information is …

