Page 1 of 61: Class Discussion
"Analyzing the MAC-level Behavior of Wireless Networks in the Wild"
Discussion Guided by Jerry Sussman

Page 2 of 61: Critique Guidance
Critique Instructions:
– Critique the paper, not me!
– All students should read the paper before class
– Critique is due prior to the following week's class
– Discussion Leader MUST use slides to guide the discussion
– Critiques should be organized / structured per the website
– This is a 300-level course... there should be a lot of discussion.

Page 3 of 61: Critique Guidance
(10%) State the problem the paper is trying to solve.
(20%) State the main contribution of the paper: solving a new problem, proposing a new algorithm, or presenting a new evaluation (analysis). If a new problem, why was the problem important? Is the problem still important today? Will the problem be important tomorrow? If a new algorithm or new evaluation (analysis), what are the improvements over previous algorithms or evaluations? How did they come up with the new algorithm or evaluation?
(15%) Summarize the (at most) 3 key main ideas (each in 1 sentence).
(30%) Critique the main contribution
– Rate the significance of the paper on a scale of 5 (breakthrough), 4 (significant contribution), 3 (modest contribution), 2 (incremental contribution), 1 (no contribution or negative contribution). Explain your rating in a sentence or two.
– Rate how convincing the methodology is: how do the authors justify the solution approach or evaluation? Do the authors use arguments, analyses, experiments, simulations, or a combination of them? Do the claims and conclusions follow from the arguments, analyses, or experiments? Are the assumptions realistic (at the time of the research)? Are the assumptions still valid today? Are the experiments well designed? Are there different experiments that would be more convincing? Are there other alternatives the authors should have considered? (And, of course, is the paper free of methodological errors?)
– What is the most important limitation of the approach?
(15%) What lessons should researchers and builders take away from this work? What (if any) questions does this work leave open?
(10%) Propose your improvement on the same problem.
Note: the purpose of this template is to serve as a starting point, instead of a constraint. Use your judgment and creativity. Some advice available through the resource link of the class can be helpful.

Page 4 of 61: Agenda
Authors
Summary
Background
Wit
Theory Behind Wit
Implementation of Wit
Wit Evaluation
Inference versus Additional Monitors
Application in Live Environment
Conclusion

Page 5 of 61: Authors
Ratul Mahajan (Microsoft Research)
Maya Rodrig (University of Washington)
David Wetherall (University of Washington)
John Zahorjan (University of Washington)
Funding: NSF
Presented at SIGCOMM '06, September 11-15, 2006, Pisa, Italy

Page 6 of 61: Summary
First paper to document Wit
– Passive wireless analysis tool
– Analyzes MAC-level behavior of wireless networks
Paper assesses Wit's performance
– Based on real & simulated data
Authors tested Wit against a live wireless network

Page 7 of 61: Why Is Wit Needed?
???

Page 8 of 61: Why Is Wit Needed?
– Understand how live networks communicate in different situations:
  Highly loaded environments
  Low-load environments
  Interfering wireless LANs, etc.
– Critical to knowing how to improve the performance of wireless LANs.

Page 9 of 61: Background
Measurement-driven analysis of live networks
– Critical to understanding the live performance of networks
– Critical to improving performance
Measurement-driven refers to:
– Part measured / collected data
– Part 'generated' (inferred) data

Page 10 of 61: Background
Wireless measurement-driven analysis
– At the time of the paper's publication, lacking in:
  Software collection/analysis tools
  Performance data from wireless networks
– Reasons:
  Based on Simple Network Management Protocol (SNMP) logs from APs
  AP logs
  » Low fidelity (i.e., coarse logs) of the AP side
  » No data from the client view
  Packet traces from wired hosts next to the AP
  » Traces omit wireless retransmissions

Page 11 of 61: Background
Unrealistic solution
– Instrument the entire wireless network
  Proven successful in controlled environments
  Unrealistic and not a match for commercial deployments
Only realistic solution
– Obtain a trace via passive monitoring
  One or more nodes declared "monitors"
  Monitors placed in the vicinity of the wireless network
  Record attributes of all transmissions
– Trivial to deploy

Page 12 of 61: Background
Problems with passive monitoring
– Data / traces may be incomplete
  Packets dropped due to weak signal
  Packets dropped due to collisions
– Difficult to know which packets are missing from a monitor
– Monitor stations can't determine whether destinations properly received packets
  Important for determining reception probability

Page 13 of 61: Background
This paper is trying to:
– Find a way to assemble an accurate trace of the wireless environment for analysis
  Use data from multiple monitoring stations
  Determine missing packets
  Re-create missing packets
  Combine into a single trace file
– Determine network performance
  How often do clients retransmit their packets?
  Determine loss effects between two nodes
  Effect of increased load on the network

Page 14 of 61: Background
Authors attempt to solve the problem with Wit:
– Paper presents Wit, a tool for measurement-driven analysis.
– Wit has 3 modules which solve the key problems identified earlier

Page 15 of 61: Wit

Page 16 of 61: Why Is Wit Needed?
Quantify wireless network performance
Estimate the number of competing stations
Assist in diagnosing wireless network problems

Page 17 of 61: Wit Core Processing Steps
1. Merging procedure
2. Packet reconstruction
3. Determination of network performance

Page 18 of 61: Merging Procedure {1st Core Processing Step}
Combine incomplete traces from multiple, independent monitors
Provides a complete trace for follow-on steps
Based upon collected data
– Not inferred or reconstructed

Page 19 of 61: Packet Reconstruction {2nd Core Processing Step}
Reconstructs packets not captured by any monitor
– Strong inference engine
– Determines if a packet was received at the destination
– Again, provides a more complete trace for the follow-on step

Page 20 of 61: Determination of Network Performance {3rd Core Processing Step}
Wit calculates network performance
– Input: constructed trace
– Output:
  Typical simple network measurements
  Packet reception probabilities
  Estimate of the number of nodes contending for the medium
  – Not previously achieved, according to the authors

Page 21 of 61: Passive Monitoring Pipeline

Page 22 of 61: Wit Evaluation
After developing Wit, the authors were faced with the evaluation task
– Used a mix of real and simulated data
– Used Wit at the SIGCOMM 2004 conference
  Multi-monitor traces captured
  Uncovered MAC-layer characteristics of the environment
  – Network was dominated by periods of low contention during which the medium was poorly utilized, even though APs were waiting to transmit packets
  » Suggests the MAC is tuned for high traffic levels that are uncommon on real networks.
– Authors claim this can't be obtained by other methods

Page 23 of 61: Now for the Theory Behind Wit's Phases
{Implementation of the phases will follow...}

Page 24 of 61: 3 Core Phases
Merging of Traces
Inferring Missing Information
Deriving Measurements / Performance

Page 25 of 61: 3 Core Phases
Merging of Traces
Inferring Missing Information
Deriving Measurements / Performance

Page 26 of 61: Merging of Traces

Page 27 of 61: Merging of Traces
Input:
– A number of packet traces
– 1 trace per monitor
– Timestamps reflect each monitor's local packet-receive time

Page 28 of 61: Merging of Traces
Output:
– Merge into a single, consistent timeline for all packets observed
  Identify and eliminate duplicates
  Assign coherent timestamps to all packets, independent of monitor
  Timestamp accuracy to a few microseconds is required

Page 29 of 61: Merging of Traces
Timing, the critical element
– Only a few packets carry info guaranteed to be unique over a few milliseconds
– The only way to distinguish duplicates is by time
– Accurate timestamps are vital to creating the merged trace
– Reference packets are the key

Page 30 of 61: Merging of Traces
Three-Step Merging Process
1. Identify the reference packets common to both monitors (see the sketch below)
– Beacons generated by APs serve as references
  » Contain a unique source MAC address
  » Contain a 64-bit value of the AP's local, microsecond-resolution timer
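As a concrete illustration of step 1, here is a minimal Python sketch (not Wit's actual code) of locating beacons seen by two monitors. The packet record layout ('type', 'src', 'tsf', 'ts') is an assumption made for illustration; the underlying fact is that an 802.11 beacon carries the AP's 64-bit microsecond timer value, which together with the source MAC identifies it uniquely over short intervals.

```python
def common_references(trace_a, trace_b):
    """Return (ts_a, ts_b) pairs of local timestamps for beacons seen by both monitors."""
    def beacon_key(pkt):
        # (source MAC, 64-bit AP timer value) uniquely identifies a beacon
        return (pkt['src'], pkt['tsf'])

    beacons_b = {beacon_key(p): p['ts'] for p in trace_b if p['type'] == 'beacon'}
    pairs = []
    for p in trace_a:
        if p['type'] == 'beacon' and beacon_key(p) in beacons_b:
            pairs.append((p['ts'], beacons_b[beacon_key(p)]))
    return pairs
```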

Page 31 of 61: Merging of Traces
Three-Step Merging Process
2. Use reference timestamps to translate the time coordinates (sketched below)
– Pair up two reference timestamps across the two traces
– The time interval of the secondary trace is altered to match the baseline trace
– A constant is added to align the two traces between the two individual reference points
– This resizing / alignment process adjusts for clock drift and alignment bias between the two monitors
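A hedged sketch of this translation, using two reference pairs: (a0, b0) and (a1, b1) are the baseline and secondary monitors' local timestamps for the same two reference beacons. Scaling the interval corrects relative clock drift; the added constant corrects the offset. The function is an illustration of the idea, not Wit's implementation.

```python
def translate(ts, a0, b0, a1, b1):
    """Map secondary-monitor timestamp ts (between b0 and b1) onto the baseline clock."""
    scale = (a1 - a0) / (b1 - b0)   # resizes the interval: adjusts for clock drift
    return a0 + (ts - b0) * scale   # constant a0 aligns the two traces

# Example: the secondary clock runs ~2% fast and is offset by 5000 us.
print(translate(7500.0, a0=0.0, b0=5000.0, a1=102000.0, b1=105000.0))  # -> 2550.0
```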

Page 32 of 61: Merging of Traces
Three-Step Merging Process
3. Identify and remove duplicates
Identify by matching:
– Packet type
– Same source
– Same destination
– Timestamps that differ by less than 1/2 of the minimum time to transmit a packet
Note: The code for this would be straightforward (see the sketch below); however, I suspect much time was spent reviewing the data and proving that the code/scheme worked.
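A minimal sketch of the duplicate test exactly as stated on this slide. The threshold constant is illustrative, not a value from the paper:

```python
MIN_TX_TIME_US = 28.0  # assumed minimum packet airtime in microseconds; illustrative

def is_duplicate(p, q):
    """True if p and q (from different monitors, aligned clocks) are one transmission."""
    return (p['type'] == q['type']
            and p['src'] == q['src']
            and p['dst'] == q['dst']
            and abs(p['ts'] - q['ts']) < MIN_TX_TIME_US / 2)
```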

Page 33 of 61: Merging of Traces
Waterfall Merging Process (sketched below)
– Merge two traces
– Then merge the third trace into the baseline trace
Approach is not the most time-efficient
Approach provides improved precision:
– New reference points are continually added
– Easier to find a set of shared reference points as more monitor traces are merged
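The waterfall schedule amounts to folding each additional monitor trace into the growing baseline one at a time. In this sketch, merge_pair is a hypothetical function implementing the pairwise three-step merge described above; it is passed in rather than defined here.

```python
from functools import reduce

def waterfall_merge(traces, merge_pair):
    # Each merge enriches the baseline with the new monitor's packets and
    # reference beacons, so later merges find more shared reference points.
    return reduce(merge_pair, traces[1:], traces[0])
```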

Page 34 of 61: 3 Core Phases
Merging of Traces
Inferring Missing Information
Deriving Measurements / Performance

Page 35 of 61: Inferring Missing Information

Page 36 of 61: Inferring Missing Information
Two fundamental purposes:
1. Infer missing packets from the collected & merged data
2. Estimate whether packets were received by their destination
Authors claim this is new

Page 37 of 61: Inferring Missing Information
Key technique:
– A transmitted packet implies useful data about the packets its sender must have received
– Example: An AP sends an ASSOCIATION RESPONSE only if it recently received an ASSOCIATION REQUEST. If the merged trace contains the response but no request, then we know the request was successfully sent and received.
– Also, the sender and destination of the missing request are known from the response packet (see the sketch below).
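A hedged Python sketch of this one inference rule: every Association Response without a matching Association Request in the merged trace proves a request was transmitted and received, so a synthetic request can be added. The record layout and the choice to place the inferred packet at the response's timestamp are illustrative assumptions.

```python
def infer_missing_requests(trace):
    """Synthesize Association Requests implied by observed Association Responses."""
    seen_requests = {(p['src'], p['dst']) for p in trace
                     if p['type'] == 'assoc_request'}
    inferred = []
    for p in trace:
        if p['type'] == 'assoc_response' and (p['dst'], p['src']) not in seen_requests:
            # The response proves the client (p['dst']) sent a request to the
            # AP (p['src']) and that the AP received it, even though no
            # monitor captured the request.
            inferred.append({'type': 'assoc_request',
                             'src': p['dst'], 'dst': p['src'],
                             'ts': p['ts'],        # placed near the response
                             'inferred': True})
    return inferred
```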

Page 38 of 61: Inferring Missing Information
Processing the merged trace
– Scan each packet and process:
  Classify each packet type
  Generate markers
  – Ex: end of an ongoing conversation
Formal-language approach (finite state machines; sketched below)
– Infer packet reception
– Infer missing packets
– Construct packets as required
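The formal-language idea can be pictured as a small finite state machine stepping through MAC exchanges. Below is a toy sketch that handles only one rule, marking a DATA packet as received when the matching ACK follows; the states and record layout are simplifications, and the paper's actual grammar covers many more frame types and markers.

```python
def mark_receptions(trace):
    """Walk the trace in time order; set pkt['received'] when an ACK confirms delivery."""
    state, pending = 'idle', None
    for pkt in trace:
        if state == 'idle' and pkt['type'] == 'data':
            state, pending = 'await_ack', pkt
        elif state == 'await_ack':
            # 802.11 ACKs carry only a receiver address: it must name the data's sender
            if pkt['type'] == 'ack' and pkt['dst'] == pending['src']:
                pending['received'] = True
            state, pending = 'idle', None
            if pkt['type'] == 'data':   # a new exchange begins immediately
                state, pending = 'await_ack', pkt
    return trace
```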

Page 39 of 61: 3 Core Phases
Merging of Traces
Inferring Missing Information
Deriving Measurements / Performance

Page 40 of 61: Deriving Measurements / Performance

Page 41 of 61: Deriving Measurements / Performance
The merged trace can be mined:
Many ways to study detailed behavior
– Packet reception probability
– Estimate the number of stations competing for the medium per snapshot in time (see the sketch below)
  – Requires access to 'state':
  – randomly selected backoff values
  – DATA & DATA-retry packets
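As a rough illustration of the contender idea (not the paper's actual estimator, which reasons about backoff state): within a short window, every station observed transmitting DATA or DATA-retry packets is competing for the medium. The window size and record layout are assumptions.

```python
def contenders_per_window(trace, window_us=10_000):
    """Count distinct stations sending DATA or DATA-retry packets per time window."""
    senders_by_window = {}
    for pkt in trace:
        if pkt['type'] in ('data', 'data_retry'):
            win = int(pkt['ts'] // window_us)
            senders_by_window.setdefault(win, set()).add(pkt['src'])
    return {win: len(srcs) for win, srcs in senders_by_window.items()}
```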

Page 42 of 61: Now for the Implementation of Wit

Page 43 of 61: Wit Implementation
Wit is implemented in 3 components:
– halfWit
– nitWit
– dimWit
halfWit, nitWit, & dimWit correspond to the three pipeline phases discussed earlier

Page 44 of 61: Wit Implementation
halfWit
– Merge phase
  First, insert all traces into a database
  Database used to merge data as defined earlier
  Database also used to pass the final merged trace to nitWit
  Uses a merge-sort methodology (sketched below)
  – Traces handled like queues
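The merge-sort-over-queues idea can be sketched in a few lines, assuming each per-monitor trace is already in timeline order so only the head of each queue need be examined. This illustrates the algorithmic shape only; the actual implementation stages the traces in a database.

```python
import heapq

def merge_sorted_traces(traces):
    """Yield packets from all per-monitor traces in global timestamp order."""
    # heapq.merge repeatedly pops the smallest head among the trace queues
    yield from heapq.merge(*traces, key=lambda pkt: pkt['ts'])

# Usage: two tiny traces interleaved into one timeline.
merged = list(merge_sorted_traces([
    [{'ts': 1.0, 'src': 'A'}, {'ts': 5.0, 'src': 'A'}],
    [{'ts': 2.0, 'src': 'B'}, {'ts': 4.0, 'src': 'B'}],
]))
```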

Page 45 of 61: Wit Implementation
nitWit
– Inference phase
  nitWit takes the output of halfWit
  Determines and recreates missing packets
  Annotates captured and inferred packets
  – The critical annotation for each packet is whether it was received
  – Retry packet fields are tracked
Note: The original implementation did not 'merge' captured and inferred packets because of exact-timing uncertainty. This differs from the theory write-up section.

Page 46 of 61: Wit Implementation
dimWit
– Derived-measures component
  dimWit takes the output of nitWit
  Produces summary network information
  Produces the number of contenders in the network
  Implemented to analyze tens of millions of packets in a few minutes

Page 47 of 61: Wit Evaluation

Page 48 of 61: Wit Evaluation
Purpose of evaluation:
– Understand how well each phase works
– Key questions to be evaluated:
  Quality of time synchronization?
  Quality of the merged product?
  Accuracy of inferences?
  Fraction of missing packets inferred?
  Accuracy of the number-of-contenders estimate?
  Is improvement better obtained from more monitors or from more inference?

Page 49 of 61: Wit Evaluation
Reality of this type of evaluation:
– Comparing against ground truth is unrealistic
  Too much detail
  Unrealistic to create an absolutely controlled environment
– Reduced to simulation as the primary validation method

Page 50 of 61: Wit Evaluation
Simulated environment:
– 2 access points (APs)
– 40 clients randomly distributed on a grid
– Packet simulator
  Reception probability based on:
  – Signal strength
  – Transmission rate
  – Existing packets in the environment
  – Random bit errors
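To make the reception model concrete, here is a toy sketch combining the four factors the slide lists. The functional form and every constant are invented for illustration; the paper's simulator is not specified at this level of detail.

```python
import random

def received(signal_db, rate_mbps, concurrent_pkts, length_bits, ber=1e-6):
    """Illustrative (invented) reception-probability model: returns True if delivered."""
    p = min(1.0, max(0.0, (signal_db + 90) / 30))   # stronger signal helps
    p *= max(0.1, 1.0 - rate_mbps / 108.0)          # higher rates are more fragile
    if concurrent_pkts > 0:                         # overlapping packets usually collide
        p *= 0.1 ** concurrent_pkts
    p *= (1.0 - ber) ** length_bits                 # independent random bit errors
    return random.random() < p
```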

Page 51 of 61: Wit Evaluation
Simulated environment (continued):
– 10 randomly distributed monitors
– Detailed logs of simulated packet generation and simulated packet collection

Page 52 of 61: Wit Evaluation
Merging
– Check correctness & characterize the quality of time synchronization
  Basis for waterfall merging
Inference
– Check the ability to infer packet reception statuses and missing packets
Estimating contenders
– Run dimWit on merged traces and compare against the logs

Page 53 of 61: Wit Evaluation
Results
– Results here are limited to the high-priority end result: the contenders.
  In the worst-case simulation, with 90% of packets captured, dimWit is within % of cases
  In a smaller simulation with 98% of packets captured, estimates are within +-1 for 95% of the cases
  Closer study reveals:
  – High error values tend to correspond to cases with a high number of contenders

Page 55 of 61: Inference Versus Additional Monitors

Page 56 of 61: Inference Versus Additional Monitors
Both more inference and more monitors increase the quality of results
Can't increase both without limit in real life: which has more bang for the buck?
– Tests show diminishing returns as the number of monitors increases
– Expected result

Page 57 of 61: Applying To Live Environment

Page 58 of 61: Applying To Live Environment
SIGCOMM 2004 conference wireless environment
– 4 days
– 550 attendees
– Large / busy setting
– 5 access points
– Channels 1 and 11
– Internet via DSL access lines
– Interfering wireless networks
  A number of transient wireless networks
  Hotel wireless network
  Private wireless network on channel 6
– Monitoring 24/7 during the conference

Page 59 of 61: Applying To Live Environment
Results:
– A successful merged trace was produced for each channel
– One monitor didn't have enough references in common with the merged trace, so it was excluded
  Lesson learned: placement of monitors matters
– Significant overlap in what each monitor 'hears'
– Each additional monitor increases the number of unique packets in the trace
  True even when two monitors are right next to each other
  Therefore, even a dense array of monitors will miss packets

Page 60 of 61: Applying To Live Environment
Results:
– nitWit inferred that 80% of unicast packets were received by their destination
– nitWit inferred that 90% of total packets were captured by the monitors
– dimWit determined that the uplink to the AP was more reliable than the downlink
– The medium was inefficiently utilized
– Reception probability did not decrease with contention
– Performance was stable at high contention levels

Page 61 of 61: Concluding Remarks
The Wit implementation provides live wireless data not previously available
Measurement-driven analysis, as implemented by Wit, was successfully evaluated
Further study is warranted
– Will lead to increased efficiency of wireless LANs