Enhancing ASSERT: Making an Accurate Testbed Friendly
Ehsan Nourbakhsh, Ryan Burchfield, S. Venkatesan, Neeraj Mittal, Ravi Prakash
Department of Computer Science, University of Texas at Dallas
Motivation
In [RFiJ] we made observations regarding wireless experimentation and its repeatability.
In [ASSERT] we proposed nine main propositions and built a testbed based on them
– ASSERT: Advanced wireleSS Environment Research Testbed
Our focus has expanded from fidelity alone to also include usability
– Proper use of the testbed should not require comprehensive knowledge of its design and implementation.
Propositions
Accuracy – accurately reflect wireless network behavior
Controllability – provide enough control to configure topology and environment conditions
Mobility – emulate mobility of the nodes
Repeatability – conduct experiments that are reproducible and easily repeatable
Cost effectiveness – be cost effective in terms of the hardware, manpower, space, and time required to set up, run, and maintain experiments
Data collection – provide the necessary tools to collect and analyze data
Resource sharing – share the available resources so that multiple experiments can run without interfering with each other
Multi-nodal capability – support many types of nodes
Scalability – scale to a large number of nodes
Hardware Design
Site: the unit of hardware design
Microprocessor runs Linux
Black-box view of the Unit Under Test (UUT)
Control of and interaction with the UUT
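To make the black-box control idea concrete, here is a minimal sketch of how a site's control processor might drive its UUT over a serial console. The device path, baud rate, prompt string, and commands are illustrative assumptions, not the actual ASSERT interface.

```python
# Hypothetical sketch: a site's control processor treating its UUT as a
# black box reachable over a serial console (device path, prompt, and
# commands are illustrative assumptions, not the ASSERT interface).
import serial  # pyserial

class SiteController:
    def __init__(self, console="/dev/ttyUSB0", baud=115200):
        self.uut = serial.Serial(console, baud, timeout=2)

    def run(self, command):
        """Send one shell command to the UUT and return its console output."""
        self.uut.write((command + "\n").encode())
        return self.uut.read_until(b"$ ").decode(errors="replace")

    def reboot(self):
        # The controller can restart the UUT without knowing anything
        # about the software image it is running.
        return self.run("reboot")

if __name__ == "__main__":
    site = SiteController()
    print(site.run("uname -a"))   # verify which image the UUT is running
```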
Hardware Design (contd.)
RF Board
Hardware Design (contd.)
Software Slices
The software is divided into slices – each slice implements a specific functionality
Some of the major slices:
– diagnostics
– user interface
– experiment control
– topology mapper
– attenuator
– UUT control
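As a rough illustration of the slice idea (an assumed interface, not the actual ASSERT code), each slice could be a small module that the testbed registers and starts independently:

```python
# Illustrative sketch of a slice-based organization. The interface and the
# AttenuatorSlice behavior are assumptions for illustration only.
from abc import ABC, abstractmethod

class Slice(ABC):
    """One self-contained piece of testbed functionality."""
    name = "unnamed"

    @abstractmethod
    def start(self): ...

    @abstractmethod
    def stop(self): ...

class AttenuatorSlice(Slice):
    name = "attenuator"
    def start(self):
        print("programming attenuators for the requested topology")
    def stop(self):
        print("resetting attenuators")

class Testbed:
    def __init__(self):
        self.slices = {}

    def register(self, s: Slice):
        self.slices[s.name] = s

    def start_all(self):
        for s in self.slices.values():
            s.start()

testbed = Testbed()
testbed.register(AttenuatorSlice())
testbed.start_all()
```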
Software Slices (contd.)
Feedback Analysis
Graduated from "functionality centric" to "user centric"
– "it works" vs. "it is easy to work with"
Ease of running experiments leads users to expect enhanced tools:
– easily upload their custom images to UUTs
– verify proper experiment start and progress
– debug and investigate collected data
– create customized distribution models
Capability and Flexibility
Cable Map
Topology Maker
Topology Mapping
Distribution
Cleanup
Application Parameters
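As a hedged example of the kind of input a topology tool might accept (the format below is invented for illustration and is not the actual Topology Maker input), a desired topology could be written as per-link attenuation values that the mapper then realizes on the hardware:

```python
# Hypothetical topology description: per-link attenuation in dB between
# logical nodes. The dict layout and threshold are illustrative assumptions.
desired_topology = {
    ("A", "B"): 40,    # A and B hear each other (moderate attenuation)
    ("B", "C"): 45,
    ("A", "C"): 95,    # A and C are effectively hidden from each other
}

def links_within_range(topology, max_attenuation_db=90):
    """Return the links expected to be able to communicate."""
    return [pair for pair, att in topology.items() if att <= max_attenuation_db]

print(links_within_range(desired_topology))   # [('A', 'B'), ('B', 'C')]
```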
Cable Map
Topology Maker
Topology Mapping Time

Heuristic            Mean (s)   Std. Dev. (s)   Min (s)   Max (s)
None                 15.79      13.94           0.60      47.30
1 Desc.              1.45                       0.10      3.30
1 Asc.               68.24      49.63           18.40     183.70
2 Desc.              19.58      28.45           0.60      97.70
1 Desc. + 2 Desc.    0.14       0.15            0.006     0.40
1 Asc. + 2 Desc.     99.17      69.04           25.6      256.00
1 Desc. + 2 Asc.     4.56       0.96            3.80      8.00
1 Asc. + 2 Asc.      0.21       0.31            0.02      1.20
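The heuristics above change how quickly a consistent mapping is found. As an assumed illustration of why candidate ordering matters (this is a generic backtracking search, not the ASSERT mapping algorithm), the same feasibility check can take very different amounts of time depending on the order in which candidates are tried:

```python
# Assumed illustration only: a generic backtracking mapper whose running
# time depends heavily on the candidate-ordering heuristic (`order_key`),
# which is what the heuristic columns in the table vary.
def map_topology(nodes, sites, feasible, order_key=None):
    """Assign each logical node to a distinct site; `feasible` checks a
    partial assignment for consistency with the desired topology."""
    assignment = {}

    def backtrack(remaining):
        if not remaining:
            return dict(assignment)
        node, *rest = remaining
        candidates = [s for s in sites if s not in assignment.values()]
        if order_key:
            candidates.sort(key=order_key)      # heuristic ordering
        for site in candidates:
            assignment[node] = site
            if feasible(assignment) and (result := backtrack(rest)) is not None:
                return result
            del assignment[node]
        return None

    return backtrack(list(nodes))
```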
Verification
Log Viewer
Remote Access
Log Viewer
Run Time
Average time for each task over three runs of a 30-minute experiment with 24 nodes

Task                                 Average Time (s)
Create Images                        5.3
Reservation                          4.2
Synchronizing Clocks                 18
Upload Images to UUTs                17.1
Create Attenuation Model             0.01
Restarting UUTs                      1.4
Enabling Logs                        0.9
Renew Reservation                    2.6
Run Experiment                       1941.5
Disable Logging                      1.7
Abort Reservations                   1.2
Total (excluding run experiment)     52.9
Data Collection
Timestamp Adjustments in UUT Logs
Log Download
RSS Log
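Timestamp adjustment is straightforward to sketch: assuming each UUT's clock offset relative to the testbed clock is known (the log format and the offset source below are assumptions for illustration only), log entries can be rebased onto a common timeline before analysis.

```python
# Hedged sketch: rebase per-UUT log timestamps onto a common testbed
# timeline. The CSV layout ("timestamp,event") and the way the offset is
# obtained are illustrative assumptions.
import csv

def adjust_timestamps(log_path, out_path, offset_seconds):
    """Shift every timestamp in a 'timestamp,event' CSV by the UUT's
    measured clock offset."""
    with open(log_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader, writer = csv.reader(src), csv.writer(dst)
        for timestamp, event in reader:
            writer.writerow([float(timestamp) + offset_seconds, event])

# Example: a UUT whose clock was measured to lag the controller by 0.042 s.
# adjust_timestamps("uut1013.log", "uut1013_adjusted.log", 0.042)
```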
RSS Log Viewer
(Screenshot annotations: Unicast by 1013, ACK by 1007, Broadcast by 1007, Broadcast by 1013, Reception by 1007 and 1016)
References
[RFiJ] R. Burchfield, E. Nourbakhsh, J. Dix, K. Sahu, S. Venkatesan, and R. Prakash, "RF in the Jungle: Effect of Environment Assumptions on Wireless Experiment Repeatability," ICC 2009.
[ASSERT] E. Nourbakhsh, J. Dix, P. Johnson, R. Burchfield, S. Venkatesan, N. Mittal, and R. Prakash, "ASSERT: A Wireless Networking Testbed," TridentCom 2010.