Who Is Peeping at Your Passwords at Starbucks? To Catch an Evil Twin Access Point
DSN 2010
Yimin Song, Texas A&M University
Chao Yang, Texas A&M University
Guofei Gu, Texas A&M University
Agenda
  Introduction
  Analysis
  Algorithm
  Evaluation
  Conclusion
Agenda
  Introduction
    Wireless Network Review
    Evil Twin Attack
  Analysis
  Algorithm
  Evaluation
  Conclusion
Wireless Network Review
Wireless terminology
  AP – Access Point
  SSID – Service Set Identifier
  RSSI – Received Signal Strength Indication
802.11 CSMA/CA
  DIFS – Distributed Inter-Frame Spacing
  SIFS – Short Inter-Frame Spacing
  BF – Random Backoff Time
[Diagram: BSS 1 and BSS 2 reach the Internet through an AP and a hub, switch, or router; CSMA/CA exchange: the sender waits DIFS plus a random backoff (BF) before sending data, and the receiver waits SIFS before returning an ACK]
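To make this timing concrete, here is a minimal sketch (not from the paper) of the cost of one DATA/SIFS/ACK exchange on a single wireless hop. The 802.11b constants (DIFS, SIFS, slot time) are standard values; the average backoff of 16 slots and the 2 Mbps ACK rate are illustrative assumptions.

def frame_exchange_time_us(data_bytes, rate_mbps=11.0, difs_us=50.0, sifs_us=10.0,
                           slot_us=20.0, avg_backoff_slots=16,
                           ack_bytes=14, ack_rate_mbps=2.0):
    """Approximate time (in microseconds) of one DATA/ACK exchange on one hop."""
    backoff_us = avg_backoff_slots * slot_us   # BF: random backoff, averaged
    data_us = data_bytes * 8 / rate_mbps       # bits / (Mbit/s) == microseconds
    ack_us = ack_bytes * 8 / ack_rate_mbps     # 14-byte ACK at an assumed basic rate
    return difs_us + backoff_us + data_us + sifs_us + ack_us

# An evil twin relays traffic over a second wireless hop, so the per-packet
# cost is paid roughly twice; that gap is what the later IAT analysis exploits.
one_hop = frame_exchange_time_us(1500)
print(f"one hop: {one_hop:.0f} us, two hops: {2 * one_hop:.0f} us")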
Evil Twin Attack
A phishing Wi-Fi AP that looks like a legitimate one (with the same SSID name).
Typically occurs near free hotspots, such as those in airports, cafes, hotels, and libraries.
Hard to trace, since it can be launched and shut off suddenly or randomly, and may last only a short time after achieving its goal.
Evil Twin Attack (cont.)
Related work
  Monitor the radio-frequency airwaves and/or additional information gathered at routers/switches, then compare against a known authorized list.
  Monitor traffic on the wired side and determine whether a machine uses a wired or wireless connection; then compare the result with an authorization list to detect whether the associated AP is a rogue one.
Agenda
  Introduction
  Analysis
    Network Setting in This Model
    Problem Description
    Server IAT (Inter-packet Arrival Time)
  Algorithm
  Evaluation
  Conclusion
Network Setting in This Model
Table 1: Variables and settings in this model
  Protocol              802.11b      802.11g
  Wireless bandwidth    11 Mbps      54 Mbps
  Wired bandwidth       100 Mbps     100 Mbps
                        32           16
                        278 Bytes    338 Bytes
                        402 Bytes    375 Bytes
Problem Description
An evil twin typically still relies on the legitimate AP (the "good twin") for Internet access. Thus, the number of wireless hops a user's traffic traverses to reach the Internet increases from one to two.
Fig. 1: Illustration of the target problem in this paper
Key questions:
  What statistics can effectively distinguish one-hop and two-hop wireless channels on the user side?
  Are there dynamic factors in a real network environment that can affect such statistics?
  How can efficient detection algorithms be designed that take these influencing factors into account?
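To make the intuition behind these questions concrete, the following hypothetical simulation (the delay model and parameters are assumptions, not measurements from the paper) illustrates the kind of statistic being sought: because both wireless hops of an evil twin share the same medium, their per-packet service times add, roughly doubling the mean inter-arrival time seen by the user.

import random

def simulate_mean_iat(hops, n_packets=2000, seed=1):
    """Mean packet inter-arrival time (ms) over the given number of wireless hops."""
    rng = random.Random(seed)
    # Per-hop service time (contention + transmission), in milliseconds; hops on
    # the same channel cannot overlap, so per-hop times are summed.
    iats = [sum(rng.uniform(0.5, 2.0) for _ in range(hops)) for _ in range(n_packets)]
    return sum(iats) / len(iats)

print(f"one hop: {simulate_mean_iat(1):.2f} ms, two hops: {simulate_mean_iat(2):.2f} ms")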
Server IAT
Server IAT: the inter-packet arrival time of packets received from the remote server, observed at the client.
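A minimal sketch of how this statistic could be collected on the client side, assuming Server IAT simply means the gap between consecutive packets received from the remote server (the timestamp source and function names are illustrative, not the authors' implementation):

def server_iats(arrival_times):
    """Inter-arrival times between consecutive packets received from the server."""
    return [t2 - t1 for t1, t2 in zip(arrival_times, arrival_times[1:])]

def mean_server_iat(arrival_times):
    iats = server_iats(arrival_times)
    return sum(iats) / len(iats) if iats else 0.0

# Example: arrival timestamps (seconds) captured with any packet sniffer while
# the client downloads from a remote server.
print(mean_server_iat([0.000, 0.012, 0.023, 0.037, 0.048]))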
Server IAT (cont.)
Fig. 2: Server IAT illustration in the normal AP scenario
Server IAT (cont.)
Fig. 4: IAT distribution under RSSI=100%
Fig. 5: IAT distribution under RSSI=50%
Agenda
  Introduction
  Analysis
  Algorithm
    TMM (Trained Mean Matching Algorithm)
    HDT (Hop Differentiating Technique)
    Improvement by Preprocessing
  Evaluation
  Conclusion
TMM
The Trained Mean Matching (TMM) algorithm requires the distribution of Server IATs as prior knowledge.
Given a sequence of observed Server IATs, if their mean has a higher likelihood of matching the trained mean of two-hop wireless channels than that of one-hop channels, we conclude that the client communicates with the remote server over two wireless hops, indicating a likely evil twin attack; otherwise we conclude the AP is normal.
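A minimal sketch of this decision rule, assuming (as an illustration, not the paper's exact formulation) that trained one-hop and two-hop Server IAT means and standard deviations are available and that the likelihoods are compared under a Gaussian model of the sample mean:

import math

def gaussian_pdf(x, mean, std):
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

def tmm_detect(observed_iats, one_hop, two_hop):
    """one_hop / two_hop are (trained_mean, trained_std) pairs; True => likely evil twin."""
    n = len(observed_iats)
    sample_mean = sum(observed_iats) / n
    # The standard deviation of the sample mean shrinks with sqrt(n).
    l_one = gaussian_pdf(sample_mean, one_hop[0], one_hop[1] / math.sqrt(n))
    l_two = gaussian_pdf(sample_mean, two_hop[0], two_hop[1] / math.sqrt(n))
    return l_two > l_one

# Hypothetical trained statistics (seconds) and one observed sequence:
print(tmm_detect([0.0021, 0.0025, 0.0023],
                 one_hop=(0.0012, 0.0004),
                 two_hop=(0.0024, 0.0008)))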
HDT (Hop Differentiating Technique)
HDT (cont.)
Fig. 2: Server IAT illustration in the normal AP scenario
Fig. 6: AP IAT illustration in the normal AP scenario
HDT (cont.)
Table 1: Variables and settings in this model
  Protocol              802.11b      802.11g
  Wireless bandwidth    11 Mbps      54 Mbps
  Wired bandwidth       100 Mbps     100 Mbps
                        32           16
                        278 Bytes    338 Bytes
                        402 Bytes    375 Bytes
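The HDT slides above give only figure captions and settings, so the following is a rough sketch of the hop-differentiating idea under two assumptions that are not stated here explicitly: that HDT works on the SAIR mentioned later (taken to be the ratio of Server IAT to AP IAT) and that a fixed, protocol-derived threshold separates one-hop from two-hop channels. The threshold value below is a placeholder, not the paper's derived constant.

def hdt_detect(server_iats, ap_iats, threshold=1.5):
    """True => the observed ratio suggests two wireless hops (likely evil twin)."""
    mean_server = sum(server_iats) / len(server_iats)
    mean_ap = sum(ap_iats) / len(ap_iats)
    sair = mean_server / mean_ap   # assumed: Server-IAT-to-AP-IAT ratio
    return sair > threshold

# Hypothetical measurements (seconds): Server IATs roughly double the AP IATs,
# hinting at an extra wireless hop between the user and the Internet.
print(hdt_detect([0.0024, 0.0026, 0.0022], [0.0011, 0.0013, 0.0012]))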
Improvement by Preprocessing
Agenda
  Introduction
  Analysis
  Algorithm
  Evaluation
    Environment Setup
    Datasets
    Effectiveness
    Cross Validation
  Conclusion
Environment Setup
Fig. 7: Environment for normal AP
Fig. 8: Environment for evil twin AP
Datasets
Table 2: RSSI ranges and corresponding levels
  Range   A      B+     B-     C+     C-     D      E
  Upper   100%   80%    70%    60%    50%    40%    20%
  Lower   80%    70%    60%    50%    40%    20%    0%
Table 3: The percentage of filtered packets
  Algorithm   Protocol   A       B+      B-      C+      C-      D
  HDT         802.11g    0.8%    0.86%   3.91%   3.72%   4.69%   7.09%
  HDT         802.11b    1.38%   1.44%   5.61%   6.17%   9.42%   10.36%
  TMM         802.11g    0.62%   0.68%   2.59%   2.66%   3.30%   6.02%
  TMM         802.11b    0.99%   1.04%   3.33%   4.72%   7.44%   8.29%
Effectiveness
Table 4: Detection rate for HDT and TMM
  Algorithm   Protocol   A        B+       B-       C+       C-       D
  HDT         802.11g    99.08%   98.72%   93.53%   94.31%   87.29%   81.39%
  HDT         802.11b    99.92%   99.99%   99.96%   99.95%   96.05%   94.64%
  TMM         802.11g    99.39%   99.97%   99.49%   99.5%    98.32%   94.36%
  TMM         802.11b    99.81%   95.43%   94.81%   96.09%   91.94%   85.71%
Table 5: False positive rate for HDT and TMM
  Algorithm   Protocol   A       B+      B-      C+      C-      D
  HDT         802.11g    2.19%   1.41%   2.06%   1.93%   2.48%   6.52%
  HDT         802.11b    8.39%   8.74%   5.39%   6.96%   5.27%   5.15%
  TMM         802.11g    1.08%   1.76%   1.97%   1.48%   1.75%   1.73%
  TMM         802.11b    0.78%   1%      1.07%   1.27%   6.65%   7.01%
Effectiveness (cont.)
Fig. 9: Cumulative probability of the number of decision rounds for HDT to output a correct result
Effectiveness (cont.)
Table 6: Detection rate when the number of input data in one decision round is 50
  Algorithm    Protocol   A        B+       B-       C+       C-       D
  multi-HDT    802.11g    99.62%   100%     100%     99.95%   100%     100%
  multi-HDT    802.11b    100%     100%     100%     100%     100%     100%
  multi-TMM    802.11g    100%     99.11%   98.73%   99.88%   95.83%   88%
  multi-TMM    802.11b    100%     100%     100%     100%     100%     100%
Table 7: False positive rate when the number of input data in one decision round is 50
  Algorithm    Protocol   A     B+      B-      C+      C-      D
  multi-HDT    802.11g    0%    0.77%   0%      0%      0%      0%
  multi-HDT    802.11b    0%    0.03%   0.02%   0.11%   0.73%   0.1%
  multi-TMM    802.11g    0%    0.96%   0.16%   0.13%   0.55%   0.96%
  multi-TMM    802.11b    0%    1.07%   1.16%   1.02%   1.36%   1.41%
Table 8: False positive rate when the number of input data in one decision round is 100
  Algorithm    Protocol   A     B+      B-      C+      C-      D
  multi-HDT    802.11g    0%    0%      0%      0%      0%      0%
  multi-HDT    802.11b    0%    0%      0.01%   0.01%   0.02%   0.01%
  multi-TMM    802.11g    0%    0%      0%      0%      0%      0%
  multi-TMM    802.11b    0%    0%      0%      0.02%   0.02%   0.03%
Effectiveness (cont.)
Fig. 10: Detection rate for multi-HDT using different numbers of input data in one decision round
Cross Validation
Fig. 11: Detection rate for TMM under different RSSI ranges
Cross Validation (cont.)
Fig. 12: Detection rate under different 802.11g networks
Cross Validation (cont.)
Fig. 13: False positive rate under different 802.11g networks
Agenda
  Introduction
  Analysis
  Algorithm
  Evaluation
  Discussion and Conclusion
    Discussion
    Conclusion
Discussion
More wired hops?
  Several studies have shown that delays on wired links are not comparable to those on wireless links.
  We can trade this off against more decision rounds.
  Use a server that is only a small number of hops away.
  Techniques similar to "traceroute" could estimate the wired transfer time, which can then be excluded/subtracted to minimize the noise from the wired side.
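A minimal sketch of that last suggestion, assuming (hypothetically) that a traceroute-style probe yields a per-packet estimate of the wired transfer time, which is then subtracted from each observed Server IAT before running detection; estimate_wired_delay() below is a placeholder, not a real measurement routine.

def estimate_wired_delay():
    """Placeholder: estimated per-packet wired transfer time (seconds),
    e.g. obtained via traceroute-style probing of the wired path."""
    return 0.0002  # hypothetical 0.2 ms

def adjusted_server_iats(observed_iats):
    wired = estimate_wired_delay()
    # Do not let the correction push an IAT below zero.
    return [max(iat - wired, 0.0) for iat in observed_iats]

print(adjusted_server_iats([0.0021, 0.0018, 0.0025]))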
Discussion (cont.)
Will the attacker increase the IAT to avoid detection?
  Users do not tolerate a slow connection.
  Eq. 1: the attacker may delay packets to reduce the SAIR.
What if an evil twin AP connects to the wired network directly instead of relaying through a normal AP?
  That is our future work.
Conclusion
We propose TMM and HDT to detect evil twin attacks; TMM requires training data, whereas HDT does not.
HDT is particularly attractive because it does not rely on trained knowledge or parameters and is resilient to changes in wireless environments.
The End