Network-based Intrusion Detection and Prevention in Challenging and Emerging Environments: High-speed Data Center, Web 2.0, and Social Networks
Yan Chen
Lab for Internet and Security Technology (LIST)
Department of Electrical Engineering and Computer Science, Northwestern University
Chicago
Northwestern
Statistics
Chicago: 3rd largest city in the US
NU: ranked #12 by US News & World Report
– Established in 1851
– ~8,000 undergrads
McCormick School of Engineering: ranked #20
– 180 faculty members
– ~1,400 undergrads and a similar number of grad students
Statistics of McCormick
National academy memberships:
– National Academy of Engineering (NAE): 12 active, 7 emeriti
– National Academy of Science (NAS): 3 active
– Institute of Medicine (IoM): 1 emeritus
– American Academy of Arts and Sciences (AAAS): 5 active, 3 emeriti
– National Medal of Technology: 1 active
NetShield: Massive Semantics-Based Vulnerability Signature Matching for High-Speed Networks
Zhichun Li, Gao Xia, Hongyu Gao, Yi Tang, Yan Chen, Bin Liu, Junchen Jiang, and Yuezhou Lv
NEC Laboratories America, Inc. / Northwestern University / Tsinghua University
Keeping networks safe is a grand challenge. Worms and botnets remain prevalent; e.g., the Conficker worm outbreak in 2008 infected an estimated 9~15 million hosts.
NIDS/NIPS Overview
A NIDS/NIPS (Network Intrusion Detection/Prevention System) inspects packets against a signature DB and emits security alerts. Its two key requirements are accuracy and speed.
State of the Art: Regex-Based Approaches
Regular expression (regex) based approaches are used by Cisco IPS, Juniper IPS, and the open-source Bro.
Pros:
– Can efficiently match multiple signatures simultaneously, through a DFA
– Can describe the syntactic context
Cons:
– Limited expressive power: cannot describe the semantic context
– Inaccurate
Example: .*Abc.*\x90+de[^\r\n]{30}
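For illustration, a minimal sketch of the multi-signature idea these systems rely on: union the patterns and scan the payload once, recording which alternative fired. The signature names and the second pattern are hypothetical, and Python's re module is a backtracking engine rather than the DFA the slide describes, but the grouping scheme is the same.

```python
import re

# Hypothetical toy signatures; real IPS rulesets hold thousands of these.
signatures = {
    "sig_shellcode": r".*Abc.*\x90+de[^\r\n]{30}",  # the example above
    "sig_traversal": r"GET\s+/\.\./",               # made-up second rule
}

# Union all patterns into one compiled regex with named groups, so a
# single scan reports every signature that matches.
combined = re.compile("|".join(
    "(?P<%s>%s)" % (name, pat) for name, pat in signatures.items()))

def match_signatures(payload):
    # m.lastgroup names the alternative that fired at each match site
    return [m.lastgroup for m in combined.finditer(payload)]

print(match_signatures("GET /../etc/passwd HTTP/1.0"))  # ['sig_traversal']
```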
State of the Art: Vulnerability Signatures [Wang et al. 04]
A vulnerability is a design flaw that lets bad inputs lead the program from a good state to a bad state.
Pros:
– Directly describe the semantic context
– Very expressive: can express the vulnerability condition exactly
– Accurate
Cons:
– Slow! Existing approaches all use sequential matching and require protocol parsing
Blaster worm (WINRPC) example:
BIND: rpc_vers==5 && rpc_vers_minor==1 && packed_drep==\x10\x00\x00\x00 && context[0].abstract_syntax.uuid==UUID_RemoteActivation
BIND-ACK: rpc_vers==5 && rpc_vers_minor==1
CALL: rpc_vers==5 && rpc_vers_minor==1 && packed_drep==\x10\x00\x00\x00 && opnum==0x00 && stub.RemoteActivationBody.actual_length>=40 && matchRE(stub.buffer, /^\x5c\x00\x5c\x00/)
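To make the contrast concrete, here is a hedged sketch of what evaluating the CALL predicate above amounts to once a PDU has been parsed; the pdu dict and its field names are hypothetical stand-ins for a protocol parser's output.

```python
import re

# Hypothetical parsed CALL PDU; in a real system a protocol parser
# fills in these fields from the byte stream.
pdu = {
    "rpc_vers": 5,
    "rpc_vers_minor": 1,
    "packed_drep": b"\x10\x00\x00\x00",
    "opnum": 0x00,
    "actual_length": 48,
    "stub_buffer": b"\x5c\x00\x5c\x00" + b"\x00" * 44,
}

def blaster_call_predicate(p):
    """The CALL predicate of the Blaster signature, over parsed fields."""
    return (p["rpc_vers"] == 5
            and p["rpc_vers_minor"] == 1
            and p["packed_drep"] == b"\x10\x00\x00\x00"
            and p["opnum"] == 0x00
            and p["actual_length"] >= 40
            and re.match(rb"\x5c\x00\x5c\x00", p["stub_buffer"]) is not None)

print(blaster_call_predicate(pdu))  # True: the PDU reaches the bad state
```

Checking one such predicate is cheap; the problem the following slides address is doing this for thousands of signatures without evaluating them one by one.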
Regex vs. Vulnerability Signatures
From a theoretical perspective, regexes are strictly less expressive than protocol grammars, which are context free or even context sensitive (e.g., HTTP chunked encoding, DNS label pointers), so regex cannot substitute for parsing.
From a practical perspective, vulnerability signature matching requires parsing, matching, and combining.
Regex vs. Vulnerability Signatures (cont.)
– Regex assumes a single input, so regex + parsing still cannot solve the problem
– Regex cannot help with the combining phase
– Therefore we cannot simply extend regex approaches to vulnerability signatures
Motivation of NetShield
Research Challenges and Solutions
Challenges:
– Matching thousands of vulnerability signatures simultaneously, rather than sequentially
– High-speed protocol parsing
Solutions (achieving tens of Gbps throughput):
– An efficient algorithm that matches multiple signatures simultaneously
– A parsing design tailored for high-speed signature matching
– Code & ruleset released at www.nshield.org
NetShield System Architecture
Outline
– Motivation
– High-Speed Matching for Large Rulesets
– High-Speed Parsing
– Evaluation
– Research Contributions
Background
Vulnerability signature basics:
– Use protocol semantics to express vulnerabilities
– Defined on a sequence of PDUs, with one predicate per PDU
– Example: ver==1 && method=="put" && len(buf)>300
Data representations:
– The basic data types used in predicates are numbers and strings
– Number operators: ==, <, >, <=, >=
– String operators: ==, match_re(., .), len(.)
Blaster worm (WINRPC) example:
BIND: rpc_vers==5 && rpc_vers_minor==1 && packed_drep==\x10\x00\x00\x00 && context[0].abstract_syntax.uuid==UUID_RemoteActivation
BIND-ACK: rpc_vers==5 && rpc_vers_minor==1
CALL: rpc_vers==5 && rpc_vers_minor==1 && packed_drep==\x10\x00\x00\x00 && opnum==0x00 && stub.RemoteActivationBody.actual_length>=40 && matchRE(stub.buffer, /^\x5c\x00\x5c\x00/)
Matching Problem Formulation
Suppose we have n signatures defined over k matching dimensions (matchers):
– A matcher is a two-tuple (field, operation), or a four-tuple for associative-array elements
– Translate the n signatures into an n-by-k table
– This translation unlocks the potential of matching multiple signatures simultaneously
Example, Rule 4: URI.Filename=="fp40reg.dll" && len(Headers["host"])>300

RuleID  Method ==  Filename ==   Header ==, LEN
1       DELETE     *             *
2       POST       Header.php    *
3       *          awstats.pl    *
4       *          fp40reg.dll   name=="host"; len(value)>300
5       *          *             name=="User-Agent"; len(value)>544
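As a sketch, the five rules above become rows of a 5-by-3 table; the None wildcard encoding is illustrative, not NetShield's actual representation.

```python
WILDCARD = None  # the "*" (don't care) entries in the table above

# Columns: method ==, filename ==, Headers[name] condition.
# The header column is the associative-array matcher: a (name, min_len)
# condition on Headers[name].
rule_table = [
    ("DELETE", WILDCARD,      WILDCARD),            # rule 1
    ("POST",   "Header.php",  WILDCARD),            # rule 2
    (WILDCARD, "awstats.pl",  WILDCARD),            # rule 3
    (WILDCARD, "fp40reg.dll", ("host", 300)),       # rule 4
    (WILDCARD, WILDCARD,      ("User-Agent", 544)), # rule 5
]
```

Each column is then one matcher that runs once per PDU, no matter how many rules reference that field.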
Signature Matching
Basic scheme for the single-PDU case, with refinements:
– Allow negative conditions
– Handle array cases
– Handle associative-array cases
– Handle mutually exclusive cases
Extension to Multiple-PDU Matching (MPM):
– Allow checkpoints
Difficulty of Single-PDU Matching
Bad news:
– A well-known computational-geometry problem can be reduced to this problem, and that problem has bad worst-case bounds: O((log N)^(K-1)) time or O(N^K) space for worst-case rulesets
Good news:
– A measurement study of the Snort and Cisco rulesets shows that real-world rulesets are well behaved: the matchers are selective
– With our design, matching takes O(K) time
Matching Algorithm: Candidate Selection
1. Pre-computation: decide the rule order and matcher order
2. Runtime: decomposition; match each matcher separately and iteratively combine the results efficiently
Step 1: Pre-Computation
– Optimize the matcher order based on the buffering constraint and field arrival order
– Reorder the rules 1..n by requirement: first the rules that require matcher 1, then the rules that don't care about matcher 1 but require matcher 2, then the rules that don't care about matchers 1 and 2, and so on
Step 2: Iterative Matching
PDU = {Method=POST, Filename=fp40reg.dll, Header: name="host", len(value)=450}, matched against the rule table above:
– S1 = {2}: candidates after matching column 1 (method==)
– S2 = (S1 ∩ A2) ∪ B2 = ({2} ∩ {}) ∪ {4} = {4}: rule 2 fails filename==Header.php and is dropped; rule 4 enters because filename== is its first required matcher and it fires
– S3 = (S2 ∩ A3) ∪ B3 = ({4} ∩ {4}) ∪ {} = {4}: rule 4's header condition (name=="host"; len(value)>300) holds
Candidates in S_i that don't care about matcher i+1 are carried forward; those that require it survive only if they appear in A_(i+1); rules whose first required matcher is i+1 enter via B_(i+1). Here rule 4 matches.
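A self-contained sketch of this iteration, simplified from the paper's algorithm: each requirement is encoded as a predicate (None means don't care), and A and B are computed exactly as in the walk-through above.

```python
# None = don't care; otherwise a predicate over the PDU field value.
rules = [
    (lambda m: m == "DELETE", None, None),                        # rule 1
    (lambda m: m == "POST", lambda f: f == "Header.php", None),   # rule 2
    (None, lambda f: f == "awstats.pl", None),                    # rule 3
    (None, lambda f: f == "fp40reg.dll",
     lambda h: h[0] == "host" and h[1] > 300),                    # rule 4
    (None, None, lambda h: h[0] == "User-Agent" and h[1] > 544),  # rule 5
]

def candidate_selection(rules, fields):
    S = set()
    for i, val in enumerate(fields):
        # A: candidates that don't care about matcher i or satisfy it
        A = {r for r in S if rules[r][i] is None or rules[r][i](val)}
        # B: rules whose first *required* matcher is i and fires now
        B = {r for r in range(len(rules)) if r not in S
             and all(rules[r][j] is None for j in range(i))
             and rules[r][i] is not None and rules[r][i](val)}
        S = A | B
    return {r + 1 for r in S}  # 1-based rule IDs

pdu = ["POST", "fp40reg.dll", ("host", 450)]
print(candidate_selection(rules, pdu))  # {4}, as in the walk-through
```

The per-PDU cost hinges on |S_i| staying small, which the next slide quantifies for real rulesets.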
Complexity Analysis
Merging complexity:
– Need k-1 merging iterations
– Each iteration is O(n) in the worst case, since S_i can have O(n) candidates for worst-case rulesets
– For real-world rulesets the number of candidates is a small constant, so each iteration is O(1)
– Hence O(k) for real-world rulesets, which is the optimal we can get (each of the k matchers must be examined once)
Measured: three HTTP traces give avg(|S_i|) < 0.04; two WINRPC traces give avg(|S_i|) < 1.5
Outline
– Motivation
– High-Speed Matching for Large Rulesets
– High-Speed Parsing
– Evaluation
– Research Contributions
High-Speed Parsing
Design a parsing state machine. Tree-based vs. stream parsers:
– Tree-based: keep the whole parse tree in memory; parse all nodes in the tree
– Stream: parse and match on the fly; parse only signature-related fields (leaf nodes)
High-Speed Parsing (cont.)
Build an automated parser generator, UltraPAC.
Observations
A PDU's parse tree may contain arrays; its leaf nodes are numbers or strings.
– Observation 1: only the fields related to signatures (mostly leaf nodes) need to be parsed
– Observation 2: traditional recursive-descent parsers, which need one function call per node, are too expensive
Efficient Parsing with State Machines
– Studied eight protocols (HTTP, FTP, SMTP, eMule, BitTorrent, WINRPC, SNMP, and DNS) as well as their vulnerability signatures
– Found common relationships among leaf nodes
– Pre-construct parsing state machines based on the parse trees and the vulnerability signatures
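An illustrative sketch of the idea, with a hypothetical PDU layout rather than UltraPAC's generated output: the states are fixed when the machine is built, only the leaf fields the signatures reference are extracted, and there are no per-node function calls.

```python
import struct

def parse_pdu(data):
    """Toy PDU: 1-byte version, 1-byte opnum, 2-byte big-endian payload
    length, then the payload. The signatures reference only these three
    leaf fields, so the machine never touches the payload body."""
    state, pos, fields = "VERSION", 0, {}
    while state != "DONE":
        if state == "VERSION":
            fields["version"] = data[pos]; pos += 1; state = "OPNUM"
        elif state == "OPNUM":
            fields["opnum"] = data[pos]; pos += 1; state = "LENGTH"
        elif state == "LENGTH":
            (plen,) = struct.unpack_from(">H", data, pos); pos += 2
            fields["payload_len"] = plen
            pos += plen  # skip payload: no signature needs its contents
            state = "DONE"
    return fields

print(parse_pdu(b"\x05\x00\x00\x04ABCD"))
# {'version': 5, 'opnum': 0, 'payload_len': 4}
```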
Outline
– Motivation
– High-Speed Matching for Large Rulesets
– High-Speed Parsing
– Evaluation
– Research Contributions
Evaluation Methodology
– 26+ GB of traces from Tsinghua University (TH), Northwestern (NU), and DARPA
– Run on a P4 3.8 GHz single-core PC with 4 GB of memory, after TCP reassembly and with the PDUs preloaded in memory
– For HTTP, 794 vulnerability signatures covering 973 Snort rules; for WINRPC, 45 vulnerability signatures covering 3,519 Snort rules
– Fully implemented prototype: 10,000 lines of C++ and 3,000 lines of Python
– Deployed at a data center in Tsinghua University with up to 106 Mbps of traffic
Parsing Results

Trace                     TH DNS  TH WINRPC  NU WINRPC  TH HTTP  NU HTTP  DARPA HTTP
Avg flow len (B)          77      87         959        66.6K    55K      2.1K
BinPAC throughput (Gbps)  0.31    1.41       1.11       2.10     14.2     1.69
Our parser (Gbps)         3.43    16.2       12.9       7.46     44.4     6.67
Speedup ratio             11.2    11.5       11.6       3.6      3.1      3.9
Max memory/conn (B)       16      15         15         14       14       14
Parsing + Matching Results

Trace                       TH WINRPC  NU WINRPC  TH HTTP  NU HTTP  DARPA HTTP
Avg flow length (B)         87         959        66.6K    55K      2.1K
Sequential matching (Gbps)  10.68      9.23       0.34     2.37     0.28
CS matching (Gbps)          14.37      10.61      2.63     17.63    1.85
Matching-only speedup       41.8 (WINRPC)         11.3     11.7     8.8
Avg # of candidates         1.16       1.48       0.033    0.038    0.0023
Avg memory/conn (B)         32         28         11.0 (HTTP traces)
Scalability Results
Performance decreases gracefully (8-core measurement).
Accuracy Results
– We created two polymorphic WINRPC exploits that bypass the original Snort rules but are detected accurately by our scheme
– On a 10-minute "clean" HTTP trace, Snort reported 42 alerts while NetShield reported 0; we manually verified that all 42 alerts were false positives
Research Contributions

          Regular expression  Existing vul. IDS  NetShield
Accuracy  Poor                Good               Good
Speed     Good                Poor               Good
Memory    Good                ??                 Good

– Multiple-signature matching: the candidate selection algorithm
– Parsing: the parsing state machine
– Tools at www.nshield.org
NetShield makes vulnerability signatures a practical solution for NIDS/NIPS.
Q&A
Example for WINRPC
Rectangles in the state-machine diagram are states; the parsing variables are R0..R4. The generated parser executes 0.61 instructions per byte for the BIND PDU.
Parser Generator
We reuse the front end of BinPAC (a yacc-like tool for protocol parsing) and redesign the back end to generate the parsing-state-machine-based parser.