1
Packet Classification using Hierarchical Intelligent Cuttings
Pankaj Gupta and Nick McKeown, Stanford University. Hot Interconnects VII, August 18, 1999.
2
Outline
Introduction and Motivation
Overview of the proposed algorithm
Details of the algorithm
Implementation Results
Conclusions
3
Packet Classification
[diagram: the incoming packet's header is matched against the classifier (policy database) of predicate -> action rules; the forwarding engine then applies the resulting action to the packet]
4
Multi-field Packet Classification
Given a classifier with N rules, find the action associated with the highest-priority rule matching an incoming packet. Example: a packet with header fields (…, …, …, TCP) would have action A2 applied to it.
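To make the problem statement concrete, here is a minimal sketch (not from the talk) of the naive baseline: linearly scan all N rules and apply the action of the highest-priority match. The rule layout, the example field ranges, and the convention that a smaller number means higher priority are illustrative assumptions.

```python
# Baseline linear classification: O(N) rule comparisons per packet.
# Assumed rule format: {"priority": int, "action": str, "fields": [(lo, hi), ...]}

def matches(rule, packet_fields):
    """A packet matches a rule if every header field falls within the rule's range."""
    return all(lo <= f <= hi for f, (lo, hi) in zip(packet_fields, rule["fields"]))

def classify_linear(rules, packet_fields):
    """Return the action of the highest-priority matching rule (smaller number = higher priority)."""
    best = None
    for rule in rules:
        if matches(rule, packet_fields) and (best is None or rule["priority"] < best["priority"]):
            best = rule
    return best["action"] if best else "default"

# Example with two hypothetical 2-field rules:
rules = [
    {"priority": 2, "action": "A2", "fields": [(0, 255), (128, 255)]},
    {"priority": 5, "action": "A5", "fields": [(0, 255), (0, 255)]},
]
print(classify_linear(rules, [10, 200]))   # -> A2
```

HiCuts aims to avoid this full scan by narrowing the search down to a few rules per packet.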
5
Performance Metrics of a Classification Algorithm
Data structure storage requirements Packet classification time Preprocessing time Incremental Update time
6
Previous Work
7
Bounds from Computational Geometry
Point location among N non-overlapping regions in k dimensions takes either O(log N) time with O(N^k) space, or O(log^(k-1) N) time with O(N) space.
8
Observations No single good solution for all cases.
But real classifiers have structure. Perhaps an algorithm can exploit this structure. A heuristic hybrid scheme ….
9
Proposed Algorithm: Basic Idea
[figure: a decision tree maps the full rule set {R1, R2, R3, …, Rn} to small leaf subsets such as {R1, R3, R4}, {R1, R2, R5}, and {R8, Rn}] Binth (BinThreshold) = maximum subset size = 3
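As a rough sketch of this idea (not the talk's actual construction), such a tree can be built by recursively cutting a box until at most binth rules remain at each leaf. Cutting the widest dimension into a fixed number of equal-width pieces is a placeholder here; the real heuristics for choosing the cut are covered later in the talk, and the overlaps/split_box helpers and rule format are assumptions.

```python
def overlaps(rule, box):
    """A rule collides with a box if their ranges intersect in every dimension."""
    return all(r_lo <= b_hi and r_hi >= b_lo
               for (r_lo, r_hi), (b_lo, b_hi) in zip(rule["fields"], box))

def split_box(box, dim, parts):
    """Split a box into 'parts' equal-width sub-boxes along dimension 'dim'."""
    lo, hi = box[dim]
    width = (hi - lo + 1) // parts
    for i in range(parts):
        sub = list(box)
        sub[dim] = (lo + i * width, hi if i == parts - 1 else lo + (i + 1) * width - 1)
        yield tuple(sub)

def build(rules, box, binth=3, parts=4):
    """Recursively cut until a leaf holds at most binth rules (or the box cannot be cut)."""
    widest = max(range(len(box)), key=lambda d: box[d][1] - box[d][0])
    if len(rules) <= binth or box[widest][1] - box[widest][0] + 1 < parts:
        return {"box": box, "rules": rules, "children": []}          # leaf
    children = [build([r for r in rules if overlaps(r, b)], b, binth, parts)
                for b in split_box(box, widest, parts)]
    return {"box": box, "cut": (widest, parts), "children": children}
```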
10
Example 2-D Classifier
11
Geometric View [figure: rules R1-R7 drawn as rectangles in the 256x256 (X, Y) space, e.g. one rule covering (0-31, 0-255); the packet P is a point in this space]
12
Decision Tree using Hierarchical Intelligent Cuttings (HiCuts)
With each internal node v, associate:
- a rectangle, or box, B(v)
- a set of rules (CollidingRuleSet), R(v)
- a HiCut C(v) = (dimension d, number of partitions of B(v) across d)
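A minimal sketch of what such a node and the resulting lookup might look like in code, following the slide's notation (B(v), R(v), C(v)). Equal-width child boxes and the concrete Python types are assumptions for illustration, not details given in the talk.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Box = List[Tuple[int, int]]                       # one (lo, hi) interval per dimension

@dataclass
class Node:
    box: Box                                      # B(v): the rectangle covered by v
    rules: List[dict]                             # R(v): rules colliding with B(v)
    cut: Optional[Tuple[int, int]] = None         # C(v) = (dimension d, #partitions)
    children: List["Node"] = field(default_factory=list)

def lookup(node: Node, packet: List[int]) -> List[dict]:
    """Walk the tree to a leaf; the leaf's small rule set is then scanned linearly."""
    while node.children:
        d, parts = node.cut
        lo, hi = node.box[d]
        width = (hi - lo + 1) // parts            # equal-width cuts assumed
        idx = min((packet[d] - lo) // width, parts - 1)
        node = node.children[idx]
    return node.rules                             # at most binth rules
```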
13
HiCuts [figure: rules R1-R7 in the 256x256 (X, Y) space, showing where the cuts are made]
14
HiCuts [figure: zooming into one partition of the space (X from 64 to 128), which contains only rules R2, R3, R4, R5]
15
HiCuts Decision Tree for binth = 2
[figure: the root node covers the full 256*256 box and is cut into 4 partitions along X; one child covers a 64*256 box and is cut into 2 partitions along Y; each leaf stores at most binth = 2 rules, drawn from R1, R2, R4, R5, R6, R7. The example packet P(65, 130) is classified by walking this tree to a leaf.]
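A small worked trace of the example above, assuming (the slide does not state it explicitly) that each cut splits its box into equal-width intervals:

```python
x, y = 65, 130                  # the example packet P(65, 130)
root_child = x // (256 // 4)    # root cut: X range 0-255 into 4 -> child index 1 (X in 64-127)
leaf_child = y // (256 // 2)    # next cut: Y range 0-255 into 2 -> child index 1 (Y in 128-255)
print(root_child, leaf_child)   # 1 1: the leaf reached holds at most binth = 2 rules to scan
```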
16
Heuristics to exploit classifier structure
Picking a suitable dimension to cut across:
- minimize the maximum number of rules falling into any one partition, OR
- maximize the entropy of the distribution of rules across the partitions, OR
- maximize the number of distinct rule specifications in the chosen dimension.
Picking a suitable number of partitions (HiCuts) to make: this affects both the space consumed and the classification time, and is tuned by a parameter, spfac. A sketch of the first dimension heuristic is shown below.
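As a rough illustration of the first dimension-selection heuristic, the sketch below tries an equal-width cut in every dimension and keeps the dimension that minimizes the maximum number of rules landing in any one partition. The fixed partition count and the rule/box representation are assumptions; in the actual scheme the number of partitions is chosen adaptively and bounded via spfac.

```python
def rules_in_partition(rules, d, lo, hi):
    """Rules whose range in dimension d overlaps the partition [lo, hi]."""
    return [r for r in rules if r["fields"][d][0] <= hi and r["fields"][d][1] >= lo]

def pick_cut_dimension(rules, box, num_partitions=4):
    """Pick the dimension whose equal-width cut gives the smallest worst-case bucket."""
    best_dim, best_worst = 0, float("inf")
    for d, (lo, hi) in enumerate(box):
        width = max((hi - lo + 1) // num_partitions, 1)
        worst = max(
            len(rules_in_partition(rules, d, lo + i * width,
                                   min(lo + (i + 1) * width - 1, hi)))
            for i in range(num_partitions)
        )
        if worst < best_worst:
            best_dim, best_worst = d, worst
    return best_dim
```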
17
Tunable Parameters
Binth, the maximum size of the rule set stored at each leaf.
Spfac, a parameter that guides the partitioning process in choosing the number of partitions.
18
Implementation Results: Four dimensional real-life classifiers
40 access lists taken from real ISP and enterprise networks. Four dimensions: (Src IP, Dst IP, L4 protocol, L4 destination port).
19
Number of Memory Accesses
[plot: memory accesses vs. number of rules (log scale), with Crossproducting shown for comparison; binth = 8, spfac = 4]
20
Size of the data structure
[plot: space in kilobytes (log scale) vs. number of rules (log scale); binth = 8, spfac = 4]
21
Comparison with Crossproducting
[plot: space in megabytes (log scale) vs. number of rules (log scale); binth = 8, spfac = 4]
22
Preprocessing Time
[plot: time in seconds (log scale) vs. number of rules (log scale); binth = 8, spfac = 4, on a 333 MHz Pentium II running Linux]
23
Incremental Update Time
[plot: time in seconds (log scale) vs. number of rules (log scale); binth = 8, spfac = 4, on a 333 MHz Pentium II running Linux]
24
Conclusions Exploiting the structure of classifiers is important for a good solution. The proposed HiCuts packet classification scheme seems to be of practical use.
25
In the paper... Explanation of the heuristics used in building the HiCuts decision tree. Detailed implementation results. The effect of the parameters binth and spfac on the depth and space characteristics. Available at: