
Slide 1: Automatic Evaluation of Intrusion Detection Systems
F. Massicotte, F. Gagnon, Y. Labiche, L. Briand. Computer Security Applications Conference, ACSAC '06, pp. 361-370, 2006.
Presented by: Lei WEI

Slide 2: Summary
1. Proposed a strategy to evaluate Intrusion Detection Systems (IDSs) automatically and systematically.
2. Evaluated two well-known IDS programs, Snort 2.3.2 and Bro 0.9a9, using the proposed strategy.
3. Proposed a 15-class taxonomy for the test results.

Slide 3: Appreciative Comment: Automation
This is an automatic IDS evaluation system. Because of the automation, a large amount of test data can be created efficiently and systematically.
"We use 124 VEP (covering a total of 92 vulnerabilities) and 108 different target system configurations." (Automatic Evaluation of Intrusion Detection Systems)
"38 different attacks were launched against victim UNIX hosts in seven weeks of training data and two weeks of test data." (Evaluating Intrusion Detection Systems: The 1998 DARPA Off-line Intrusion Detection Evaluation)

Slide 4: Critical Comment 1: Complicated classification
Each collected traffic trace belongs to one of four types: true positive (TP), true negative (TN), false positive (FP), or false negative (FN). Based on the types of all traces collected during the IDS evaluation tests, the authors propose a 15-class taxonomy for IDSs, with classes such as alarmist, quiet, quiet and complete detection, complete evasion, etc. This makes the evaluation complicated and confusing:
- It is hard to remember all the class names.
- Is "quiet and complete detection" a subclass of "quiet"? No!
I would prefer a statistical summary based on two ratios, the detection rate TP / (TP + FN) and the false alarm ratio FP / (FP + TP), which tell us the percentage of attacks being detected and the percentage of alarms that are wrong.
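A minimal sketch of how these two ratios could be computed from per-trace counts; the function names and example numbers are illustrative, not taken from the paper:

```python
def detection_rate(tp: int, fn: int) -> float:
    """Fraction of documented attacks that raised an alarm: TP / (TP + FN)."""
    return tp / (tp + fn) if (tp + fn) else 0.0


def false_alarm_ratio(fp: int, tp: int) -> float:
    """Fraction of raised alarms that are wrong: FP / (FP + TP)."""
    return fp / (fp + tp) if (fp + tp) else 0.0


# Illustrative counts only, not figures from the paper:
print(detection_rate(tp=90, fn=10))    # 0.9  -> 90% of attacks detected
print(false_alarm_ratio(fp=5, tp=90))  # ~0.05 -> about 5% of alarms are wrong
```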

Slide 5: Critical Comment 2: Confusing diagrams
In this paper, the two diagrams, Figure 5 and Figure 1, and the accompanying descriptions of the working process of the whole system are not clear enough.
(a) A title should be "… an effective guide for scientists rapidly scanning lists of titles for information relevant to their interests." (Scientific writing for graduate students: a manual on the teaching of scientific writing, edited by F. Peter Woodford. New York: Rockefeller University Press, 1968.) However, neither the title nor the content clearly explains the meaning of the numbers in Figure 5.

Slide 6: Critical Comment 2: Confusing diagrams (continued)
(b) Although the article describes the steps listed in Figure 1, the diagram itself makes it hard to understand the structure and working process of the system. Its title is "Virtual network infrastructure", but the figure covers more than that: it shows not only the virtual network infrastructure but also the working process of the subsystem.

Slide 7: Working process of the Automatic IDS Evaluation system
The system can be divided into two subsystems:
1. The attack simulation and data collection system
2. The IDS Evaluation Framework

Slide 8: 1. Attack simulation and data collection system
Script Generation:
1. Choose a Vulnerability Exploitation Program (VEP)
2. Choose the configuration of the target system (e.g., the IDS)
Set up Virtual Network
Set up Attack Script: provide the virtual attacking machine with the proper attack configuration (e.g., whether to apply IDS evasion techniques)
Execute Attack:
1. Capture the attack traffic traces
2. Document the traffic traces and add them to the Data Set
Restore:
1. Save the traffic traces and IDS alarms on the shared hard drive
2. Restore the virtual attacker and target machines to their initial state
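Read as an orchestration loop, these steps might look like the sketch below; every helper function and type here is a placeholder introduced for illustration, not the authors' actual tooling:

```python
from dataclasses import dataclass
from itertools import product
from typing import List


@dataclass
class Trace:
    vep: str            # Vulnerability Exploitation Program used
    target_config: str  # target system configuration
    evasion: bool       # whether an IDS evasion technique was applied
    pcap_path: str      # where the captured attack traffic was stored


def generate_attack_script(vep: str, target: str) -> str:
    return f"run {vep} against {target}"      # placeholder for script generation


def set_up_virtual_network(target: str) -> str:
    return f"vnet-{target}"                   # placeholder: boot attacker and target VMs


def execute_attack(network: str, script: str, evasion: bool) -> str:
    return f"/shared/{network}.pcap"          # placeholder: run the attack, capture traffic


def restore_snapshots(network: str) -> None:
    pass                                      # placeholder: restore VMs to their initial state


def run_campaign(veps: List[str], targets: List[str], evasion: bool = False) -> List[Trace]:
    dataset: List[Trace] = []
    for vep, target in product(veps, targets):             # choose VEP and target configuration
        script = generate_attack_script(vep, target)       # script generation
        network = set_up_virtual_network(target)           # set up virtual network
        pcap = execute_attack(network, script, evasion)    # set up attack script, execute attack
        dataset.append(Trace(vep, target, evasion, pcap))  # document the trace into the data set
        restore_snapshots(network)                         # restore attacker and target machines
    return dataset
```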

Slide 9: 2. IDS Evaluation Framework
Components: Data Set, IDS, IDS Evaluator, IDS Result Analyzer, Report.
1. The IDS Evaluator takes the documented traffic traces from the Data Set.
2. The IDS Evaluator provides the traffic traces to each tested IDS.
3. The collected IDS alarms are fetched by the IDS Result Analyzer.
4. The alarms are compared with the trace documentation to determine whether the IDS detection succeeded.
5. The evaluation report is generated.
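A hypothetical sketch of this comparison step; the `replay` callables and the TP/TN/FP/FN mapping are assumptions about the framework's semantics, not the authors' code:

```python
from typing import Callable, Dict, List, Tuple


def classify(attack_documented: bool, alarm_raised: bool) -> str:
    """One plausible mapping of (trace documentation, IDS reaction) to TP/TN/FP/FN."""
    if attack_documented:
        return "TP" if alarm_raised else "FN"
    return "FP" if alarm_raised else "TN"


def evaluate(ids_replayers: Dict[str, Callable[[str], bool]],
             dataset: List[Tuple[str, bool]]) -> List[Tuple[str, str, str]]:
    """dataset holds (pcap_path, attack_documented) pairs taken from the Data Set."""
    report = []
    for ids_name, replay in ids_replayers.items():
        for pcap_path, attack_documented in dataset:
            alarm_raised = replay(pcap_path)                     # feed the trace to the tested IDS
            outcome = classify(attack_documented, alarm_raised)  # compare alarm with documentation
            report.append((ids_name, pcap_path, outcome))        # one row of the evaluation report
    return report
```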

Slide 10: Question
This paper evaluated two open-source IDSs with the new strategy. However, many IDSs are protected by patents or copyright, and their creators would never reveal the weak points of their products. Is it ethical, or even legal, to publish evaluations of such IDS products so that others can know the truth?

Slide 11: The End

Slide 12: The 15-class taxonomy (Supplement)

Slide 13: Document traffic traces (Supplement)
Each traffic trace is documented by four characteristics:
1. The target system configuration
2. The VEP configuration
3. Whether or not the VEP exploited the vulnerability of the target system
4. Whether or not the attack was successful
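These four characteristics map naturally onto a small record type; the sketch below is a hypothetical representation (field names are illustrative, not the paper's schema):

```python
from dataclasses import dataclass


@dataclass
class DocumentedTrace:
    """One captured traffic trace, labelled with the four characteristics above."""
    target_configuration: str      # 1. target system configuration
    vep_configuration: str         # 2. VEP configuration
    vulnerability_exploited: bool  # 3. did the VEP exploit the target's vulnerability?
    attack_successful: bool        # 4. was the attack successful?
```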

