Slide 1: Grant Pannell
Slide 2: Intrusion Detection Systems
- Attempt to detect unauthorized activity (CIA: Confidentiality, Integrity, Availability)
- Commonly network-based; becoming obsolete as network traffic is increasingly encrypted
- Moving to host-based approaches:
  - Honeypots (emulated services)
  - An application's execution flow
  - The behavior of the user
Slide 3: Detection Methods
- Misuse detection
  - Rule-based; the user states: "I use Notepad, not WordPad"
  - Low false-positives, high detection rate
  - Cannot predict and learn how a user behaves
- Anomaly detection
  - Gather audit data (the user's actions) over time and analyze it with statistical methods
  - Creates a profile: the user uses Notepad, and the system learns this
  - Higher false-positives, lower detection rate
- A combination of both is best
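The anomaly-detection idea above can be sketched in a few lines: learn a profile from observed behavior, then flag anything outside it. This is an illustrative Python toy, not the deck's C# implementation; the class name, application names, and threshold are assumptions.

```python
from collections import Counter

class AnomalyProfile:
    """Toy anomaly detector: learns how often each application is seen,
    then flags applications never (or too rarely) observed during learning.
    Illustrative sketch only; min_seen is an assumed threshold."""

    def __init__(self, min_seen=1):
        self.counts = Counter()
        self.min_seen = min_seen

    def learn(self, app):
        # Learning mode: record the user's normal behavior
        self.counts[app] += 1

    def is_anomalous(self, app):
        # Detection mode: anything outside the learned profile is suspicious
        return self.counts[app] < self.min_seen

profile = AnomalyProfile()
for app in ["notepad.exe", "notepad.exe", "explorer.exe"]:
    profile.learn(app)

print(profile.is_anomalous("notepad.exe"))  # False: part of the learned profile
print(profile.is_anomalous("wordpad.exe"))  # True: never observed
```

The trade-off on the slide shows up directly: a profile learned over too little data flags legitimate behavior (false positives), which is why the deck combines statistical learning with rules.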
Slide 4: Profiling a User
- Must determine "normal" behavior for anomaly detection
- User profile characteristics:
  - Applications running
  - Number of windows, number of processes
  - Performance of running applications (CPU usage)
  - Keystrokes (delays, speed)
  - Websites visited
Slide 5: Motivation
- Determine unauthorized use
- Adoption of network traffic encryption limits network-based detection
- Multiple characteristics: previous studies focus on a single characteristic for profiling
- Microsoft Windows graphical user interface: previous studies focus on command usage
Slide 6: So, what is it exactly?
- A behavioral host-based intrusion detection system...
- ...that profiles a user using multiple characteristics...
- ...to detect unauthorized use of a machine...
- ...and that runs on Microsoft Windows, to take advantage of GUI characteristics
Slide 7: Research Questions
- Is it possible? Feasible? Usable in the real world?
- Is it possible in a graphical user interface environment?
- Does a combination of characteristics improve performance?
- Does it tax system resources?
- Detection performance:
  - Low false-positives (authorized users wrongly disallowed)
  - High detection rate (intruders disallowed)
  - Can it detect in a practical amount of time?
Slide 8: Literature Review
- Not much research in the public domain
- Behavioural intrusion models:
  - Date back to Anderson (1980): manually collect audit trails from machines; track file and resource access
  - Furthered by Denning (1987): a detailed model of Anderson's work
  - Tan (1995), Gunetti et al. (1999), Balajinath et al. (2001), Pillai (2004):
    - All based on UNIX
    - Characterized by command usage or performance (CPU, memory, I/O, etc.)
    - Differ in the learning algorithm used
Slide 9: Methodology
- Developed system:
  - Developed in Microsoft .NET C#
  - Allows each characteristic to be "snapped in"
  - Extensive logging output for analysis and testing
- 7 systems tested:
  - 2 "power users" (Win7 x64, XP x64)
  - 2 office-based (2x XP x86)
  - 1 gaming (Vista x64)
  - 2 web browsing (Vista x86, XP x86)
Slide 10: Methodology
- Learning mode for ~10 days
  - System worked for 28,880 collections, then disabled itself
  - "Perfect" learning: during learning, all flagged anomalies are false positives
  - False-positives decrease over time (the system is learning)
- Detection mode after 10 days
  - Used only to attempt to break the profile, and to determine how long that takes
  - Stress-tests each characteristic
Slide 11: Prototype Architecture (diagram)
Slide 12: Algorithms
- CPU & memory usage, 3 techniques:
  - Standard deviation (0.5 pts) over the last 120 values
  - Rolling average (1 pt) over all values
  - Sliding limit (2 pts) over all values
- Websites viewed:
  - Can only check whether the user visits new sites, not whether they revisit known ones
  - Rolling average of new sites per hour, checked every 30 seconds
  - Works for learning two cases: many new sites per hour, or no new sites per hour
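The three CPU/memory scoring techniques can be sketched as follows. The point weights (0.5 / 1 / 2) come from the slide; the sigma multiplier and margin parameters are assumptions, and this Python sketch only approximates what the original C# prototype did.

```python
from collections import deque
from statistics import mean, stdev

class UsageProfiler:
    """Sketch of the three techniques for CPU/memory usage:
    std-dev over the last 120 values, overall rolling average, sliding limits.
    Thresholds (k, avg_margin, limit_margin) are illustrative assumptions."""

    def __init__(self, window=120, k=3.0, avg_margin=0.5, limit_margin=0.1):
        self.recent = deque(maxlen=window)   # last 120 samples (std-dev check)
        self.total = 0.0                     # running sum for overall rolling average
        self.n = 0
        self.low = float("inf")              # sliding limits learned over all samples
        self.high = float("-inf")
        self.k = k
        self.avg_margin = avg_margin         # allowed fraction around rolling average
        self.limit_margin = limit_margin     # tolerance outside learned limits

    def learn(self, value):
        self.recent.append(value)
        self.total += value
        self.n += 1
        self.low = min(self.low, value)
        self.high = max(self.high, value)

    def score(self, value):
        pts = 0.0
        # 1) Standard deviation over the last 120 values (0.5 pts)
        if len(self.recent) >= 2:
            mu, sigma = mean(self.recent), stdev(self.recent)
            if sigma > 0 and abs(value - mu) > self.k * sigma:
                pts += 0.5
        # 2) Overall rolling average (1 pt)
        if self.n:
            avg = self.total / self.n
            if avg > 0 and abs(value - avg) > self.avg_margin * avg:
                pts += 1.0
        # 3) Sliding limits learned over all samples (2 pts)
        if self.n:
            span = (self.high - self.low) or 1.0
            if (value < self.low - self.limit_margin * span
                    or value > self.high + self.limit_margin * span):
                pts += 2.0
        return pts

p = UsageProfiler()
for v in [10, 12, 14, 11, 13, 12, 10, 15]:   # learned CPU% samples
    p.learn(v)
print(p.score(12))   # 0.0: a normal sample triggers nothing
print(p.score(60))   # 3.5: a spike fires all three techniques
```

The increasing weights reflect the slide's ordering: the sliding limit is the strongest signal (usage has left the entire learned range), so it earns the most points.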
Slide 13: Algorithms
- Number of windows:
  - Wanted to check window titles and positions
  - Titles are never static (e.g. " - MS Word" suffixes); positions seem random for most windows
  - Uses a rolling average like Websites Viewed; not always accurate
- Number of processes:
  - Sliding limits
  - A fully learned profile should include all processes; therefore deny all new ones?
Slide 14: Algorithms
- Keystroke usage:
  - Uses digraphs, e.g. "digraphs" typed as d->i, i->g, g->r, r->a, a->p, p->h, h->s
  - Measures the delay between digraphs
  - Standard deviations over the last 100 collected values
- Overall scoring system:
  - Directly related to user activity (2 pts): keystrokes, number of windows, websites viewed
  - Indirectly related, i.e. application profiling (1 pt): CPU usage, memory usage, number of processes
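The digraph-delay idea can be sketched like this: keep the last 100 delays for each key pair and flag delays far from that pair's mean. This is an illustrative Python sketch, not the original C# code; the 3-sigma threshold is an assumption.

```python
from collections import defaultdict, deque
from statistics import mean, stdev

class DigraphProfiler:
    """Sketch of digraph-based keystroke profiling: per key pair, retain the
    last 100 inter-key delays and flag outliers by standard deviation.
    The k=3 sigma threshold is an assumed value."""

    def __init__(self, history=100, k=3.0):
        self.delays = defaultdict(lambda: deque(maxlen=history))
        self.k = k

    def learn(self, prev_key, key, delay_ms):
        # Record the delay between two consecutive keystrokes (a digraph)
        self.delays[(prev_key, key)].append(delay_ms)

    def is_anomalous(self, prev_key, key, delay_ms):
        hist = self.delays[(prev_key, key)]
        if len(hist) < 2:
            return False  # not enough data to judge this digraph yet
        mu, sigma = mean(hist), stdev(hist)
        return sigma > 0 and abs(delay_ms - mu) > self.k * sigma

prof = DigraphProfiler()
for d in [100, 105, 95, 110, 98, 102, 97, 103, 101, 99]:
    prof.learn("t", "h", d)          # the legitimate user types "th" in ~100 ms
print(prof.is_anomalous("t", "h", 101))  # False: consistent with the profile
print(prof.is_anomalous("t", "h", 400))  # True: a much slower typist
```

Because timing is learned per digraph rather than per key, the profile captures rhythm, which is what makes one user's typing hard for another to imitate.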
Slide 15: [Chart] False-positives vs. number of collections (time): CPU usage
Slide 16: [Chart] False-positives per machine: memory usage
Slide 17: [Chart] False-positives per machine: number of windows
Slide 18: [Chart] False-positives vs. number of collections (time): websites viewed
Slide 19: [Chart] False-positives vs. number of collections (time): keystroke usage
Slide 20: [Chart] False-positives vs. number of collections (time): overall scoring
Slide 21: [Chart] False positive rate per characteristic
Slide 22: Results - Intrusions
- Test intrusions in detection mode, trying to trigger each characteristic:
  - Keystrokes: another user's typing patterns
  - Using only the mouse to open many new processes and windows
  - Using already-running processes while attempting abnormal behavior
  - A completely new user on the same profile
- Scoring system:
  - 5-point maximum: 2 points for directly related characteristics, 1 point for indirectly related
  - Minimum of 3 accumulations (3 x 30 seconds) to trigger
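The scoring and accumulation rule can be sketched as follows. The 2/1 point weights, 5-point cap, and 3 x 30-second minimum come from the slides; the per-collection alert threshold is an assumption.

```python
# Sketch of the overall scoring: directly related characteristics score
# 2 points, indirectly related 1 point, capped at 5 per collection, and an
# alert fires only after 3 consecutive suspicious 30-second collections.
# THRESHOLD is an assumed value (the deck does not state it explicitly).

DIRECT = {"keystrokes", "windows", "websites"}    # 2 points each
INDIRECT = {"cpu", "memory", "processes"}         # 1 point each
THRESHOLD = 3      # points per collection needed to count as suspicious
MIN_STREAK = 3     # 3 consecutive collections (3 * 30 s) before triggering

def collection_score(anomalous):
    """Points for one 30-second collection, capped at the 5-point maximum."""
    pts = sum(2 for c in anomalous if c in DIRECT)
    pts += sum(1 for c in anomalous if c in INDIRECT)
    return min(pts, 5)

def detect(collections):
    """Return True once MIN_STREAK consecutive collections reach THRESHOLD."""
    streak = 0
    for anomalous in collections:
        streak = streak + 1 if collection_score(anomalous) >= THRESHOLD else 0
        if streak >= MIN_STREAK:
            return True
    return False

benign = [set(), {"cpu"}, set(), {"memory"}]
intrusion = [{"keystrokes", "cpu"}, {"keystrokes", "windows"},
             {"keystrokes", "processes"}, {"windows", "websites"}]
print(detect(benign))     # False: isolated 1-point anomalies never accumulate
print(detect(intrusion))  # True: direct characteristics fire 3 collections in a row
```

Requiring a streak rather than a single spike is what keeps false positives down while still giving the 90-180 second detection times reported in the conclusion.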
Slide 23: [Chart] Average time to detect intrusions per intrusion test
Slide 24: Further Research
- Time-block testing
- Categorization
- Mouse clicks
- More complex learning algorithms
- The case where an intruder has physical access to the machine
- System performance
Slide 25: Conclusion
- Is it possible? Feasible? Real-world? Works better with directly related characteristics
- Possible in a graphical user interface environment? GUI objects turned out to be less useful than first proposed
- Does a combination of characteristics improve performance? The scoring system lowers false-positives
- Does it tax system resources? The large history and real-time typing analysis could be more efficient
- Detection performance? Highest false-positive rate was 4.5%, with a malfunctioning characteristic
- Detect in a practical amount of time? 90-180 second detection times
Slide 26: Questions?