Virtualized Execution Realizing Network Infrastructures Enhancing Reliability (VERNIER)
Application Communities PI Meeting
Arlington, VA, July 10, 2007
Outline
– Progress summary
– Architecture update
– Binary quasi-static analysis
– Syzygy: distributed detection of anomalous application behavior
– Looking ahead: test and evaluation
– Preview of DARPATech 2007 demo
Progress Summary
– Detection
  – Syzygy (see the sketch below)
    – Implementation and integration
    – Preliminary tests and evaluation
    – Principal detection capability for DARPATech demo
  – Quasi-static
    – Implementation of offline model construction and runtime monitoring
    – Preliminary tests and evaluation
    – Demonstration of detection capability on a real exploit
– Diagnosis
  – Configuration diagnosis
    – Expanded diagnosis results to sets of features, rather than the single rarest feature
    – Expanded comparative diagnosis to factor in the health status of nodes
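Syzygy's premise, as presented here, is that correlating anomaly evidence across the community catches "epidemic" behavior that no single host's detector would confidently flag. The following is a minimal sketch of that community-correlation idea, assuming a simple per-host anomaly score and a threshold on the community mean; the scoring model, calibration, and all names (`community_anomaly`, `EPIDEMIC_THRESHOLD`) are illustrative assumptions, not the actual Syzygy algorithm.

```python
# Minimal sketch of community-level anomaly correlation
# (illustrative only; not the actual Syzygy implementation or calibration).
from statistics import mean

# Hypothetical per-host anomaly scores for one monitoring window.
# Higher means the host's application behavior looks less like its model.
host_scores = {
    "host-01": 0.4,
    "host-02": 3.1,
    "host-03": 2.8,
    "host-04": 0.2,
}

# A single noisy host should not raise an alarm, but many hosts drifting
# anomalous at once (an "epidemic") should.  One simple rule: alert when the
# community mean score exceeds a threshold calibrated on clean behavior.
EPIDEMIC_THRESHOLD = 2.0  # assumed value; in practice derived from training data


def community_anomaly(scores, threshold):
    """Return True if the community-wide average anomaly score is suspicious."""
    return mean(scores.values()) > threshold


if __name__ == "__main__":
    if community_anomaly(host_scores, EPIDEMIC_THRESHOLD):
        print("community-wide anomaly: possible epidemic, trigger diagnosis")
    else:
        print("scores consistent with normal per-host noise")
```

The appeal of correlating across the community is that each local detector can run with a loose threshold, keeping per-host false positives low, while attacks that spread still produce a clear aggregate signal.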
Progress Summary (2)
– Response
  – Dynamic firewall (see the sketch below)
    – Temporary, targeted restriction on normally permitted traffic
    – Status: mostly implemented (a few integration details remain)
  – Fine-grained uninstaller
    – Implemented and integrated
    – Enhanced with a UI to inform community users of a proposed response and provide an opportunity to block it
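As a rough illustration of a "temporary, targeted restriction on normally permitted traffic," the sketch below installs a block for a single remote address and lifts it after a fixed interval. This is a hypothetical example using Linux iptables from Python; the actual VERNIER dynamic-firewall component, its platform, and its rule format are not described in these slides.

```python
# Hypothetical sketch of a temporary, targeted firewall restriction
# (illustrative only; not the VERNIER dynamic-firewall implementation).
import subprocess
import threading


def block_temporarily(remote_ip, seconds):
    """Drop traffic from remote_ip, then automatically restore it."""
    rule = ["INPUT", "-s", remote_ip, "-j", "DROP"]
    # Insert the rule at the top of the chain (requires root privileges).
    subprocess.run(["iptables", "-I"] + rule, check=True)

    def restore():
        # Delete the same rule, lifting the restriction.
        subprocess.run(["iptables", "-D"] + rule, check=True)

    # Expire the restriction after the requested interval.
    threading.Timer(seconds, restore).start()


if __name__ == "__main__":
    # Example: block a (hypothetical) suspicious peer for five minutes.
    block_temporarily("192.0.2.45", 300)
```

The key property being illustrated is that the restriction is both targeted (one peer, not the whole network) and temporary, so a false alarm degrades normal functionality only briefly.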
Progress Summary (3)
– System development
  – Enhanced testbed automation to facilitate experimentation and evaluation
    – Individual users (researchers and developers) can more easily set up, start up, and shut down separate VERNIER application communities
  – Prototype situation awareness monitor and user interface
  – Generalized APIs to ease future integration
  – Emergence of first-generation integrated VERNIER system
  – Demonstration for DARPATech 2007
Architecture Update
Binary Quasi-static Analysis
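The progress summary describes this component as offline model construction plus runtime monitoring of application behavior. The sketch below illustrates that general offline-model/runtime-check pattern with a toy event n-gram model; the modeling technique, granularity, and every name in it are assumptions for illustration, not the actual binary quasi-static analysis.

```python
# Toy illustration of the offline-model / runtime-monitor pattern
# (the real binary quasi-static analysis is not shown; all details assumed).
from typing import Iterable, List, Set, Tuple

N = 3  # assumed window size for behavior n-grams


def build_model(training_traces: Iterable[List[str]]) -> Set[Tuple[str, ...]]:
    """Offline step: collect the set of observed n-grams of application events."""
    model: Set[Tuple[str, ...]] = set()
    for trace in training_traces:
        for i in range(len(trace) - N + 1):
            model.add(tuple(trace[i:i + N]))
    return model


def monitor(trace: List[str], model: Set[Tuple[str, ...]]) -> List[Tuple[str, ...]]:
    """Runtime step: report event windows never seen during model construction."""
    return [tuple(trace[i:i + N]) for i in range(len(trace) - N + 1)
            if tuple(trace[i:i + N]) not in model]


if __name__ == "__main__":
    model = build_model([["open", "read", "close", "open", "read", "close"]])
    suspicious = monitor(["open", "read", "write", "exec"], model)
    print("anomalous windows:", suspicious)
```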
Syzygy
Test and Evaluation
Measuring Success
– Long-standing issue: how will the extent to which VERNIER succeeds in its goals be measured?
– Metrics
  – We have posed a set of general metrics (detection false positives and false negatives, effectiveness of response, performance overhead, response time) with specific numbers
  – A context and framework for evaluation are needed to make those metrics meaningful (a sketch of the basic rate calculations follows below)
– To establish an evaluation framework, we must define the intended scope of VERNIER application community protection
  – Threats considered, threats not considered
  – Scope of detection
  – Scope of threat mitigation and recovery from impairment
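Once a test run carries ground-truth labels, the detection metrics named above reduce to simple counts. The sketch below shows one plausible way to compute false-positive and false-negative rates plus a response-latency figure from labeled events; the event format and field names are assumptions for illustration, not VERNIER's evaluation harness.

```python
# Sketch of basic detection/response metrics over a labeled test run
# (event format and field names are assumed for illustration).
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Event:
    malicious: bool                          # ground truth from the test scenario
    detected: bool                           # did any detector fire?
    detect_latency: Optional[float] = None   # seconds from injection to alert, if detected


def rates(events: List[Event]) -> dict:
    attacks = [e for e in events if e.malicious]
    benign = [e for e in events if not e.malicious]
    false_neg = sum(1 for e in attacks if not e.detected)
    false_pos = sum(1 for e in benign if e.detected)
    latencies = [e.detect_latency for e in attacks
                 if e.detected and e.detect_latency is not None]
    return {
        "false_negative_rate": false_neg / len(attacks) if attacks else 0.0,
        "false_positive_rate": false_pos / len(benign) if benign else 0.0,
        "mean_detect_latency": sum(latencies) / len(latencies) if latencies else float("nan"),
    }


if __name__ == "__main__":
    run = [Event(True, True, 4.2), Event(True, False),
           Event(False, False), Event(False, True, 1.0)]
    print(rates(run))
```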
VERNIER Scope of Protection
– High-level goal: maintain normal functionality of application communities
  – Maximize availability of correctly functioning application resources for the intended purposes of end users
    – Detect conditions indicative of potential damage
    – Limit the spread of such conditions
    – Remediate damage when it occurs
  – Community resilience in the face of localized failures
– Areas of focus
  – Availability and integrity; confidentiality is secondary
  – End-user applications, not services
    – Much more community knowledge to leverage in the application space
    – Protection of services may be better addressed by preventive strategies
Threat Scope
– VERNIER focus: loss of control
  – Loss of availability of legitimate functionality
  – Loss of integrity that could enable further attacks
– Loss of control may take a variety of forms
  – Execution of malicious or erroneous code
  – Unintended modification of dynamic application state
  – Unintended modification of static application configuration
  – Unintended modification of operating system configuration and state
– Loss of control may be achieved in a variety of ways
  – Remote exploitation of networking vulnerabilities
  – Application vulnerabilities exploited through the distribution of malicious data
  – Deception, including Trojan-horse software and social engineering
  – New bugs or errors introduced by new software or configuration
Threats out of Scope
– Insider threats
  – Not primarily an issue of weakness of COTS monocultures
– Breaches of confidentiality
  – VERNIER’s defenses against loss of integrity do help, but serious breaches can result from even small data extraction from one node
  – Not much opportunity for community leverage in defense, nor much that can be done to recover
– Ubiquitous bugs
  – Where there is some variation, we have an opportunity
  – When identical bugs are present everywhere, we don’t
– External network and server failures
  – VERNIER cannot protect what it does not directly control
– Legitimate load failures
  – We do not protect against failures that result from an excess of legitimate load, such as a user running too many simultaneous applications
Scope of Detection

Detection Category               Detectors                                    Type           Automated?
Incorrect application behavior   Syzygy (correlated application anomalies)    Anomaly
Incorrect application behavior   Quasi-static                                 Anomaly
Malicious process                Host-based bot detection (BotSwat)           Signature
Malicious process                Hidden process (rootkit)                     Signature
Network traffic                  Network-based bot detection (BotHunter)      Signature
User impairment                  User report                                  Anomaly
Configuration change             Community prevalence                         Weak anomaly
Configuration change             Change detection                             Weak anomaly
Scope of Response

Category                Response                   Type                    Granularity
Network configuration   Dynamic firewall           Mitigation              Fine
Network configuration   Node quarantine            Mitigation              Coarse
System configuration    Software blacklist         Mitigation              Fine
System configuration    Fine-grained uninstaller   Recovery                Fine
System state            Process termination        Mitigation & Recovery   Fine
System state            Pre-impairment rollback    Recovery                Coarse
Approaching Evaluation
– A possible approach: a set of red-team “games” in two categories
  – Component-oriented tests
    – Test the effectiveness of specific components
    – Example: measure FP/FN rates for specific detectors
  – End-to-end scenarios
    – Evaluate the system as a whole against go/no-go metrics in a set of end-to-end attack scenarios (a strawman scorecard follows below)
– To be defined:
  – “Scoring” system and baselines
  – Ground rules
    – What the red team can and cannot do
    – What the blue team can and cannot do
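For the end-to-end category, one plausible form of the to-be-defined scoring is a set of scenarios each judged against go/no-go criteria. The sketch below is only a strawman of such a scorecard; the scenario names, criteria, and thresholds are invented for illustration and are not the metrics the project has posed.

```python
# Strawman end-to-end scenario scorecard (scenarios and thresholds invented).
from dataclasses import dataclass
from typing import List


@dataclass
class ScenarioResult:
    name: str
    detected: bool           # was the attack detected at all?
    response_time_s: float   # time from attack start to community response
    users_impaired: int      # community members left impaired after response


def go_no_go(r: ScenarioResult, max_response_s: float = 60.0, max_impaired: int = 0) -> bool:
    """A scenario passes only if detection occurred and both thresholds were met."""
    return r.detected and r.response_time_s <= max_response_s and r.users_impaired <= max_impaired


if __name__ == "__main__":
    results: List[ScenarioResult] = [
        ScenarioResult("malicious-download", True, 42.0, 0),
        ScenarioResult("config-corruption", True, 95.0, 2),
    ]
    for r in results:
        print(r.name, "GO" if go_no_go(r) else "NO-GO")
```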
Preview of DARPATech Demo