1
AWDRAT: Architectural-Differencing, Wrappers, Diagnosis, Recovery, Adaptive Software and Trust Management
Howie Shrobe: MIT CSAIL
Bob Balzer: Teknowledge
2
AWDRAT: What are we trying to do?
[Architecture diagram: the Application Software is instrumented by a Wrapper Synthesizer; Wrappers feed an Architectural Differencing component that compares execution against System Models; Diagnosis updates a Trust Model (behavior, compromises, attacks), drawing on Attack Plan Recognition, attack plans, and other sensors such as intrusion detectors; Recovery and Regeneration and Adaptive Software act on the results.]
Goal: applications that continue to render useful services even after a successful attack, particularly for legacy systems.
3
Overview Questions
How is it done now?
– Hand-inserted tests, assertions, and error handlers. Rarely done systematically.
– For legacy systems, you're often SOL.
How do we make a difference?
– By systematizing checking, diagnosis, and recovery, and providing the core of each service.
Risks and Mitigations
– Too big or too small a system: start smallish (the MAF editor) and grow as capability allows.
– Build individual facilities that have independent value.
4
AWDRAT: How do we show success?
[Architecture diagram repeated.]
– Detect incorrect application behavior.
– Correctly diagnose the cause.
– Choose an appropriate alternative method to realize the goal.
– Red Team experiments and in-lab experiments.
5
AWDRAT: Technical Approach
[Architecture diagram repeated: wrapped Application Software, Architectural Differencing against System Models, Diagnosis feeding a Trust Model (behavior, compromises, attacks), Attack Plan Recognition and other sensors (intrusion detectors), Recovery and Regeneration, Adaptive Software.]
6
The “On One Foot” Story
We use adaptive software that selects one of several possible methods based on expected net benefit.
The code is annotated by wrapper generators.
We run the code in parallel with a model; wrappers send events to the Architectural Differencer.
Deviations between model predictions and observations from the wrappers are symptoms.
Diagnosis infers possible compromises of the underlying resources and updates a Trust Model.
Recovery is effected by restoring corrupted data resources and picking a new method in light of the updated Trust Model.
7
Distinguishing AWDRAT & PMOP
AWDRAT
– Detecting misbehaving software: hijacks, overprivileged scripts, trap doors, faults.
PMOP
– Detecting misbehaving (human) operators: malicious intent, operator error.
An integrated SRS system needs both capabilities.
– We have had extensive discussions on integrating the two projects: a head start on the workshop :-)
8
JBI DemVal Dataflow (via Publish/Subscribe)
[Dataflow diagram: components include MAF, CAF, Proposed MI, Approved MI, Targeting, TNL, JEES, EDC, JW, CHW (Chem Hazard), SPI, TAP, CHI, Combat Ops, AODB, AS, LOC, Weather Hazard, WH, WLC, ATO, CHA, and External.]
9
What We’ve Got
End-to-End Demonstration (demo shortly)
– Working prototypes of AWDRAT components.
– Working models & rules of the target application.
– Working integration of AWDRAT components (a day late and a JVM incompatibility behind).
Architecture Visualizer (demo shortly)
– Event-sequence diagrams.
– Architecture dataflow.
The Good – The Bad – The Ugly
10
What We’re Missing
Realistic rules (domain-knowledgeable)
– Would be created by SMEs in a real deployment.
Comprehensive rule set
– Would be created by SMEs in a real deployment.
The Good – The Bad – The Ugly
11
Accommodations
Java code base
– Created wrapper infrastructure for Java.
Limited library of alternative Java methods
– Utilized alternative Windows libraries.
Available JBI components to wrap
– Detailed on the next slide.
The Good – The Bad – The Ugly
12
JBI DemVal Dataflow (via Publish/Subscribe)
[Same dataflow diagram as before, annotated with a legend: canned components (publish fixed output), legacy components (code not available), and table lookup.]
13
AWDRAT: Adaptive Software
[Architecture diagram repeated as a section divider, with Adaptive Software highlighted.]
14
A system adapts by having many methods for each service
[Diagram: a user requests a service with certain abstract service-quality parameters. Each service can be provided by several methods (Method 1 through Method n); each method binds the settings of the control parameters in a different way and requires different resources (Resource 1,1 through Resource 1,j). The user's utility function assigns a value to the parameter binding; the resource cost function assigns a cost to the resources used by the method. The system selects the method that maximizes net benefit.]
15
Methods are Described at the “Plan Level”
Executable code
Selection meta data
– Resource requirements
– Constraints on the resources
– Qualities of service delivered given a set of resources
Architectural model:
– Decomposition into sub-modules
– Data and control flow
– Pre, post, and maintain conditions for each component
– Causal links between these conditions
– Primitive, directly executable sub-modules
– Expected and prohibited events
– Timing constraints
16
Selection Meta Data Language

(define-service (image-load [image-loaded ?image ?path])
  (speed fast slow)
  (image-quality high low)
  (safety checked unchecked))

(define-method native-image-load
  :service image-load
  :features ((speed fast) (image-quality ?quality-of-image-type) (safety unchecked))
  :other-parameters (?path)
  :resources (?image-file)
  :resource-constraints ([image-file-exists ?path ?image-type ?image-file]
                         [image-type-consistent-with-method ?image-type native-image-load]
                         [image-quality ?image-type ?quality-of-image-type]))

(define-method pure-java-image-load
  :service image-load
  :features ((speed slow) (image-quality ?quality-of-image-type) (safety checked))
  :other-parameters (?path)
  :resources (?image-file)
  :resource-constraints ([image-file-exists ?path ?image-type ?image-file]
                         [image-type-consistent-with-method ?image-type pure-java-image-load]
                         [image-quality ?image-type ?quality-of-image-type]))
17
Utility Language

(defun decide-how-to-load-images (max-value path)
  (let ((utility-function
          (utility-function-for-service 'image-load
            '((speed fast (>> 1.1) speed slow)
              (image-quality high (>> 1.5) image-quality low)
              (safety checked (>> 2) safety unchecked)
              (speed fast (>> 1.1) safety checked)
              (image-quality high (>> 1.2) safety checked)
              (speed fast (>> 1.5) image-quality high))
            max-value)))
    (find-em 'image-load utility-function nil (list path))))
18
Preferences and Utility Functions
Utility functions are used to assign a numerical value to a particular way of doing a task.
– Utility functions are not a natural way for people to express themselves.
What people can state easily is their preferences.
– E.g., security and speed are twice as good as high image quality.
A typical set of preference statements:
– I prefer convenience of use to high security if I'm not under attack.
– I prefer high security to convenience if I'm under attack.
Preferences are compiled into utility functions.
19
How to Compile a Utility Function
Convert preferences into bit-vectors of variables.
– Each multi-valued attribute is assigned a sub-vector to cover its range of values.
The bit-vectors form the nodes of a graph.
Preferences are compiled into weighted arcs.
Leaf nodes have value 1.
Other nodes have the least value consistent with the arc weights.
– Computed using dynamic programming (a sketch follows below).
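A minimal Java sketch of this compilation step, under assumptions the slide leaves open: each node stands for an attribute-value setting, a preference "A is w times better than B" becomes an arc requiring value(A) >= w * value(B), leaves get value 1, and values are propagated by relaxation until a fixpoint (the dynamic-programming step). All class and method names here are hypothetical, not AWDRAT's actual API.

import java.util.*;

// Hypothetical sketch: assign each preference node the least value
// consistent with the weighted preference arcs.
class UtilityCompiler {
    // arcs.get(better) holds (worse, weight) pairs meaning value(better) >= weight * value(worse).
    private final Map<String, List<Map.Entry<String, Double>>> arcs = new HashMap<>();

    void prefer(String better, String worse, double weight) {
        arcs.computeIfAbsent(better, k -> new ArrayList<>())
            .add(Map.entry(worse, weight));
    }

    // Nodes with no outgoing arcs are "leaves" and stay at 1; every other node
    // gets the smallest value satisfying all of its arcs. Assumes every node
    // referenced by an arc is in the nodes collection and there are no cycles.
    Map<String, Double> compile(Collection<String> nodes) {
        Map<String, Double> value = new HashMap<>();
        for (String n : nodes) value.put(n, 1.0);
        boolean changed = true;
        while (changed) {                       // simple relaxation to a fixpoint
            changed = false;
            for (String n : nodes) {
                double v = 1.0;
                for (Map.Entry<String, Double> arc : arcs.getOrDefault(n, List.of()))
                    v = Math.max(v, arc.getValue() * value.get(arc.getKey()));
                if (v > value.get(n)) { value.put(n, v); changed = true; }
            }
        }
        return value;
    }
}

For example, prefer("safety=checked", "safety=unchecked", 2.0) would mirror the (safety checked (>> 2) safety unchecked) preference on the earlier slide.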
20
Method Selection
Given a service name and a utility function, method selection uses a Prolog-like query language to:
– Find relevant methods
– Find resources meeting each method's constraints
– Bind the service qualities
For each successful query it:
– Calculates the resource cost
– Calculates the utility of the service parameters
– Calculates the net benefit
It then selects the method that maximizes net benefit (see the selection sketch below).
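A Java sketch of the selection loop, assuming the query step has already produced candidate (method, resources, service-quality) bindings; the types and field names are hypothetical stand-ins, not AWDRAT's actual interfaces.

import java.util.*;
import java.util.function.ToDoubleFunction;

// Hypothetical candidate produced by the Prolog-like query:
// a method, a resource binding, and the service qualities it would deliver.
record Candidate(String method, List<String> resources, Map<String, String> qualities) {}

class MethodSelector {
    // Pick the candidate with the largest net benefit = utility(qualities) - cost(resources).
    static Optional<Candidate> select(List<Candidate> candidates,
                                      ToDoubleFunction<Map<String, String>> utility,
                                      ToDoubleFunction<List<String>> resourceCost) {
        return candidates.stream()
            .max(Comparator.comparingDouble(
                c -> utility.applyAsDouble(c.qualities()) - resourceCost.applyAsDouble(c.resources())));
    }
}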
21
Decision Making with Compromises
M is a method; the vector R is an assignment of specific resources to M.
Each resource R_i in R can be in one of several specific modes.
A “resource state” RS is an assignment of a specific mode R_i,j to each resource R_i in R.
The Trust Model (via diagnosis) assigns a probability to each resource state.
Given a method M and a resource state RS, we can calculate the vector of service qualities SQ(M, RS) that will be delivered.
The utility function U assigns a numeric value to a vector of service qualities SQ, consistent with the requestor's preferences.
22
Expected Benefit of a Resourced Method
EB(M, R) = Σ_k P(RS_k) * U(SQ(M, RS_k))
where P(RS_k) is the joint probability that each resource R_i is in the mode indicated by RS_k, as assigned by the Trust Model.
23
Successful Execution
Each method M has a set of preconditions Pre_i(M) that are necessary for M to execute successfully.
Some preconditions may not be directly observable, particularly at diagnosis and recovery time. Instead, diagnosis assigns a probability to each of these and to their conjunction, P(∧_i Pre_i(M)).
The expected benefit EBsuccess of successful execution is the expected benefit conditioned by this probability:
EBsuccess(M, R) = P(∧_i Pre_i(M)) * EB(M, R)
24
Failing Execution
If the preconditions of M don't hold, then the method will fail. The failure can be assigned a cost FailCost(M).
– This is ideally calculated by using a simulation model of the organization (necessary for insider threat).
– But it can be provided by table lookup.
The expected cost of failure is this cost weighted by the probability that the method will fail due to the preconditions not being satisfied:
ECfail(M) = (1 - P(∧_i Pre_i(M))) * FailCost(M)
25
Expected Net Cost Benefit
The total expected benefit is the difference between the expected benefit of success and the expected cost of failure:
TotalEB(M, R_i) = EBsuccess(M, R_i) - ECfail(M)
Each vector of resources R_i has a cost RC(R_i).
The expected cost benefit is the difference between the total expected benefit and the cost of the resources:
ECB(M, R_i) = TotalEB(M, R_i) - RC(R_i)
26
Optimal Resource Assignment
The system should select the method and set of resources that maximize the expected cost benefit:
(M*, R*) = argmax over (M, R_i) of ECB(M, R_i)
A sketch of the full calculation follows below.
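A Java sketch of the expected-cost-benefit calculation from the last few slides, with hypothetical types; the trust model is assumed to supply a probability for each resource state and for the method's preconditions holding.

import java.util.*;
import java.util.function.ToDoubleFunction;

// Hypothetical sketch of ECB(M, R) = P(pre) * EB - (1 - P(pre)) * FailCost - ResourceCost,
// where EB = sum over resource states RS_k of P(RS_k) * U(SQ(M, RS_k)).
class ExpectedBenefit {
    record ResourceState(Map<String, String> modes) {}      // mode assigned to each resource

    static double expectedCostBenefit(
            List<ResourceState> states,                      // enumeration of possible resource states
            ToDoubleFunction<ResourceState> stateProbability,    // P(RS_k) from the trust model
            ToDoubleFunction<ResourceState> utilityOfQualities,  // U(SQ(M, RS_k))
            double preconditionProbability,                  // P(all preconditions of M hold)
            double failCost,                                 // FailCost(M)
            double resourceCost) {                           // RC(R)
        double eb = 0.0;
        for (ResourceState rs : states)
            eb += stateProbability.applyAsDouble(rs) * utilityOfQualities.applyAsDouble(rs);
        double ebSuccess = preconditionProbability * eb;
        double ecFail = (1.0 - preconditionProbability) * failCost;
        return ebSuccess - ecFail - resourceCost;
    }
}

The system would then take the argmax of this quantity over (method, resource-vector) pairs, as the slide above states.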
27
Decision Making: Good Case
Method: NATIVE-IMAGE-LOAD  Features: SPEED: FAST, IMAGE-QUALITY: HIGH, SAFETY: UNCHECKED
  Resources: /foo/bar/baz.gif  Resource Cost: 0  Failure Cost: 0  Utility: 5.0  Tradeoff: 5.0
Method: NATIVE-IMAGE-LOAD  Features: SPEED: FAST, IMAGE-QUALITY: HIGH, SAFETY: UNCHECKED
  Resources: /foo/bar/baz.jpg  Resource Cost: 0  Failure Cost: 0  Utility: 5.0  Tradeoff: 5.0
Method: PURE-JAVA-IMAGE-LOAD  Features: SPEED: SLOW, IMAGE-QUALITY: HIGH, SAFETY: CHECKED
  Resources: /foo/bar/baz.gif  Resource Cost: 0  Failure Cost: 0.0  Utility: 4.444444  Tradeoff: 4.444444
Method: PURE-JAVA-IMAGE-LOAD  Features: SPEED: SLOW, IMAGE-QUALITY: HIGH, SAFETY: CHECKED
  Resources: /foo/bar/baz.jpg  Resource Cost: 0  Failure Cost: 0.0  Utility: 4.444444  Tradeoff: 4.444444
Best method: NATIVE-IMAGE-LOAD, Features: SPEED: FAST, IMAGE-QUALITY: HIGH, SAFETY: UNCHECKED
  Value: 5.0  Resources: /foo/bar/baz.gif
NATIVE-IMAGE-LOAD ((SPEED FAST) (IMAGE-QUALITY HIGH) (SAFETY UNCHECKED)) ("/foo/bar/baz.gif")
28
Decision Making: Bad Case
Method: NATIVE-IMAGE-LOAD  Features: SPEED: FAST, IMAGE-QUALITY: HIGH, SAFETY: UNCHECKED
  Resources: /foo/bar/baz.gif  Resource Cost: 0  Failure Cost: 9.0  Utility: 0.5  Tradeoff: -8.5
Method: NATIVE-IMAGE-LOAD  Features: SPEED: FAST, IMAGE-QUALITY: HIGH, SAFETY: UNCHECKED
  Resources: /foo/bar/baz.jpg  Resource Cost: 0  Failure Cost: 9.9  Utility: 0.050000004  Tradeoff: -9.849999
Method: PURE-JAVA-IMAGE-LOAD  Features: SPEED: SLOW, IMAGE-QUALITY: HIGH, SAFETY: CHECKED
  Resources: /foo/bar/baz.gif  Resource Cost: 0  Failure Cost: 0.4995  Utility: 0.0044444446  Tradeoff: -0.49505556
Method: PURE-JAVA-IMAGE-LOAD  Features: SPEED: SLOW, IMAGE-QUALITY: HIGH, SAFETY: CHECKED
  Resources: /foo/bar/baz.jpg  Resource Cost: 0  Failure Cost: 0.49995  Utility: 4.4444445e-4  Tradeoff: -0.49950555
Best method: PURE-JAVA-IMAGE-LOAD, Features: SPEED: SLOW, IMAGE-QUALITY: HIGH, SAFETY: CHECKED
  Value: -0.49505556  Resources: /foo/bar/baz.gif
PURE-JAVA-IMAGE-LOAD ((SPEED SLOW) (IMAGE-QUALITY HIGH) (SAFETY CHECKED)) ("/foo/bar/baz.gif")
29
AWDRAT: Wrappers
[Architecture diagram repeated as a section divider, with the Wrappers highlighted; Adaptive Software is now labeled Adaptive (Decision Theoretic) Method Selection.]
30
Wrappers Make the System Transparent
Methods are executed as raw code (particularly when it's a legacy system). How do we know what's going on?
Wrappers are inserted in good places.
– The architectural model tells us where those are.
Wrappers intercept events.
Wrappers squirrel away important information safely.
31
Wrapper Synthesis
[Diagram: each real component is wrapped so that its inputs and outputs (in/in', out/out') are translated into a simulated component running alongside it, with a differencer comparing the two. The wrapper synthesizer automatically generates the probes, plumbing, monitoring, and backup data (e.g. a backup of data resource D1).]
32
Data Provisioning
[Same wrapper diagram: the wrappers also provide backup copies of data resources (e.g. a backup of D1).]
33
JavaWrap
A facility to insert wrappers around Java code without changing the source code.
Depends on JVMTI to rewrite byte code at class-loading time.
Three types of wrappers:
– Tracers: like the Lisp trace facility, print customizable entry and exit information.
– Monitors: get control both before and after the real method.
– Transformers: get control before and after, control whether the real method is invoked and with what arguments, and control what value is returned. Transformers are used to implement dynamic dispatch.
Wrappers are specified at start-up with an XML spec:
<METHOD signature="(Ljava/lang/String;Z)V"
        monitor="tek.mafMed.Mediators.ConstructMission" />
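JavaWrap itself rewrites bytecode through JVMTI at class-load time; as a rough illustration of what a Monitor-style wrapper does, here is a plain java.lang.reflect.Proxy sketch that gets control before and after the real method and reports events to a hypothetical sink. It is an analogue for exposition only, not JavaWrap's implementation.

import java.lang.reflect.*;

// Illustrative monitor-style wrapper using a dynamic proxy (works on interfaces only).
// EventSink is a hypothetical stand-in for the architectural-differencing coordinator.
interface EventSink { void event(String kind, String method, Object payload); }

class Monitors {
    @SuppressWarnings("unchecked")
    static <T> T monitor(T target, Class<T> iface, EventSink sink) {
        return (T) Proxy.newProxyInstance(
            iface.getClassLoader(), new Class<?>[]{iface},
            (proxy, method, args) -> {
                sink.event("entry", method.getName(), args);   // before the real method
                Object result = method.invoke(target, args);   // run the real method
                sink.event("exit", method.getName(), result);  // after the real method
                return result;
            });
    }
}

A Transformer-style wrapper would differ by deciding inside the handler whether to invoke the real method at all, with what arguments, and what to return.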
34
AWDRAT Execution Architecture
[Diagram: JBI clients connect to a JBI server through a mediation cocoon of wrappers (M); the Architectural Differencer consults the System Models. Two configurations are shown: a scripted AWDRAT driven from history scripts (nominal and erroneous, e.g. premature publication) using a script driver, visualizer scripts, history capture, client reconstitution, and the Architecture Visualizer; and a mixed-initiative AWDRAT with reconstitution, in which one client is live and the others are scripted.]
35
DataFlow Demo
36
Event Diagram Demo
37
AWDRAT: Architectural Differencing
[Architecture diagram repeated as a section divider, with the Architectural Differencer highlighted.]
38
Architectural Differencing
The architectural model is part of a method's description.
The architectural model is interpreted in parallel with method execution.
Wrappers send events to the architectural-differencing coordinator.
The coordinator checks that the state of the system at event time is consistent with the model's predictions.
In particular, it checks the prerequisite and post-conditions of each sub-module (see the sketch below).
The failure of a condition check initiates diagnostic reasoning.
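A minimal Java sketch of the coordinator's check, with hypothetical types: each wrapper event is matched against the model's predicted pre- and post-conditions for that sub-module, and a failed check becomes a symptom handed to diagnosis.

import java.util.*;
import java.util.function.Predicate;

// Hypothetical sketch: the coordinator interprets the architectural model in parallel
// with execution and compares predicted conditions with observed state.
class DifferencingCoordinator {
    record Event(String subModule, String phase, Map<String, Object> state) {}  // phase: "entry" or "exit"

    // Conditions the model predicts for each (sub-module, phase); hypothetical representation.
    private final Map<String, List<Predicate<Map<String, Object>>>> predictedConditions;
    private final List<Event> symptoms = new ArrayList<>();

    DifferencingCoordinator(Map<String, List<Predicate<Map<String, Object>>>> predictedConditions) {
        this.predictedConditions = predictedConditions;
    }

    void onEvent(Event e) {
        String key = e.subModule() + "/" + e.phase();
        for (Predicate<Map<String, Object>> condition :
                 predictedConditions.getOrDefault(key, List.of())) {
            if (!condition.test(e.state())) {
                symptoms.add(e);        // deviation between model prediction and observation
                startDiagnosis(e);      // a failed check initiates diagnostic reasoning
            }
        }
    }

    private void startDiagnosis(Event e) { /* hand the symptom to the diagnostic executive */ }
}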
39
Architectural Differencing
[Diagram: the real component runs in the real environment (the implementation) while the simulated component runs in the simulated environment (the model); inputs (in/in') are translated between the two, and a differencer uses reflection to compare the real and simulated outputs (out/out'), producing a list of conflicts.]
40
MAF-CAF Architectural Differencing
MAF is a flight-plan graphic editor.
– A typical GUI program.
GUI actions invoke actions on core data structures:
– MissionObject, Events, Legs, Sorties, Movements.
The differencer checks the validity of basic operations on these data structures:
– Consistency of the structures.
– Add operators don't delete and actually insert the intended stuff.
– It maintains a running simulation of these operations.
– It uses Lisp-Java integration to do this in Lisp.
41
Architecture Differencer Demo
42
AWDRAT: Diagnosis
[Architecture diagram repeated as a section divider, with Diagnosis highlighted.]
43
Model-Based Diagnosis
The focus is on diagnosing misbehaviors of computations in order to assess the health of the underlying resources.
Given:
– The plan structure of the computation, describing expected behavior.
– An observation of actual behavior that deviates from expectations.
Produce:
– Localization: which computational steps misbehaved.
– Characterization: what did they do wrong.
– Inferences about the compromised state of the computational resources involved.
– Inferences about what attacks enabled the compromise to occur.
– The likelihood that other resources have been compromised.
– The likelihood that critical constraints have not been satisfied.
44
Ontology of the Diagnostic Task
A computation is the execution of a piece of code on some computational host.
Computations utilize a set of resources (e.g. host computers, binary executable files, databases, etc.).
Resources have vulnerabilities.
Vulnerabilities enable attacks.
A successful attack on a resource causes that resource to enter a compromised state.
A computation that utilizes a compromised resource may exhibit a misbehavior.
Misbehaviors are the symptoms that initiate diagnostic activity, leading to updated assessments of the state of the computational environment. These assessments form the Trust Model.
45
Dependency Maintenance
Architectural differencing actively checks those prerequisite conditions, post-conditions, and other constraints in the plan that are easily observable.
The diagnostic executive builds a dependency graph between checked and inferred conditions (sketched below):
– Post-conditions and events within a step are justified by a link to the assumption that the step executed normally and to the step's prerequisite conditions.
– Preconditions are justified by the causal link in the plan that connects them to a set of post-conditions of prior steps.
If a check succeeds, that condition is justified as a premise.
If a check fails, diagnosis is initiated.
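A small Java sketch of such a dependency graph, with hypothetical classes: each condition node records the conjunctions of nodes (mode assumptions and prior conditions) that justify it, and checked conditions become premises.

import java.util.*;

// Hypothetical truth-maintenance-style node for a condition in the plan.
class ConditionNode {
    final String name;
    boolean premise = false;                        // set when the condition was checked and held
    final List<List<ConditionNode>> justifications = new ArrayList<>(); // each inner list is a conjunction

    ConditionNode(String name) { this.name = name; }

    // e.g. a post-condition is justified by {step-normal-mode assumption, preconditions...}
    void justifyBy(List<ConditionNode> supporters) { justifications.add(supporters); }

    boolean believed() {
        if (premise) return true;
        for (List<ConditionNode> just : justifications)
            if (just.stream().allMatch(ConditionNode::believed)) return true;
        return false;
    }
}

In this sketch a failed check would simply mark the corresponding node as not a premise and hand it to the diagnostic executive as a symptom.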
46
Dependency Chains Built by Model Simulation
[Diagram: the preconditions of Step1 (checked, treated as premises) together with the assumption that Step1 is in its normal mode justify Post-Condition1 and Post-Condition2; those post-conditions in turn justify the preconditions of Step2 and Step3. Solid arrows are P = 1.]
47
Diagnosis with Fault Models
In addition to modeling the normal behavior of each component, we provide models of known abnormal behaviors (e.g. a “Delay: 2, 4” model).
– A “Leak Model” covers unknown failures.
– These alternative behavioral models are called computational modes.
The diagnostic task is to find an assignment of a mode to each computational step such that the behavior predicted by the models associated with those modes is consistent with the observations.
– A set of assignments consistent with the observations is a diagnosis; there may be several diagnoses.
– A set of assignments at variance with the observations is a conflict.
48
Dependency Chains & Computational Modes
[Diagram: as before, but Step1 now also has Abnormal Mode1 in addition to its normal mode; a post-condition is unjustified if the step is in an abnormal mode, and the abnormal mode instead predicts a bogus condition. Checked preconditions are treated as premises; solid arrows are P = 1.]
49
Modeling Underlying Resources
The misbehavior of a software component may actually be due to a compromise of the resources used in that computation.
We extend the modeling framework to show the dependence of computations on the resources:
– Each resource has models of its state of compromise (i.e. its modes).
– The modes of the resources are linked to the modes of the computation by conditional probabilities.
– E.g., if a computation resides on a node which hosts a parasitic process, then the computation is likely to be slowed down.
[Diagram: Component 1 has Normal and Hijacked modes and uses the resource Image-1, which has models Normal (probability 90%) and Hacked (probability 10%); the resource modes are linked to the component modes by conditional probabilities (e.g. 0.2 and 0.3).]
50
Bayesian Dependency Diagram
[Diagram: the dependency chain is recast as a Bayesian network. Checked preconditions are pinned at P = 1; Step1's Normal and HIJACKED modes are coupled to the modes of the underlying resources (Host1 normal mode, Image1 abnormal mode) with conditional probabilities (P = 0.9, P = 0.8); nodes combine their parents via “logical and” / “logical or” probability tables; solid arrows are P = 1; the hijacked mode predicts a bogus condition.]
51
Adding Attack Models
An attack model specifies the set of attacks that are believed to be possible in the environment.
– Each resource has a set of vulnerabilities.
– Vulnerabilities enable attacks on that resource.
– Computational vulnerability analysis of the actual configuration can determine the possible attack model.
Given a vulnerability and an attack that can exploit the vulnerability, it is possible that the attack compromised the resource with the vulnerability.
– This is a conditional probability.
[Diagram: Host1 reads an image file; the image-file resource type has a vulnerability that enables an overflow attack, which causes the execution to be Normal or Hijacked with conditional probabilities 0.5 and 0.7.]
52
Bayesian Dependency Diagram (with an attack)
[Diagram: as before, but a “bad image file” attack node (P = 0.7) is coupled to Host1's Normal and Hijacked modes (P = 0.9, P = 0.8) via “logical and” / “logical or” probability tables; checked preconditions are pinned at P = 1, and Step1's Abnormal Mode1 predicts the bogus condition.]
53
Diagnostic Algorithm
Start with each computation step in the “normal” mode.
Repeat:
– Check the consistency of the current model with the observations.
– If inconsistent, then it's a conflict:
  Add a new node to the Bayesian dependency network; this node represents the logical-and of the modes in the conflict, and its truth value is pinned at FALSE.
  Prune out all possible solutions which are a super-set of the conflict set.
  Pick another set of models from the remaining solutions.
– If consistent, add it to the set of possible diagnoses.
Continue until all inconsistent sets of models (conflicts) are found.
Solve the Bayesian network. (A sketch of this loop follows below.)
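A hedged Java sketch of the conflict-gathering loop; the consistency check and the Bayesian solver are left as stubs, since the slide does not specify them, and all names are hypothetical.

import java.util.*;

// Hypothetical sketch of the diagnostic loop: examine mode assignments, record the
// inconsistent ones as conflicts, keep the consistent ones as diagnoses.
class Diagnoser {
    record ModeAssignment(Map<String, String> modePerStep) {}   // e.g. "Step1" -> "normal"

    List<ModeAssignment> diagnose(Deque<ModeAssignment> candidates) {
        List<ModeAssignment> diagnoses = new ArrayList<>();
        List<ModeAssignment> conflicts = new ArrayList<>();
        while (!candidates.isEmpty()) {
            ModeAssignment current = candidates.pop();           // starts from the all-"normal" assignment
            if (supersetOfAnyConflict(current, conflicts)) continue;   // pruned
            if (consistentWithObservations(current)) {
                diagnoses.add(current);
            } else {
                conflicts.add(current);
                addConflictNodeToBayesNet(current);              // logical-and node pinned at FALSE
            }
        }
        solveBayesianNetwork();                                   // posteriors over modes, conditions, attacks
        return diagnoses;
    }

    private boolean supersetOfAnyConflict(ModeAssignment a, List<ModeAssignment> conflicts) {
        return conflicts.stream()
            .anyMatch(c -> a.modePerStep().entrySet().containsAll(c.modePerStep().entrySet()));
    }
    private boolean consistentWithObservations(ModeAssignment a) { return true; /* model-simulation stub */ }
    private void addConflictNodeToBayesNet(ModeAssignment a) { /* stub */ }
    private void solveBayesianNetwork() { /* stub */ }
}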
54
What the Bayesian Network Tells You
After adding all conflict nodes to the Bayesian network:
The posterior probabilities of the underlying resource modes tell you how likely each compromised (or healthy) mode is.
– This is an aggregate estimate.
– These probabilities are part of the trust model and guide resource selection in recovery and in future computations.
The posterior probability of each post-condition assertion.
– This is an aggregate estimate.
– This gives us an estimate of which conditions are actually true, which also guides recovery.
The posterior probability of each possible attack.
– This implies possible compromises of other, similar resources that have not yet been observed, which will also guide recovery.
55
Three-Tiered Model
The resource tier couples the modes of the computation tier.
The attack tier couples the modes of the resource tier.
We therefore have two tiers of common-mode failures.
Common-mode coupling also precludes certain diagnoses, on the grounds that no single attack could have caused the compromises necessary to make the components misbehave as observed.
[Diagram: the three tiers, with resources such as program memory.]
56
Summary of Diagnosis
The result of diagnosis is the construction of a Bayesian network coupling attacks, resource vulnerabilities, compromised states of the resources, and finally the observed behavior of a computation.
This network assigns posterior probabilities to:
– Assertions modeling the state of the computation; these assertions are the prerequisite and post-conditions of the various computational steps in the plan diagram.
– Compromised modes of the resources used by the computation.
The recovery task is to find a new plan and a new set of resources that is most likely to achieve the main goal of the plan, given this updated probabilistic information about the world.
57
Example of MAF Diagnosis
58
AWDRAT: Trust Model
[Architecture diagram repeated as a section divider; the following slides cover the Trust Model.]
59
The Nature of a Trust Model
Trust is a continuous, probabilistic notion.
– All computational resources must be considered suspect to some degree.
Trust is a dynamic notion.
– The degree of trustworthiness may change with further compromises.
– The degree of trustworthiness may change with efforts at amelioration.
– The degree of trustworthiness may depend on the political situation and the motivation of a potential attacker.
Trust is a multidimensional notion.
– A system may be trusted to deliver a message while not being trusted to preserve its privacy.
– A system may be unsafe for one user but relatively safe for another.
60
Three Tiers of a Trust Model
Attack Level: history of events that suggest multi-stage attacks and the intent of attackers.
– Penetration, denial of service, unusual access, flooding.
Compromise Level: state of the mechanisms that provide key properties.
– Login control, job admission control, scheduler, key manager, DLLs, databases, source code.
Trust Level: degree of confidence in key properties.
– Privacy: stolen passwords, stolen data, packet snooping.
– Integrity: parasitized components, changed data, changed code.
– Authentication: changed keys, stolen keys.
– QoS: slow execution.
61
How Do We Know What Attacks Are Possible?
Build a model of the computational environment.
– System structure, resources, permissions.
Plan against it as-is: you're the Red Team.
Reason abstractly.
– Typical attackers.
– A typical resource of a specific type.
Reason about control and dependency.
Develop a multistage plan for compromising a typical resource.
62
Modeling System Structure
[Diagram: hardware (processor, memory, device controllers, devices) and the operating system (logon controller, scheduler, scheduler policy, device drivers, job admitter, file system, access controller) are modeled with part-of, resides-in, controls, input-to, and resource relations, together with the user set, the workload, and files.]
63
Modeling the Topology
[Diagram: machines (e.g. machine name: sleepy, OS type: Windows NT, server suite: IIS, user authentication pool: Dwarfs) connected through switches (subnet restrictions) and a router (enclave restrictions).]
The topology tells you who can share (and sniff) which packets, and who can affect what types of connections to whom.
64
Key Notions: Dependency and Control
Start with the desirable properties of systems:
– Reliable performance.
– Privacy of communications.
– Integrity and/or privacy of data.
Analyze which system components impact those properties:
– Performance: the scheduler.
– Privacy: the access controller.
Rule 1: To affect a desirable property, control a component that contributes to the delivery of that property.
65
Controlling Components (1)
One way to gain control of a component is to directly exploit a known vulnerability.
– One way to control a Microsoft IIS web server is to use a buffer overflow attack on it.
[Diagram: the MAF Editor's image loading is vulnerable to a malformed-image-file attack, which takes control of it.]
66
Controlling Components (2)
Another way to control a component is to find an input to the component and then find a way to modify that input.
– E.g., modify the scheduler policy parameters.
[Diagram: the scheduler policy parameters are an input to the scheduler; a modification action on the parameters yields control of the scheduler.]
67
Modifying Data
One way to modify data is to find a component which controls the data and then to find a way to gain control of that component.
[Diagram: the workload is an input of the scheduler and is controlled by the job admitter; an attack that controls the job admitter thereby controls the workload, and hence the scheduler.]
68
Affecting Data Integrity of MAF Plan
69
AWDRAT: Recovery and Regeneration
[Architecture diagram repeated as a section divider, with Recovery and Regeneration highlighted.]
70
The Recovery Process
Recovery is driven by the trust assessments developed during diagnosis:
– World state: can the prerequisite conditions of a method be assumed to hold, and with what probability?
– Compromise state of resources: which resources are compromised, and in what way?
Three core problems:
– What resources to regenerate.
– Where to restart.
– How to continue after restart.
Regenerate if the delta in expected benefit is greater than the cost of regeneration (see the sketch below).
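A one-method Java sketch of that regeneration test, with hypothetical parameter names: regenerate a resource only when the gain in expected benefit from having it back outweighs the cost of regenerating it.

// Hypothetical sketch: decide whether to regenerate a corrupted resource.
class RecoveryPolicy {
    // The two expected-benefit figures would come from re-running the method-selection
    // machinery against the updated trust model, with and without the regenerated resource.
    static boolean shouldRegenerate(double expectedBenefitWithResource,
                                    double expectedBenefitWithoutResource,
                                    double regenerationCost) {
        double delta = expectedBenefitWithResource - expectedBenefitWithoutResource;
        return delta > regenerationCost;
    }
}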
71
MAF-CAF Recovery
During execution we've captured the execution history and the intended state of the data structures.
We've also updated the trust model based on the diagnosis of the last failure.
Recovery can then be accomplished by restarting and replaying the history.
Recovery can also be accomplished by restarting and setting up the data structures to the intended state (if a complete enough trace was built).
In either case, method selection will be driven by the updated trust estimates.
72
Client Reconstitution Demo
73
[Closing architecture diagram: Wrappers, synthesized around the Application Software by the Wrapper Synthesizer, feed the Architectural Differencer, which compares execution against the System Models; Diagnosis updates the Trust Model (behavior, compromises, attacks), drawing on Attack Plan Recognition, attack plans, and other information sources such as intrusion detectors; Decision Theoretic Method Selection and Recovery and Regeneration close the loop.]
74
Technology Developed
Java wrappers
– Windows DLL wrappers ported to Java.
– Java methods wrapped with Java mediation code.
Architecture Differencer
– Fine-grained checking of MAF-CAF execution.
Architecture Visualizer
Vulnerability analysis for the JBI scenario
Diagnosis for MAF compromises
Method diversity and dynamic dispatch for image loading
Data provisioning & client reconstitution