Topic 11: Level 2
David L. Hall
Topic Objectives
- Introduce the concept of Level 2 processing
- Survey and introduce methods for approximate reasoning, including concepts in probabilistic reasoning and fusion (e.g., Bayes, Dempster-Shafer)
- Describe challenges and issues in automated reasoning
Note: this topic focuses on report-level fusion; Topic 12 will introduce reasoning methods such as rule-based systems and intelligent agents.
Level 2 Processing (Situation Refinement)
Level Two Processing: Situation Assessment
Level Two processing performs contextual interpretation/fusion, drawing on:
- Context: environment, weather, doctrine, socio-political factors
- Multi-perspective assessment: Red/Blue/White
- Object aggregation: time relationships, geometrical proximity, communications, functional dependence
- Event/activity aggregation
Shortfalls in L2/L3 Fusion Research
- From Nichols, M., "A Survey of the Current State of Data Fusion Systems", OSD Decision Support Center presentation at SPAWAR, San Diego, CA, May 2000: only 17 of 100 surveyed systems had any L2/L3 capability at all, mostly basic/simple techniques.
- From Valet, L., et al., "A Statistical Overview of Recent Literature in Information Fusion", FUSION 2000, Paris, France, July 2000: roughly 85% of the publications reviewed addressed only L1.
Hierarchy of Inference Techniques
Inference level runs from high to low. Types of inference (high to low):
- Threat analysis
- Situation assessment
- Behavior/relationships of entities
- Identity, attributes, and location of an entity
- Existence and measurable features of an entity
Applicable techniques (high to low):
- Knowledge-based techniques: expert systems; scripts, frames, templating; case-based reasoning
- Decision-level techniques: genetic algorithms, neural nets, cluster algorithms, fuzzy logic, Bayesian nets
- Estimation techniques: maximum a posteriori probability (e.g., Kalman filters, Bayesian), evidential reasoning
- Signal processing techniques
Examples of Data Fusion Inferences
Comments on L-2 and L-3 Techniques
- Reasoning for level-2 and level-3 processing involves context-based reasoning and high-level inferences
- Techniques are generally probabilistic and entail representation of uncertainty in data and inferential relationships
- Methods represent knowledge at the semantic level: rule-based methods; graphical representations; logical templates, cases, plan hierarchies, agents, and others
Elements of Artificial Intelligence
- Symbolic processing: pattern matching, inference, search, knowledge representation
- Techniques: knowledge acquisition, learning, automatic programming
- Application areas: natural language processing, expert systems, intelligent assistance, speech understanding, text understanding, machine translation, computer vision, robotics, planning
Challenges in Symbolic Reasoning
Human inferencing capabilities:
- Continual access to multi-sensory information
- Complex pattern recognition (visual, aural)
- Semantic-level reasoning
- Knowledge of "real-world" facts, relationships, interactions
- Use of heuristics for rapid assessment and decision-making
- Context-based processing
Computer inferencing challenges:
- Lack of real-world knowledge
- Inability to deal with the perversity of English or other languages
- Requirement for explicit knowledge representation and reasoning methods
Computer advantages:
- Processing speed and power (use of physics-based models)
- Unaffected by fatigue, emotion, bias
- Machine learning from large data sets
Categories of Representational and Decomposition Techniques
Major Reasoning Approaches
Knowledge representation:
- Rules, frames, scripts, semantic nets, parametric templates, analogical methods
Uncertainty representation:
- Confidence factors, probability, Dempster-Shafer evidential intervals, fuzzy membership functions, etc.
Reasoning methods and architectures:
- Implicit methods: neural nets, cluster algorithms
- Pattern templates: templating methods, case-based reasoning
- Process reasoning: script interpreters, plan-based reasoning
- Deductive methods: decision trees, Bayesian belief nets, D-S belief nets
- Hybrid architectures: agent-based methods, blackboard systems, hybrid symbolic/numerical systems
Decision-Level Identity Fusion
Each sensor (A, B, ..., N) observes an entity, target, or activity and produces its own declaration of identity. Decision-level identity fusion combines these declarations into a fused declaration of identity using voting methods, the Bayes method, or Dempster-Shafer's method. In the last lecture we addressed the magic of pattern recognition! This represents a transition from Level-1 identity fusion to Level-2 fusion related to complex entities, activities, and events; the reasoning is performed at the semantic (report) level.
Classical Statistical Inference
Based on empirical probabilities.
Definitions:
- Statistical hypothesis: a statement about a population which, based on information from a sample, one seeks to support or refute
- Statistical test: a set of rules whereby a decision on H is reached
Measure of test accuracy:
- A probability statement regarding the decision when various conditions in the population are true
Classical Statistical Inference
Test logic:
- Assume the null hypothesis (H0) is true
- Examine the consequences of H0 being true in the sampling distribution for the statistic
- If the observations have a high probability of occurring, the data do not contradict H0
- Otherwise, the data tend to contradict H0
Level of significance:
- Define a probability level, alpha, that is considered too low to warrant support of H0: if P(observed data | H0 true) <= alpha, reject H0
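As a concrete illustration of this test logic, here is a minimal Python sketch. The Gaussian sampling distribution under H0, its parameters, the significance level, and the observed value are all illustrative assumptions, not values from the lecture.

```python
# Minimal sketch of the classical test logic above. The Gaussian
# sampling distribution and all numbers are illustrative assumptions.
from scipy.stats import norm

mu_h0, sigma_h0 = 1000.0, 25.0   # assumed PRI distribution under H0 (microseconds)
alpha = 0.05                      # chosen level of significance
observed_pri = 1062.0             # a single observed PRI

# Two-sided p-value: probability of data at least this extreme if H0 is true
z = abs(observed_pri - mu_h0) / sigma_h0
p_value = 2.0 * (1.0 - norm.cdf(z))

if p_value <= alpha:
    print(f"p = {p_value:.4f} <= {alpha}: data contradict H0 -> reject H0")
else:
    print(f"p = {p_value:.4f} >  {alpha}: data do not contradict H0")
```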
Emitter Identification: Example
Type 1 and Type 2 radars exist on a battlefield; these radars are known to have different PRI characteristics. An ELINT collector observes emitters E1 and E2 across the Forward Edge of Battle Area (FEBA).
Problem: Given an observed PRI, have we seen a Type 1 or a Type 2 radar?
Note: Throughout this presentation we will use an example of emitter identification, e.g., for situation assessment related to a DoD problem. However, this translates directly into other applications such as medical tests, environmental monitoring, or monitoring complex machines.
Classical Inference for Identity Declaration: Example
[Figure: probability density functions over pulse repetition interval (PRI) for E1 (Radar Class 1) and E2 (Radar Class 2). The area under the Class 2 density between PRI_N and PRI_N+1 is a measure of the probability that a Class 2 radar will use a PRI in the interval PRI_N <= PRI <= PRI_N+1. A second panel shows the same two densities with a decision threshold PRI_C.]
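The sketch below illustrates this picture numerically, assuming (hypothetically) Gaussian class-conditional PRI densities; all parameters are made up for illustration. Comparing the class-conditional likelihoods at the observed PRI is the simple decision rule whose crossover point plays the role of PRI_C.

```python
# Sketch of the two-class PRI picture above, assuming Gaussian
# class-conditional densities. All parameters are hypothetical.
from scipy.stats import norm

e1 = norm(loc=800.0, scale=30.0)    # E1 (Radar Class 1) PRI density
e2 = norm(loc=1000.0, scale=40.0)   # E2 (Radar Class 2) PRI density

# Probability that a Class 2 radar uses a PRI in [PRI_N, PRI_N+1]
pri_n, pri_n1 = 980.0, 1020.0
p_interval = e2.cdf(pri_n1) - e2.cdf(pri_n)
print(f"P(PRI_N <= PRI <= PRI_N+1 | Class 2) = {p_interval:.3f}")

# Classify an observed PRI by comparing class-conditional likelihoods;
# the crossover point of the two densities plays the role of PRI_C.
pri_obs = 905.0
decision = "E1" if e1.pdf(pri_obs) > e2.pdf(pri_obs) else "E2"
print(f"Observed PRI {pri_obs}: declare {decision}")
```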
Issues with Classical Probability
- Requires knowledge of a priori probability density functions
- (Usually) applied to a hypothesis and its alternative
- Does not account for a priori information about the "likelihood in nature" of a hypothesis being true
- (Strictly speaking) classical probability can only be used with repeatable experiments
Bayesian Inference
Can be used on subjective probabilities; does not necessarily require sampling, etc.
Statement: If H_1, H_2, ..., H_i represent mutually exclusive and exhaustive hypotheses which can explain an event, E, that has just occurred, then

    P(H_i | E) = P(E | H_i) P(H_i) / sum_j [ P(E | H_j) P(H_j) ]

and

    sum_i P(H_i) = 1   (exhaustivity)

Nomenclature:
- P(H_i | E) = a posteriori probability of H_i being true given E
- P(H_i) = a priori probability
- P(E | H_i) = probability of E given H_i true
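A minimal sketch of this update in Python; the priors and likelihoods below are illustrative assumptions, not values from the lecture.

```python
# Minimal sketch of the Bayes update above for a discrete, exhaustive
# hypothesis set. Priors and likelihoods are illustrative assumptions.
priors = {"H1": 0.5, "H2": 0.3, "H3": 0.2}          # P(H_i), sums to 1
likelihoods = {"H1": 0.10, "H2": 0.40, "H3": 0.70}  # P(E | H_i)

# P(H_i | E) = P(E | H_i) P(H_i) / sum_j P(E | H_j) P(H_j)
evidence = sum(likelihoods[h] * priors[h] for h in priors)
posteriors = {h: likelihoods[h] * priors[h] / evidence for h in priors}
print(posteriors)  # e.g. {'H1': 0.161, 'H2': 0.387, 'H3': 0.452}
```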
Bayes Form: Impact on the Example of Emitter Identification
For a single PRI measurement:

    P(EMITTER_X | PRI_0) = P(PRI_0 | EMITTER_X) P(EMITTER_X) / sum_i [ P(PRI_0 | EMITTER_i) P(EMITTER_i) ]

In the case of multiple measurements (e.g., PRI_0 and frequency F_0):

    P(EMITTER_X | PRI_0 and F_0) is proportional to P(EMITTER_X) P(PRI_0 | EMITTER_X) P(F_0 | EMITTER_X and PRI_0)

(normalized over all candidate emitters). Does not require sampling distributions:
- The analyst can estimate P(PRI | E)
- Analysts can include any knowledge they have pertaining to the relative numbers of emitters
The result is a posterior probability for each candidate: P(EMITTER_X), P(EMITTER_Y), P(EMITTER_Z), ...
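A sequential version of the multiple-measurement form can be sketched as below: update on PRI_0, then on F_0. For simplicity the sketch assumes F_0 is conditionally independent of PRI_0 given the emitter (the exact form above conditions on both); all numbers are hypothetical.

```python
# Sequential Bayes updates for the emitter example: update on PRI_0,
# then on F_0, assuming conditional independence given the emitter.
# All numbers are hypothetical.
priors = {"X": 0.6, "Y": 0.3, "Z": 0.1}       # analyst's P(emitter)
p_pri = {"X": 0.7, "Y": 0.2, "Z": 0.1}        # P(PRI_0 | emitter)
p_freq = {"X": 0.5, "Y": 0.4, "Z": 0.3}       # P(F_0 | emitter)

def bayes_update(prior, likelihood):
    """One Bayes step: posterior proportional to likelihood * prior."""
    unnorm = {h: likelihood[h] * prior[h] for h in prior}
    total = sum(unnorm.values())
    return {h: v / total for h, v in unnorm.items()}

after_pri = bayes_update(priors, p_pri)
after_both = bayes_update(after_pri, p_freq)
print(after_both)  # P(emitter | PRI_0 and F_0)
```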
Concept of Identity Declaration by a Sensor
Major development issue: the ability to model/establish P(D|T) as a function of range, SNR, etc.
Note: The columns of the declaration matrix are mutually exclusive, exhaustive hypotheses that explain an observation.
Identification, Friend, Foe, Neutral (IFFN) Bayesian Example
Based on empirical probabilities derived from tests, we construct a declaration matrix for each sensor (Sensor 1, Sensor 2, Sensor 3): [P(D_i | O_j)], with rows i = 1, ..., n indexing declarations D_1, D_2, ..., D_n and columns indexing the objects O_j. For a fixed object O_j:

    sum_i P(D_i | O_j) = 1

NOTES:
1. Each column sums to 1; row sums need not equal 1.
2. n is not necessarily equal to m (the number of objects).
IFFN Bayesian Example (continued)
Individual sensors provide, via a declaration matrix, P(D_i | O_j).
- Bayes' rule allows conversion of P(D_i | O_j) to P(O_j | D_i)
- Multiple sensors provide:
  {P(O_1 | D_1), P(O_2 | D_1), ..., P(O_j | D_1), ...} from Sensor 1
  {P(O_1 | D_2), P(O_2 | D_2), ..., P(O_j | D_2), ...} from Sensor 2
Given multiple evidence (observations), Bayes' rule allows fusion of the declarations for each object (i.e., hypothesis):

    P(O_j | D_1 and D_2 and ... and D_n) = P(O_j) [ P(D_1 | O_j) P(D_2 | O_j) ... P(D_n | O_j) ] / sum_i { P(O_i) [ P(D_1 | O_i) P(D_2 | O_i) ... P(D_n | O_i) ] }

(assuming the sensor declarations are conditionally independent given the object).
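The fusion formula above can be sketched in a few lines of Python with numpy. The declaration matrices, priors, and observed declarations below are hypothetical; each column (fixed object O_j) sums to 1 over the declarations D_i, per the notes on the previous slide.

```python
# Sketch of the multi-sensor Bayes fusion formula above. Each sensor's
# declaration matrix holds P(D_i | O_j); columns sum to 1. Matrices,
# priors, and observed declarations are hypothetical.
import numpy as np

# rows = declarations D1..D3, columns = objects O1, O2, O3
sensor1 = np.array([[0.8, 0.1, 0.3],
                    [0.1, 0.7, 0.2],
                    [0.1, 0.2, 0.5]])
sensor2 = np.array([[0.7, 0.2, 0.2],
                    [0.2, 0.6, 0.3],
                    [0.1, 0.2, 0.5]])
assert np.allclose(sensor1.sum(axis=0), 1.0) and np.allclose(sensor2.sum(axis=0), 1.0)

priors = np.array([0.5, 0.3, 0.2])     # P(O_j)
declared = [0, 0]                      # both sensors declared D1

# P(O_j | D1, D2) proportional to P(O_j) * prod_k P(D_k | O_j)
unnorm = priors * sensor1[declared[0]] * sensor2[declared[1]]
posterior = unnorm / unnorm.sum()
print(posterior)   # fused identity; pick argmax for the MAP declaration
```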
Summary of Bayesian Fusion for Identity
- Each sensor (#1, #2, ..., #n) transforms observables into a declaration D_i via a classifier (transformation from observation space to declaration)
- Uncertainty in each declaration is expressed in a declaration matrix, P(D_i | O_j)
- The Bayesian combination formula produces the fused probability of object j given D_1, D_2, ..., D_n: P(O_j | D_1, D_2, ..., D_n), j = 1, ..., M
- Decision logic (MAP, thresholded MAP, etc.) selects the highest value of P(O_j) to form the fused identity declaration
Bayesian Inference
The good news:
- Allows incorporation of a priori information about P(H_i)
- Allows utilization of subjective probability
- Allows iterative update
- Intuitive formulation
The bad news:
- Requires specification of "priors"
- Requires identification of an exhaustive set of alternative hypotheses, H
- Can become complex for dependent evidence
- May produce an "idiot Bayes" result
Dempster-Shafer Theory
- Arthur Dempster (1968): generalization of the Bayesian approach
- Glenn Shafer (1976): mathematical theory of evidence
Basic issue: the manner in which belief, derived from evidence, is distributed over propositions (hypotheses):
EVIDENCE -> DISTRIBUTION OF BELIEF (OVER) -> PROPOSITIONS ABOUT THE EXHAUSTIVE POSSIBILITIES IN A DOMAIN
Distribution of Belief
Operational mode: humans seemingly distribute belief (based on evidence) in a fragmentary way. Thus, in general, for evidence E and propositions A, B, C, we will have:
- M(A) = measure of belief that E supports A exactly
- M(A or B) = measure of belief assigned to the disjunction, which includes A
- etc.
Probability and Belief
- Probabilities for propositions are induced by the mass distribution
- Bayesian mass distributions assign mass only to the set of single, mutually exclusive, and exhaustive propositions
- With the D-S approach, belief can be assigned to a set of propositions that need not be mutually exclusive; in the Bayesian case PR(A) + PR(~A) = 1, while in D-S the support for A is the mass summed over the propositions contained in A. This leads to the notion of an evidential interval, [SPT(A), PLS(A)].
Example of Probability Mass Assignment
A single die can show one of six observable faces; these are the mutually exclusive and exhaustive hypotheses:
- The number showing on the die is 1, 2, 3, 4, 5, or 6
Propositions can include:
- The number showing on the die is even
- The number showing on the die is 1 or 3
- ...
- The number showing on the die is 1 or 2 or 3 or 4 or 5 or 6 (the "I don't know" proposition)
The set of hypotheses is Θ = {1, 2, 3, 4, 5, 6}. The set of propositions is the power set:
2^Θ = {1, 2, 3, 4, 5, 6, 1 or 2, 1 or 3, 2 or 3, 3 or 4, ..., 1 or 2 or 3 or 4 or 5 or 6}
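In code, the frame Θ and propositions from 2^Θ are naturally represented as frozensets; here is a minimal sketch of the die example, with an illustrative (made-up) mass assignment:

```python
# The die example above, with propositions represented as frozensets of
# elementary hypotheses from the frame Theta = {1,...,6}. The mass
# values are an illustrative assignment, not from the slides.
theta = frozenset({1, 2, 3, 4, 5, 6})

masses = {
    frozenset({2, 4, 6}): 0.5,   # "the number showing is even"
    frozenset({1, 3}):    0.3,   # "the number showing is 1 or 3"
    theta:                0.2,   # "I don't know" -- mass on the whole frame
}
assert abs(sum(masses.values()) - 1.0) < 1e-9  # masses must sum to 1
```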
Probability and Belief Formulae
Support:   SPT(A) = sum of M(B) over all propositions B contained in A
Plausibility:   PLS(A) = 1 - SPT(~A)
- SPT(A) + SPT(~A) <= 1
- Uncertainty: U(A) = PLS(A) - SPT(A)
- If U(A) = 0 for all A, the belief structure reduces to the Bayesian case
Adapted from Greer, Thomas H., "Artificial Intelligence: A New Dimension in EW", Defense Electronics, October 1985, pp. 108-128.
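These two formulas are short enough to implement directly; this sketch reuses the illustrative die masses from above.

```python
# Minimal sketch of the formulas above: SPT(A) sums mass over subsets of
# A, and PLS(A) = 1 - SPT(~A). The mass assignment reuses the
# illustrative die example.
theta = frozenset({1, 2, 3, 4, 5, 6})
masses = {frozenset({2, 4, 6}): 0.5, frozenset({1, 3}): 0.3, theta: 0.2}

def spt(a, masses):
    """Support: total mass committed to propositions contained in A."""
    return sum(m for prop, m in masses.items() if prop <= a)

def pls(a, masses, theta):
    """Plausibility: 1 minus the support for the complement of A."""
    return 1.0 - spt(theta - a, masses)

even = frozenset({2, 4, 6})
print(spt(even, masses), pls(even, masses, theta))  # 0.5, 0.7
```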
Support and Plausibility
Support:
- The degree to which the evidence supports the proposition
- The sum of the probability masses for a proposition and its subsets
Plausibility:
- The extent to which the evidence fails to refute a proposition
- PLS(A) = 1 - SPT(~A)
Examples:
- A(0, 0): SPT(A) = 0 (no supporting evidence), PLS(A) = 0, SPT(~A) = 1 (the evidence refutes A)
- A(0.25, 0.85): SPT(A) = 0.25 (supporting evidence), SPT(~A) = 1 - 0.85 = 0.15 (refuting evidence); the evidential interval is [0.25, 0.85]
Adapted from Greer, Thomas H., "Artificial Intelligence: A New Dimension in EW", Defense Electronics, October 1985, pp. 108-128.
Dempster-Shafer Example
A D-S threat warning sensor (TWS) measures RF and PRF and produces a belief distribution:
- (SAM-X) 0.3
- (SAM-X, TTR) 0.4
- (SAM-X, ACQ) 0.2
- UNKNOWN 0.1
Then the evidential intervals are:
- SPT(SAM-X, TTR) = 0.4
- PLS(SAM-X, TTR) = 1 - SPT(~(SAM-X, TTR)) = 1 - SPT(SAM-X, ACQ) = 1 - 0.2 = 0.8
- (SAM-X, TTR) = [0.4, 0.8]
Similarly, (SAM-X) = [0.9, 1.0] and (SAM-X, ACQ) = [0.2, 0.6].
Adapted from Greer, Thomas H., "Artificial Intelligence: A New Dimension in EW", Defense Electronics, October 1985, pp. 108-128.
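The intervals above can be checked numerically. The encoding below (elementary hypotheses for SAM-X in track mode, SAM-X in acquisition mode, or some other emitter) is an assumed reading of the slide's propositions, but it reproduces all three intervals.

```python
# Numerical check of the evidential intervals above, under an assumed
# encoding of the propositions into elementary hypotheses.
TTR, ACQ, OTHER = "samx_ttr", "samx_acq", "other"
theta = frozenset({TTR, ACQ, OTHER})

masses = {
    frozenset({TTR, ACQ}): 0.3,   # SAM-X, mode unknown
    frozenset({TTR}):      0.4,   # SAM-X, target tracking radar mode
    frozenset({ACQ}):      0.2,   # SAM-X, acquisition mode
    theta:                 0.1,   # UNKNOWN
}

def spt(a):
    return sum(m for p, m in masses.items() if p <= a)

def pls(a):
    return 1.0 - spt(theta - a)

for name, prop in [("SAM-X", frozenset({TTR, ACQ})),
                   ("SAM-X, TTR", frozenset({TTR})),
                   ("SAM-X, ACQ", frozenset({ACQ}))]:
    print(f"{name}: [{spt(prop):.1f}, {pls(prop):.1f}]")
# SAM-X: [0.9, 1.0]   SAM-X, TTR: [0.4, 0.8]   SAM-X, ACQ: [0.2, 0.6]
```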
Composite Uncertainty: Two Source Example
Dempster's Rules of Combination
1. The product of mass assignments to two propositions that are consistent leads to another proposition contained within the originals (e.g., m1(a1) m2(a1) contributes to m(a1)).
2. Multiplying the mass assigned to uncertainty (the full frame, Θ) by the mass assigned to any other proposition leads to a contribution to that proposition (e.g., m1(Θ) m2(a2) contributes to m(a2)).
3. Multiplying uncertainty by uncertainty leads to a new assignment to uncertainty (e.g., m1(Θ) m2(Θ) contributes to m(Θ)).
4. When inconsistency occurs between knowledge sources, assign a measure of inconsistency, denoted K, to their products (e.g., for disjoint a1 and a2, m1(a1) m2(a2) contributes to K).
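All four rules reduce to one loop over pairs of propositions: intersect the two sets, credit the product to the intersection if it is non-empty, and to K otherwise, then renormalize by 1 - K. A compact sketch (the two source distributions are hypothetical):

```python
# Compact implementation of Dempster's rule of combination for two mass
# functions over frozenset propositions. Assumes K < 1 (not total
# conflict). The source distributions are hypothetical.
from collections import defaultdict

def dempster_combine(m1, m2):
    """Combine two mass functions; K is the total conflicting mass."""
    combined = defaultdict(float)
    k = 0.0
    for p1, v1 in m1.items():
        for p2, v2 in m2.items():
            inter = p1 & p2
            if inter:                      # consistent: mass to intersection
                combined[inter] += v1 * v2
            else:                          # inconsistent: contributes to K
                k += v1 * v2
    return {p: v / (1.0 - k) for p, v in combined.items()}, k

theta = frozenset({"a1", "a2"})
m1 = {frozenset({"a1"}): 0.6, theta: 0.4}
m2 = {frozenset({"a2"}): 0.5, theta: 0.5}
fused, k = dempster_combine(m1, m2)
print(f"K = {k:.2f}", fused)   # K = 0.30; fused masses renormalized by 0.70
```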
Composite Uncertainty: Computing Belief Distributions for Pooled Evidence
1. Compute the credibility intervals for each source (Source 1, Source 2)
2. Map the mass of each belief distribution onto a combination table
3. Compute composite beliefs:
   - K = the measure of all mass associated with conflicting reports; e.g., K = (0.2 x 0.2) + (0.4 x 0.2) + (0.3 x 0.2) = 0.18, so 1 - K = 1 - 0.18 = 0.82
   - For each proposition, sum all of the masses that support the proposition and divide by 1 - K; e.g., for the supporting products A, B, C, D in the table: m(SAM-Y TT) = (A + B + C + D) / (1 - K) = 0.49
4. Compute the credibility intervals for the pooled evidence and apply the decision rule
Pooled Evidence
Summary of Dempster-Shafer Fusion for Identity
- Each sensor (#1, #2, ..., #n) transforms observables into a declaration via a classifier (transformation from observation space to mass distributions m_i(O_j))
- For each declaration, compute or enumerate a mass distribution, including a general level of uncertainty
- Combine/fuse the mass distributions via Dempster's rules of combination, M(O_j) = F(m_i(O_j)), yielding the fused probability mass for each object hypothesis O_j and the combined evidential intervals
- Decision logic selects the best combined evidential interval to form the fused identity declaration
Dempster-Shafer Inference
The good news:
- Allows incorporation of a priori information about hypotheses and propositions
- Allows utilization of subjective evidence
- Allows assignment of a general level of uncertainty
- Allows iterative update
The bad news:
- Requires specification of "priors"
- Not "intuitive"
- May result in "weak" decisions
- More computationally demanding than Bayes
- Can become complex for dependent evidence
- May produce an "idiot" D-S result
Summary of Voting for Identity Estimation
- Each sensor (#1, #2, ..., #n) transforms observables into a declaration D_i via a classifier (transform from observation space to declaration)
- Uncertainty in each declaration is expressed as a weight or confidence for each sensor (S_1, S_2, ..., S_n)
- The voting combination formula produces a fused decision via weighted voting:

    V(O_j) = sum_i w_i * delta_i(O_j)

- Decision logic selects the highest vote (majority, plurality, etc.) to form the fused identity declaration
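A minimal sketch of the weighted-voting formula, where delta_i(O_j) = 1 if sensor i declared object O_j and 0 otherwise; the weights and declarations are hypothetical.

```python
# Sketch of V(O_j) = sum_i w_i * delta_i(O_j). Weights and declarations
# are hypothetical.
from collections import defaultdict

weights = {"S1": 0.5, "S2": 0.3, "S3": 0.2}          # per-sensor confidence
declarations = {"S1": "O1", "S2": "O2", "S3": "O1"}  # each sensor's identity call

votes = defaultdict(float)
for sensor, obj in declarations.items():
    votes[obj] += weights[sensor]

fused = max(votes, key=votes.get)  # highest weighted vote wins
print(dict(votes), "->", fused)    # {'O1': 0.7, 'O2': 0.3} -> O1
```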
Some Decision Fusion Techniques
Topic 11 Assignments
- Preview the online Topic 11 materials
- Read chapter 7 of Hall and McMullen (2004)
- Writing assignment 8: one-page description of how the Level 2 process applies to your selected application
- Discussion 5: discuss the challenges of designing and implementing a data fusion system, both in general and for your selected application
Data Fusion Tip of the Week
Young researchers in automated reasoning and artificial intelligence wax enthusiastic about the power of computers and their potential to automate human-like reasoning processes (saying, in effect, "aren't computers wonderful!"). Later in their careers, these same researchers admit that it is a very difficult problem but believe they could make significant progress with increased computer speed and memory. Still later, these researchers realize the complexities of the problem and praise human reasoning (saying, in effect, "aren't humans wonderful!").