Example of a Weighted Voting System: Undersea Target Detection System

Presentation transcript:

Example of a Weighted Voting System: Undersea Target Detection System

Weighted Voting System
- System input I (0 or 1)
- Voting units 1, …, 6 produce outputs d_1(I), …, d_6(I), each 0, 1 or x
- Unit weights w_1, …, w_6
- Threshold τ
- System output D(I) (0, 1 or x)

Decision Making Rule
- W_n^1: total weight of units voting for acceptance of the proposition (sum of w_j over all j with d_j(I) = 1)
- W_n^0: total weight of units voting for rejection of the proposition (sum of w_j over all j with d_j(I) = 0)
- The system output D(I) is obtained by comparing W_n^1 and W_n^0 against the threshold τ (next slide)

Decision Making Rule
- D(I) = 0 (Reject) if (1 - τ)·W_n^1 - τ·W_n^0 < 0
- D(I) = x (no decision) if W_n^1 = W_n^0 = 0
- D(I) = 1 (Accept) otherwise
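A minimal Python sketch of this decision rule (illustrative only; the function name is hypothetical, the notation follows the slide, and 'x' stands for the indecisive output):

```python
def wvs_decision(unit_outputs, weights, tau):
    """Weighted voting system output: 1 (accept), 0 (reject) or 'x' (no decision).

    unit_outputs -- list of unit votes d_j(I), each 1, 0 or 'x'
    weights      -- list of unit weights w_j
    tau          -- system threshold, 0 < tau < 1
    """
    # Total weight of units voting for acceptance and for rejection;
    # units that output 'x' contribute to neither sum.
    w1 = sum(w for d, w in zip(unit_outputs, weights) if d == 1)
    w0 = sum(w for d, w in zip(unit_outputs, weights) if d == 0)

    if w1 == 0 and w0 == 0:
        return 'x'                      # no unit made a decision
    if (1 - tau) * w1 - tau * w0 < 0:
        return 0                        # reject
    return 1                            # accept


# Example: six units, one abstains, threshold 0.5 (simple weighted majority).
print(wvs_decision([1, 0, 1, 'x', 0, 1], [1, 2, 1, 3, 1, 2], tau=0.5))
```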

WVS as a Multi-state System
Voting unit j: 3 possible outputs (0, 1, x) and 4 failure modes: d_j(0) = 1; d_j(1) = 0; d_j(0) = x; d_j(1) = x.
Entire WVS: for input I, multiple states characterized by different scores (1 - τ)·W_n^1 - τ·W_n^0; 3 possible outputs (0, 1, x) and 4 failure modes: D(0) = 1; D(1) = 0; D(0) = x; D(1) = x.

Asymmetric Weighted Voting System
- System input I (0 or 1)
- Voting unit outputs d_1(I), …, d_6(I), each 0, 1 or x
- Acceptance weights w_1^1, …, w_6^1 and rejection weights w_1^0, …, w_6^0
- Threshold τ
- System output D(I) (0, 1 or x)

Decision Making Rule
- W_n^1: total acceptance weight of units voting for acceptance of the proposition (sum of w_j^1 over all j with d_j(I) = 1)
- W_n^0: total rejection weight of units voting for rejection of the proposition (sum of w_j^0 over all j with d_j(I) = 0)
- The system output D(I) is obtained from W_n^1, W_n^0 and τ as before

Types of Errors
- d_j(0) = 1 (unit fails stuck-at-1): too optimistic, probability q_01(j)
- d_j(1) = 0 (unit fails stuck-at-0): too pessimistic, probability q_10(j)
- d_j(I) = x (unit fails stuck-at-x): too indecisive, probabilities q_1x(j), q_0x(j)
Voting unit parameters: decision making time t_j, rejection weight w_j^0, acceptance weight w_j^1.
System parameter: threshold τ. The weights and the threshold are adjustable.
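For illustration, these error probabilities can be read as a per-unit output distribution conditioned on the system input. The helpers below are a sketch under that reading; the function names are hypothetical, not from the source.

```python
import random

def unit_output_distribution(q01, q10, q0x, q1x):
    """Return P(d_j(I) = output | input I) as a nested dict.

    q01 -- P(d_j(0) = 1): too optimistic
    q10 -- P(d_j(1) = 0): too pessimistic
    q0x -- P(d_j(0) = 'x'), q1x -- P(d_j(1) = 'x'): too indecisive
    """
    return {
        0: {1: q01, 'x': q0x, 0: 1 - q01 - q0x},   # behaviour on input I = 0
        1: {0: q10, 'x': q1x, 1: 1 - q10 - q1x},   # behaviour on input I = 1
    }

def sample_unit_output(dist, system_input):
    """Draw one output of the unit for the given system input."""
    outputs, probs = zip(*dist[system_input].items())
    return random.choices(outputs, weights=probs)[0]

# Example: a fairly reliable unit that is slightly indecisive (made-up numbers).
d = unit_output_distribution(q01=0.05, q10=0.10, q0x=0.02, q1x=0.03)
print(sample_unit_output(d, system_input=1))
```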

Universal Generating Function Technique
- u-function representing the score distribution of a single voter
- Composition operator combining the u-functions
- u-function representing the score distribution of m voters
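The slide does not spell the u-functions out, but the composition idea can be sketched as follows: each voter contributes a small distribution over score increments (ΔW^1, ΔW^0), and the composition operator is a discrete convolution that builds the score distribution of all m voters. This is an illustrative reimplementation, not the exact formulation of the cited papers.

```python
from collections import defaultdict

def voter_score_distribution(weight, output_dist):
    """Score distribution of a single voter for a fixed system input.

    output_dist -- dict {1: p_accept, 0: p_reject, 'x': p_abstain}
    Returns dict {(dW1, dW0): probability}: the voter adds its weight to
    W^1 when it votes 1, to W^0 when it votes 0, and to neither on 'x'.
    """
    contrib = {1: (weight, 0), 0: (0, weight), 'x': (0, 0)}
    return {contrib[out]: p for out, p in output_dist.items() if p > 0}

def compose(dist_a, dist_b):
    """Composition operator: convolve two score distributions."""
    result = defaultdict(float)
    for (a1, a0), pa in dist_a.items():
        for (b1, b0), pb in dist_b.items():
            result[(a1 + b1, a0 + b0)] += pa * pb
    return dict(result)

def system_score_distribution(voter_dists):
    """Score distribution of (W^1, W^0) over all m voters."""
    total = {(0, 0): 1.0}
    for d in voter_dists:
        total = compose(total, d)
    return total

# Example: three voters with weights 2, 1, 1 voting on input I = 1 (made-up numbers).
voters = [
    voter_score_distribution(2, {1: 0.9, 0: 0.05, 'x': 0.05}),
    voter_score_distribution(1, {1: 0.8, 0: 0.15, 'x': 0.05}),
    voter_score_distribution(1, {1: 0.85, 0: 0.1, 'x': 0.05}),
]
scores = system_score_distribution(voters)
tau = 0.5
# Probability that the system accepts the (true) proposition I = 1.
r = sum(p for (w1, w0), p in scores.items()
        if (w1, w0) != (0, 0) and (1 - tau) * w1 - tau * w0 >= 0)
print(round(r, 4))
```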

Optimization Problems
System success probability R(w_1^0, w_1^1, …, w_n^0, w_n^1, τ).
Optimal adjustment problem:
(w_1^0, w_1^1, …, w_n^0, w_n^1, τ) = arg max R(w_1^0, w_1^1, …, w_n^0, w_n^1, τ)

Optimal Grouping
The voting units (weights w_1, …, w_6) are partitioned into groups, e.g. {w_1, w_4, w_5}, {w_2, w_6}, {w_3}, each group with its own threshold (τ_1, τ_2, τ_3); the grouping and the parameters are chosen so that R → max.

Optimal Distribution Among Protected Groups
Voting units VU_1, …, VU_6 (weights w_1, …, w_6) are distributed among protected groups PG_1, PG_2, PG_3, each group with vulnerability v; for input P, the unit outputs d_1(P), …, d_6(P) are combined into the system output D(P).

Group Vulnerability
v: group vulnerability; M: number of protected groups.
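As a rough sketch of how group vulnerability can be evaluated, the simulation below assumes, purely for illustration (this is not stated on the slide), that each protected group is destroyed independently with probability v and that units in a destroyed group simply do not vote; the surviving units then follow the usual decision rule.

```python
import random

def simulate_survivability(groups, weights, vote_prob, v, tau, trials=50_000, seed=1):
    """Estimate the probability that the WVS still accepts a true proposition (I = 1)
    when each protected group is destroyed independently with probability v.

    groups    -- list of lists of unit indices, one list per protected group
    weights   -- unit weights w_j
    vote_prob -- per-unit dict {1: p, 0: p, 'x': p} of outputs given I = 1
    Assumption (illustrative): destroyed units contribute to neither W^1 nor W^0.
    """
    rng = random.Random(seed)
    success = 0
    for _ in range(trials):
        destroyed = {j for g in groups if rng.random() < v for j in g}
        W1 = W0 = 0.0
        for j, w in enumerate(weights):
            if j in destroyed:
                continue                      # unit lost together with its group
            out = rng.choices(list(vote_prob), weights=list(vote_prob.values()))[0]
            if out == 1:
                W1 += w
            elif out == 0:
                W0 += w
        accept = (W1 or W0) and (1 - tau) * W1 - tau * W0 >= 0
        success += bool(accept)
    return success / trials

# Example: six units split over three protected groups (grouping and numbers
# chosen arbitrarily for illustration).
groups = [[0, 1], [2, 3], [4, 5]]
weights = [1, 2, 1, 3, 1, 2]
vote_prob = {1: 0.85, 0: 0.1, 'x': 0.05}
print(simulate_survivability(groups, weights, vote_prob, v=0.2, tau=0.5))
```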

Order of Voting Decisions
The units deliver their votes at times t_1 ≤ t_2 ≤ … ≤ t_m ≤ … ≤ t_n.
- W_m^1: total weight of units with t_j ≤ t_m voting for acceptance of the proposition
- W_m^0: total weight of units with t_j ≤ t_m voting for rejection of the proposition

Wm0Wm0  W m 0 Reject  V 1 m+1 Wm0Wm0 Accept  V 0 m+1  W m 0 Accelerated Decision Making

System Reliability and Expected Decision Time
- Q_ij^m: probability of making the decision D(i) = j at the time t_m
- p_0, p_1: input distribution (probabilities of the inputs I = 0 and I = 1)
The system reliability R and the expected decision time T are obtained from the Q_ij^m and the input distribution.
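R and T can also be estimated by plain Monte Carlo simulation. The sketch below inlines the accelerated stopping check from the previous sketch so that it stays self-contained; the example numbers are made up, not taken from the slides.

```python
import random

def simulate_R_T(units, tau, p0, n_trials=100_000, seed=1):
    """Monte Carlo estimate of reliability R and expected decision time T.

    units -- list of (t_j, w_j, {I: {output: prob}}) triples, sorted by t_j;
             outputs are 1, 0 or 'x'.
    The decision time is the earliest t_m after which the remaining
    units can no longer reverse the comparison against the threshold tau.
    """
    rng = random.Random(seed)
    ok = 0
    total_time = 0.0
    for _ in range(n_trials):
        I = 0 if rng.random() < p0 else 1
        W1 = W0 = 0.0
        decision, time = 'x', units[-1][0]
        for m, (t, w, dist) in enumerate(units):
            outs, probs = zip(*dist[I].items())
            v = rng.choices(outs, weights=probs)[0]
            if v == 1:
                W1 += w
            elif v == 0:
                W0 += w
            rest = sum(u[1] for u in units[m + 1:])   # weight still to come
            if W1 > 0 and (1 - tau) * W1 - tau * (W0 + rest) >= 0:
                decision, time = 1, t
                break
            if (1 - tau) * (W1 + rest) - tau * W0 < 0:
                decision, time = 0, t
                break
        ok += (decision == I)
        total_time += time
    return ok / n_trials, total_time / n_trials

# Example with three identical units (illustrative numbers, not from the slides).
dist = {0: {0: 0.9, 1: 0.05, 'x': 0.05}, 1: {1: 0.9, 0: 0.05, 'x': 0.05}}
units = [(10, 1.0, dist), (20, 1.0, dist), (35, 1.0, dist)]
print(simulate_R_T(units, tau=0.5, p0=0.5))
```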

Voting System Optimization Problem
Two-objective problem: R → max, T → min (reliability vs. expected decision time trade-off).
Constrained problem: R → max subject to T ≤ T*:
(w_1^0, w_1^1, …, w_n^0, w_n^1, τ) = arg max R(w_1^0, w_1^1, …, w_n^0, w_n^1, τ)
subject to T(w_1^0, w_1^1, …, w_n^0, w_n^1, τ) ≤ T*
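One simple way to attack the constrained formulation is a direct search over a grid of candidate thresholds, keeping only those that satisfy the time constraint. This is only an illustrative stand-in, not the optimization procedure of the cited papers:

```python
def best_threshold(evaluate, taus, T_star):
    """Pick the threshold with the highest reliability among those meeting T <= T*.

    evaluate -- callable tau -> (R, T); e.g. a wrapper around the Monte Carlo
                simulate_R_T sketch above (an illustrative pairing)
    taus     -- candidate thresholds to try
    T_star   -- upper bound on the expected decision time
    """
    results = [(evaluate(tau), tau) for tau in taus]
    feasible = [(R, tau) for (R, T), tau in results if T <= T_star]
    if not feasible:
        return None            # no threshold on the grid satisfies the constraint
    best_R, best_tau = max(feasible)
    return best_tau, best_R
```

For instance, evaluate could be `lambda tau: simulate_R_T(units, tau, p0=0.5)` with the units from the previous sketch; a full treatment, as in the cited papers, would also search over the acceptance and rejection weights, typically with a heuristic such as a genetic algorithm.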

Numerical Example
- Table: parameters of the voting units (unit number, decision time t_j, error probabilities q_01, q_0x, q_10, q_1x)
- Table: parameters of the optimal system for T* = 35, for several input distributions (p_0 = 0.3, p_0 = 0.5, …), with the resulting Q probabilities, R and T
- Plot: reliability vs. expected decision time

References
1. Weighted voting systems: reliability versus rapidity, G. Levitin, Reliability Engineering & System Safety, vol. 89, no. 2, 2005.
2. Maximizing survivability of vulnerable weighted voting systems, G. Levitin, Reliability Engineering & System Safety, vol. 83, pp. 17-26, 2003.
3. Threshold optimization for weighted voting classifiers, G. Levitin, Naval Research Logistics, vol. 50, no. 4, 2003.
4. Asymmetric weighted voting systems, G. Levitin, Reliability Engineering & System Safety, vol. 76, 2002.
5. Evaluating correct classification probability for weighted voting classifiers with plurality voting, G. Levitin, European Journal of Operational Research, vol. 141, 2002.
6. Analysis and optimization of weighted voting systems consisting of voting units with limited availability, G. Levitin, Reliability Engineering & System Safety, vol. 73, 2001.
7. Optimal unit grouping in weighted voting systems, G. Levitin, Reliability Engineering & System Safety, vol. 72, 2001.
8. Reliability optimization for weighted voting system, G. Levitin, A. Lisnianski, Reliability Engineering & System Safety, vol. 71, 2001.