“Victor Babes” UNIVERSITY OF MEDICINE AND PHARMACY TIMISOARA DEPARTMENT OF MEDICAL INFORMATICS AND BIOPHYSICS Medical Informatics Division www.medinfo.umft.ro/dim.


“Victor Babes” UNIVERSITY OF MEDICINE AND PHARMACY TIMISOARA DEPARTMENT OF MEDICAL INFORMATICS AND BIOPHYSICS Medical Informatics Division / 2005

MEDICAL DECISION SUPPORT (I) COURSE 11

1. MEDICAL DECISION
1.1. DIRECTIONS:
– COMPUTER ASSISTED DIAGNOSIS
– INVESTIGATION SELECTION
– THERAPY OPTIMISATION
– HEALTHCARE MANAGEMENT

1.2. ELEMENTARY CYCLE OF MEDICAL ACTIVITY

1.3. METHOD CLASSIFICATION
a) LOGICAL
– TRUTH (SYMPTOM) TABLES
– DECISION TREES
b) STATISTICAL
– BAYES' RULE
– PATTERN RECOGNITION
c) HEURISTIC
– EXPERT SYSTEMS

2. LOGICAL METHODS
2.1. CONSTRUCTIVE PRINCIPLES
a) Based on bivalent logic: knowledge represented symbolically by Yes/No (1/0)
b) Knowledge base = symptom table
c) Patient 'STATE VECTOR' (PAT)
d) Sequential comparison
e) Diagnoses list (sorted)

D = disease, S = symptom, PAT = patient state vector

       S1   S2   S3   …   Score
D1     …    …    …    …   …/8
D2     …    …    …    …   …/…
…
PAT    0    1    0    …
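A minimal sketch of the sequential comparison step, assuming a small hypothetical symptom table and the PAT vector 0 1 0 from the example above; each disease row is matched against the patient state vector and a match score is produced, yielding the sorted diagnoses list:

```python
# Sketch of the symptom-table (truth-table) method.
# The symptom table and the patient vector below are hypothetical examples.

SYMPTOMS = ["S1", "S2", "S3"]

# Knowledge base: expected Yes/No (1/0) value of each symptom for each disease.
SYMPTOM_TABLE = {
    "D1": [1, 1, 0],
    "D2": [1, 0, 1],
}

def score(disease_row, pat):
    """Count how many symptoms of the patient vector match the disease row."""
    matches = sum(1 for expected, observed in zip(disease_row, pat) if expected == observed)
    return matches / len(disease_row)

def diagnose(pat):
    """Return the diseases sorted by decreasing match score."""
    scores = {d: score(row, pat) for d, row in SYMPTOM_TABLE.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    PAT = [0, 1, 0]       # patient state vector, as in the slide example
    print(diagnose(PAT))  # sorted diagnoses list, best match first
```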

2.2. TYPES OF LOGICAL METHODS
According to PAT vector construction:
A) Symptom tables (truth tables)
– Symptom selection from a menu
B) Decision trees (see the sketch after this list)
– Set of questions with Y/N answers
– Avoiding useless questions
– Patient involvement
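A minimal sketch of the decision-tree variant, with hypothetical questions and diagnosis labels; each Y/N answer selects the next question, so irrelevant questions are never asked:

```python
# Hypothetical Yes/No decision tree: internal nodes are questions,
# "yes"/"no" branches lead either to further questions or to a diagnosis label.
TREE = {
    "question": "Fever present?",
    "yes": {
        "question": "Cough present?",
        "yes": "suspect respiratory infection",
        "no": "suspect other febrile illness",
    },
    "no": "no acute febrile illness suspected",
}

def walk(node, answers):
    """Follow the tree using a dict of question -> True/False answers."""
    while isinstance(node, dict):      # stop when a leaf (diagnosis label) is reached
        branch = "yes" if answers[node["question"]] else "no"
        node = node[branch]
    return node

print(walk(TREE, {"Fever present?": True, "Cough present?": False}))
```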

2.3. DISADVANTAGES OF LOGICAL METHODS
– CANNOT QUANTIFY SYMPTOM INTENSITY (e.g. high/moderate fever)
– CONSIDER ALL SYMPTOMS AS EQUALLY WEIGHTED FOR DIAGNOSIS
– SOME SYMPTOMS MIGHT NOT BE PRESENT
– DISREGARD DISEASE PREVALENCE

3. STATISTICAL METHODS
3.1. BAYES' RULE assumes as known:
– p(D+): probability of disease D within a population (disease probability ~ prevalence)
– p(S+/D+): probability of symptom S to occur IF disease D is present
– [ p(S+): probability to encounter symptom S, which may be estimated directly or computed from p(D+), p(S+/D+) and p(S+/D-) ]

b) For each pair D/S (Disease / Symptom) we build a 2x2 table:

       S+     S-
D+     n11    n12    R1
D-     n21    n22    R2
       C1     C2     N

c) PROBABILITIES:
– unconditional: p(D+) = R1 / N [probability of disease D to be present]
– conditional: p(S+/D-) = n21 / R2 [probability of symptom S to be present if disease D is absent]
d) BAYES' RULE:
P(D/S) = P(S/D) x P(D) / P(S)

e) Importance:
– possibility to compute p(D+/S+) without the table
f) For several symptoms:
– it can be applied only for INDEPENDENT symptoms
– independence test (chi-square)
g) Application:
P(S+/D+) = n11/R1, P(D+) = R1/N, P(S+) = C1/N
=> P(D+/S+) = P(S+/D+) x P(D+) / P(S+) = (n11/R1 x R1/N) / (C1/N) = n11/C1

Example
In a study, 3000 medical records were analyzed. 500 patients had virosis, and 400 of them presented fever. Fever was also present in another 600 patients. Calculate the probability that a patient:
A) had virosis
B) did not have fever
C) had fever, given that they did not have virosis
D) had virosis, given that they had fever
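A short sketch of the calculation, rebuilding the 2x2 table from the numbers in the example (N = 3000, 500 virosis cases of which 400 with fever, plus 600 febrile patients without virosis) and checking the Bayes' rule result against the direct table estimate:

```python
# 2x2 table from the example: D = virosis, S = fever.
n11 = 400                 # virosis and fever
n12 = 500 - 400           # virosis, no fever
n21 = 600                 # fever, no virosis
n22 = 3000 - 500 - 600    # neither
R1, R2 = n11 + n12, n21 + n22
C1, C2 = n11 + n21, n12 + n22
N = R1 + R2

p_D = R1 / N                   # A) p(virosis)            ~ 0.167
p_not_S = C2 / N               # B) p(no fever)           ~ 0.667
p_S_given_notD = n21 / R2      # C) p(fever | no virosis) = 0.24

# D) p(virosis | fever) via Bayes' rule, checked against the direct table value n11/C1:
p_S_given_D = n11 / R1
p_S = C1 / N
p_D_given_S = p_S_given_D * p_D / p_S        # = 0.4
assert abs(p_D_given_S - n11 / C1) < 1e-12

print(p_D, p_not_S, p_S_given_notD, p_D_given_S)
```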

3.2. PATTERN RECOGNITION
– Establishing a diagnosis as a RECOGNITION process
– Notion of PATTERN:
  - NAME OF A CLASS DISTINGUISHED BY A SET OF CHARACTERISTIC FEATURES
– Discriminant power of various features
– Process of RECOGNITION

c) CLASSIFICATION METHOD
– a set of classified objects is given (with their [numerical] characteristic features)
– a representation in a multidimensional space is made and each class is delimited
– Question: to which class does a new object belong?
– the new object is classified according to its position
– two working phases: learning (supervised) and classification
– advantage: similarity with real situations
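A minimal sketch of the classification method using a nearest-centroid decision rule on hypothetical two-feature data (one of several possible geometrical rules): the labelled objects define the classes in the learning phase, and a new object is assigned according to its position in the classification phase:

```python
import math

# Supervised learning phase: classified objects with numerical features (hypothetical data).
TRAINING = {
    "class A": [(1.0, 2.0), (1.2, 1.8), (0.9, 2.2)],
    "class B": [(4.0, 4.5), (4.2, 4.1), (3.8, 4.4)],
}

def centroid(points):
    """Mean point of a class, delimiting its region in feature space."""
    n = len(points)
    return tuple(sum(coord) / n for coord in zip(*points))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(x):
    """Classification phase: assign x to the class with the nearest centroid."""
    return min(CENTROIDS, key=lambda label: math.dist(x, CENTROIDS[label]))

print(classify((1.1, 2.1)))   # -> "class A"
```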

d) CLUSTERING METHOD
– a (large) set of unclassified objects is given
– an n-dimensional graphical representation is performed
– Question: can these objects be divided into different classes?
– two phases: learning (unsupervised) and classification (cluster defining)
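A minimal sketch of the clustering method with a plain k-means loop on hypothetical 2-D points (k = 2): the unsupervised learning phase groups the unclassified objects, and the resulting clusters then define the classes:

```python
import math
import random

def kmeans(points, k, iterations=20, seed=0):
    """Very small k-means: returns a list of cluster indices, one per point."""
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iterations):
        # assign each point to its nearest centre
        labels = [min(range(k), key=lambda j: math.dist(p, centers[j])) for p in points]
        # move each centre to the mean of its assigned points
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centers[j] = tuple(sum(c) / len(members) for c in zip(*members))
    return labels

POINTS = [(0.1, 0.2), (0.3, 0.1), (0.2, 0.4), (5.0, 5.1), (5.2, 4.9), (4.8, 5.3)]
print(kmeans(POINTS, k=2))    # e.g. [0, 0, 0, 1, 1, 1]
```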

e) BUILDING A PATTERN
FEATURE SELECTION
– class delimitation (disjoint classes)
– projection function
– methods:
  - vectorial: principal components analysis, discriminant analysis
  - structural: feature hierarchy
'CLASSIFIER' SYNTHESIS (decision function)
– geometrical / statistical / syntactic rules
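A minimal sketch of the vectorial feature-selection step with principal components analysis on a hypothetical data matrix: the original features are projected onto the directions of largest variance before the classifier is built.

```python
import numpy as np

# Hypothetical data matrix: rows = objects, columns = original features.
X = np.array([[2.5, 2.4, 0.5],
              [0.5, 0.7, 1.9],
              [2.2, 2.9, 0.4],
              [1.9, 2.2, 0.6],
              [3.1, 3.0, 0.3]])

# Principal components analysis: centre the data, take the eigenvectors of the
# covariance matrix, and keep the components with the largest variance.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]           # sort by decreasing variance
components = eigvecs[:, order[:2]]          # keep the 2 strongest directions

projected = Xc @ components                 # objects in the reduced feature space
print(projected.shape)                      # (5, 2)
```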

End