“Victor Babes” UNIVERSITY OF MEDICINE AND PHARMACY TIMISOARA DEPARTMENT OF MEDICAL INFORMATICS AND BIOPHYSICS Medical Informatics Division / 2008

MEDICAL DECISION SUPPORT (I) COURSE 11

1. MEDICAL DECISION
1.1. DIRECTIONS:
– COMPUTER ASSISTED DIAGNOSIS
– INVESTIGATION SELECTION
– THERAPY OPTIMISATION
– HEALTHCARE MANAGEMENT

a) COMPUTER ASSISTED DIAGNOSIS
– History

b) INVESTIGATION SELECTION
Investigations might be:
– Invasive
– Expensive

c) THERAPY OPTIMISATION
– Tumour treatment by radiation
– Drug treatment

d) HEALTHCARE MANAGEMENT
– Planning and use of resources
– Optimisation

1.2. ELEMENTARY CYCLE OF MEDICAL ACTIVITY

1.3. METHOD CLASSIFICATION
– a) LOGICAL
  - TRUTH (SYMPTOM) TABLES
  - DECISION TREES
– b) STATISTICAL
  - BAYES' RULE
  - PATTERN RECOGNITION
– c) HEURISTIC
  - EXPERT SYSTEMS

2. LOGICAL METHODS
2.1. CONSTRUCTIVE PRINCIPLES
a) Based on bivalent logic: knowledge represented symbolically by Yes/No (1/0)
b) Knowledge Base = Symptom table
c) Patient 'STATE VECTOR' (PAT)
d) Sequential comparison
e) Diagnoses list (sorted)

D = disease, S = symptom, PAT = patient state vector

        S1   S2   S3   Score
D1       …    …    …    …/8
D2       …    …    …    …/…
PAT      0    1    0
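A minimal sketch of the sequential comparison and scoring step, assuming a small invented symptom table and patient vector (the disease names and profiles are illustrative only, not course data):

```python
# Logical (symptom-table) method: each disease row is a Yes/No symptom profile;
# the patient state vector PAT is compared against every row and diseases are
# ranked by the fraction of matching symptoms.
symptom_table = {
    "D1": [1, 0, 1],   # expected symptom pattern for disease D1 (invented)
    "D2": [0, 1, 0],   # expected symptom pattern for disease D2 (invented)
}
pat = [0, 1, 0]        # patient state vector (PAT)

def score(profile, patient):
    """Fraction of symptoms on which the disease profile and the patient agree."""
    matches = sum(p == q for p, q in zip(profile, patient))
    return matches / len(profile)

ranking = sorted(((score(prof, pat), d) for d, prof in symptom_table.items()),
                 reverse=True)
for s, d in ranking:
    print(f"{d}: {s:.2f}")   # sorted diagnoses list: D2 first (1.00), then D1 (0.00)
```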

2.2. Types of logical methods
According to PAT vector construction:
A) Symptom tables (truth tables)
– Symptom selection from a menu
B) Decision trees (see the sketch after this list)
– Set of questions with Y/N answers
– Avoiding useless questions
– Patient involvement
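A minimal sketch of a decision tree driven by Yes/No answers; the questions and conclusions below are invented for illustration and are not clinical guidance:

```python
# A tiny hard-coded diagnostic decision tree: internal nodes ask a Y/N question,
# leaves hold a conclusion. Useless questions are avoided because only the
# branch matching each answer is followed.
tree = {
    "question": "Fever present?",
    "yes": {
        "question": "Cough present?",
        "yes": "Consider respiratory infection",
        "no":  "Consider other febrile illness",
    },
    "no": "Consider non-infectious causes",
}

def traverse(node, answers):
    """Walk the tree using an iterator of 'yes'/'no' answers; stop at a leaf."""
    while isinstance(node, dict):
        print(node["question"])
        node = node[next(answers)]
    return node

print(traverse(tree, iter(["yes", "no"])))   # -> "Consider other febrile illness"
```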

2.3. DISADVANTAGES OF LOGICAL METHODS
– CANNOT QUANTIFY SYMPTOM INTENSITY (e.g. high vs. moderate fever)
– CONSIDER ALL SYMPTOMS AS EQUALLY WEIGHTED FOR DIAGNOSIS
– SOME SYMPTOMS MIGHT NOT BE PRESENT
– DISREGARD DISEASE PREVALENCE

3. STATISTICAL METHODS
3.1. BAYES' RULE assumes as known:
– p(D+): probability of disease D within a population (disease probability ~ prevalence)
– p(S+/D+): probability of symptom S occurring IF disease D is present
– [p(S+): probability of encountering symptom S; may be computed or estimated from p(D+), p(S+/D+) and p(S+/D-) via total probability: p(S+) = p(S+/D+)·p(D+) + p(S+/D-)·p(D-)]

b) For each pair D/S (Disease / Symptom) we build a 2×2 table:

         S+     S-
D+      n11    n12    R1
D-      n21    n22    R2
         C1     C2     N

c) PROBABILITIES:
– unconditional: p(D+) = R1 / N  [probability of disease D being present]
– conditional: p(S+/D-) = n21 / R2  [probability of symptom S being present if disease D is absent]
d) BAYES' RULE:
P(D/S) = P(S/D) × P(D) / P(S)

e) Importance:
– possibility to compute p(D+/S+) without the table
f) For several symptoms:
– it can be applied only to INDEPENDENT symptoms
– independence test (chi-square)
g) Application:
P(S+/D+) = n11/R1, P(D+) = R1/N, P(S+) = C1/N
=> P(D+/S+) = P(S+/D+) × P(D+) / P(S+) = (n11/R1)·(R1/N) / (C1/N) = n11/C1

Example: In a study, 3000 medical records were analyzed. 500 patients had virosis, and 400 of them presented fever. Fever was also present in another 600 patients. Calculate the probability that a patient:
A) had virosis
B) did not have fever
C) had fever if they did not have virosis
D) had virosis if they had fever
(a worked numerical sketch follows below)
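A small worked computation of A)–D), assuming the counts stated in the example fill the 2×2 table as indicated in the comments:

```python
# Worked sketch of the example: N = 3000 records, 500 with virosis
# (400 of them febrile), fever in 600 patients without virosis.
N   = 3000
n11 = 400            # virosis and fever      (D+, S+)
n12 = 500 - 400      # virosis, no fever      (D+, S-)
n21 = 600            # fever, no virosis      (D-, S+)
n22 = N - 500 - 600  # no virosis, no fever   (D-, S-)
R1, R2 = n11 + n12, n21 + n22
C1, C2 = n11 + n21, n12 + n22

p_virosis        = R1 / N     # A) 500/3000
p_no_fever       = C2 / N     # B) 2000/3000
p_fever_given_no = n21 / R2   # C) 600/2500
# D) Bayes' rule: P(D+/S+) = P(S+/D+)·P(D+)/P(S+) = n11/C1
p_virosis_given_fever = (n11 / R1) * (R1 / N) / (C1 / N)

print(round(p_virosis, 3), round(p_no_fever, 3),
      round(p_fever_given_no, 3), round(p_virosis_given_fever, 3))
# -> 0.167 0.667 0.24 0.4
```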

3.2. PATTERN RECOGNITION
– Establishing a diagnosis as a RECOGNITION process
– Notion of PATTERN: NAME OF A CLASS DISTINGUISHED BY A SET OF CHARACTERISTIC FEATURES
– Discriminant power of various features
– Process of RECOGNITION

c) CLASSIFICATION METHOD
– a set of classified objects is given (with their [numerical] characteristic features)
– a representation in a multidimensional space is made and each class is delimited
– Question: to which class does a new object belong?
– the new object is classified according to its position
– two working phases (see the sketch after this list):
  - learning (supervised)
  - classification
– advantage: similarity with real situations
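A minimal sketch of the two working phases using a nearest-centroid rule (one simple geometric classifier, chosen here only as an illustration); the feature values and class labels are invented:

```python
import numpy as np

# --- learning (supervised): classified objects with numerical features ---
X_train = np.array([[1.0, 0.2], [1.2, 0.1], [4.8, 3.9], [5.1, 4.2]])
y_train = np.array(["class A", "class A", "class B", "class B"])

# delimit each class by its centroid in feature space
centroids = {c: X_train[y_train == c].mean(axis=0) for c in np.unique(y_train)}

# --- classification: a new object is assigned according to its position ---
def classify(x):
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

print(classify(np.array([4.5, 3.5])))   # -> "class B"
```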

d) CLUSTERING METHOD
– a (large) set of unclassified objects is given
– an n-dimensional graphical representation is performed
– Question: can these objects be divided into different classes? (see the sketch after this list)
– two phases:
  - learning (unsupervised)
  - classification (cluster definition)
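A minimal sketch of the unsupervised learning phase using k-means (one common clustering algorithm, used here only as an illustration); the data points are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (20, 2)),      # one group of points near (0, 0)
               rng.normal(5, 0.5, (20, 2))])     # another group near (5, 5)

k = 2
centers = X[[0, -1]].copy()                      # initialise with two of the data points
for _ in range(10):                              # a few refinement iterations
    # assign every point to its nearest centre, then move each centre
    labels = np.argmin(np.linalg.norm(X[:, None] - centers, axis=2), axis=1)
    centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])

print(centers.round(1))                          # two cluster centres, near (0, 0) and (5, 5)
```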

e) BUILDING A PATTERN
FEATURE SELECTION
– class delimitation (disjoint classes)
– projection function
– methods (a PCA sketch follows after this list):
  - vectorial: principal components analysis, discriminant analysis
  - structural: feature hierarchy
‘CLASSIFIER’ SYNTHESIS (decision function)
– geometrical / statistical / syntactic rules
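A minimal sketch of vectorial feature selection by principal components analysis: project the feature vectors onto the directions of largest variance and keep only the leading components. The feature matrix below is invented:

```python
import numpy as np

X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2],
              [3.1, 3.0], [2.3, 2.7], [2.0, 1.6], [1.0, 1.1]])

Xc = X - X.mean(axis=0)                    # centre the features
cov = np.cov(Xc, rowvar=False)             # covariance matrix of the features
eigvals, eigvecs = np.linalg.eigh(cov)     # eigen-decomposition (ascending eigenvalues)
order = np.argsort(eigvals)[::-1]          # sort components by explained variance
W = eigvecs[:, order[:1]]                  # keep the first principal component
X_reduced = Xc @ W                         # projected (1-D) feature values

print(X_reduced.round(2).ravel())
```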

End