Breast Cancer Diagnosis via Neural Network Classification
Jing Jiang
May 10, 2000

Outline
- Introduction and motivation
- k-means, k-nearest neighbor, and maximum likelihood classification
- Back-propagation multi-layer perceptron (BP-MLP)
- Support vector machine (SVM)
- Learning vector quantization (LVQ)
- Linear programming

Introduction and Motivation
The data file contains 30 attributes measured from both benign and malignant fine needle aspirates (FNAs). Our goals are to find a discriminating function that determines whether an unknown sample is benign or malignant, and to choose the pair of the 30 attributes best suited for diagnosis. Linear programming has done a good job on this problem; we expect that neural network classification algorithms can be useful here as well.
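
The original data file is not included in the transcript. As a hedged sketch, the same 30-attribute FNA measurements are distributed as the Wisconsin Diagnostic Breast Cancer (WDBC) dataset bundled with scikit-learn; the train/test split below is an illustrative assumption, since the original split is not specified.

```python
# Minimal loading sketch, assuming the WDBC dataset shipped with scikit-learn
# stands in for the data file described above (30 FNA-derived attributes).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X, y = data.data, data.target        # X: (569, 30); y: 0 = malignant, 1 = benign
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)   # split is an assumption
print(X_train.shape, X_test.shape)
```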

k-means
First we use the k-means algorithm to find the clusters in the training data set. Note that k-means does not give us a discriminating function: it clusters the data without using the class labels.
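
A minimal sketch of this clustering step, reusing the X_train array from the loading sketch above and assuming k = 2 for the two diagnostic classes:

```python
# k-means on the training attributes; labels play no role in the fit,
# which is why it yields clusters but no discriminating function.
from sklearn.cluster import KMeans

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_train)
print(km.cluster_centers_.shape)   # (2, 30): one centroid per cluster
print(km.labels_[:10])             # cluster index assigned to each sample
```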

KNN and ML
[Results for 100 nearest neighbors, for 20 nearest neighbors, and for the maximum likelihood algorithm were shown as figures; they are not preserved in the transcript.]
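
A sketch of how these runs could look on the same split. "ML" is assumed here to mean a Gaussian class-conditional maximum-likelihood classifier (equivalent to quadratic discriminant analysis), which is one common reading; the slide does not define it.

```python
# k-NN at the two neighborhood sizes named on the slide, plus a Gaussian
# maximum-likelihood classifier as a stand-in for the ML method.
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

for k in (100, 20):
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    print(f"{k}-NN test accuracy:", knn.score(X_test, y_test))

ml = QuadraticDiscriminantAnalysis().fit(X_train, y_train)
print("Gaussian ML test accuracy:", ml.score(X_test, y_test))
```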

BP-MLP
After careful choice of network parameters, we get the same confusion matrix (Cmat) and classification rate (C-rate) for the 30-attribute problem and for any 2-attribute problem. It is interesting to note that they are the same as the results we get with the ML method. The low classification rate may be due to the data not being linearly separable.
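
A sketch of a back-propagation MLP on the same split; the hidden-layer size and iteration budget below are illustrative guesses, not the "careful choice of network parameters" from the original experiment.

```python
# Back-propagation multi-layer perceptron; reports the C-rate and Cmat
# quantities referred to on the slide.
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                    random_state=0).fit(X_train, y_train)
print("C-rate:", mlp.score(X_test, y_test))            # classification rate
print("Cmat:\n", confusion_matrix(y_test, mlp.predict(X_test)))
```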

Support Vector Machine
For attributes 1 and 23, we have 6 errors in testing; for attributes 14 and 28, we have 8 errors. It takes a long time to train an SVM on the 30-attribute problem, and even the 2-attribute problems are time consuming.
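
A sketch of the two-attribute SVM runs, assuming a linear kernel (the original kernel is not stated) and using 0-based column indices as stand-ins for the 1-based attribute numbers on the slide:

```python
# SVM on the two attribute pairs from the slide; columns (0, 22) and
# (13, 27) correspond to attributes (1, 23) and (14, 28), 1-based.
from sklearn.svm import SVC

for cols in ((0, 22), (13, 27)):
    svm = SVC(kernel="linear").fit(X_train[:, cols], y_train)
    errors = (svm.predict(X_test[:, cols]) != y_test).sum()
    print(f"attributes {cols}: {errors} test errors")
```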

LVQ
Using LVQ on attributes 1 and 23, the number of errors is 8; for attributes 14 and 18, we have 25 errors. Training is faster than for the SVM, but so far we are only able to handle the 2-attribute problems, not the 30-attribute problem.
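
scikit-learn has no built-in LVQ, so here is a minimal LVQ1 sketch with one prototype per class, initialized at the class means; the learning rate, epoch count, and column choice are illustrative assumptions.

```python
# Minimal LVQ1: nearest prototype is attracted toward correctly labeled
# samples and repelled from incorrectly labeled ones.
import numpy as np

def lvq1(X, y, lr=0.01, epochs=20):
    classes = np.unique(y)
    protos = np.array([X[y == c].mean(axis=0) for c in classes])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            j = np.argmin(((protos - xi) ** 2).sum(axis=1))  # nearest prototype
            step = lr if classes[j] == yi else -lr           # attract or repel
            protos[j] += step * (xi - protos[j])
    return protos, classes

cols = [0, 22]                              # stand-ins for attributes 1 and 23
protos, classes = lvq1(X_train[:, cols], y_train)
d = ((X_test[:, cols][:, None, :] - protos[None, :, :]) ** 2).sum(axis=2)
pred = classes[np.argmin(d, axis=1)]
print("LVQ test errors:", (pred != y_test).sum())
```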

LVQ: Training Data and Weights
[Figure of the training data and LVQ prototype weights; not preserved in the transcript.]

Linear Program
The algorithm used is similar to the SVM, but simpler: we devise a separating plane and try to minimize the error. For the 30-attribute problem we have only 3 errors; for 2 attributes, the best combination gives 2 errors.
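
A sketch of the separating-plane idea in the spirit of Mangasarian's robust LP formulation (the exact formulation used on the slide is not given): find a plane w.x = gamma with the two classes pushed to opposite sides, minimizing the average slack on each side.

```python
# LP separating plane: require A.w >= gamma + 1 - u and B.w <= gamma - 1 + v
# with slacks u, v >= 0, and minimize mean(u) + mean(v).
import numpy as np
from scipy.optimize import linprog

A = X_train[y_train == 1]               # one class
B = X_train[y_train == 0]               # the other class
n, mA, mB = A.shape[1], len(A), len(B)

# variable vector: [w (n), gamma (1), u (mA), v (mB)]
c = np.concatenate([np.zeros(n + 1), np.full(mA, 1 / mA), np.full(mB, 1 / mB)])
A_ub = np.zeros((mA + mB, n + 1 + mA + mB))
A_ub[:mA, :n] = -A;  A_ub[:mA, n] = 1;  A_ub[:mA, n + 1:n + 1 + mA] = -np.eye(mA)
A_ub[mA:, :n] = B;   A_ub[mA:, n] = -1; A_ub[mA:, n + 1 + mA:] = -np.eye(mB)
b_ub = -np.ones(mA + mB)
bounds = [(None, None)] * (n + 1) + [(0, None)] * (mA + mB)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
w, gamma = res.x[:n], res.x[n]
errors = ((X_test @ w - gamma > 0).astype(int) != y_test).sum()
print("LP test errors:", errors)
```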

Conclusion
- We tried various neural network classification algorithms. So far, the simpler linear programming approach seems to give the best results; more exploration needs to be done.
- BP is not very good at dealing with non-separable data.
- SVM is a good candidate, but takes a long time to train.
- LVQ is comparable to SVM.
- One question remains to be answered: why does the maximum likelihood method give the same result as BP?