
WELCOME

Malay Mitra, Lecturer in Computer Science & Application, Jalpaiguri Polytechnic, West Bengal

A Neural Network Based Intelligent System for Breast Cancer Diagnosis
Supervised by: Prof. (Dr.) Ranjit Kumar Samanta, Department of Computer Sc. & Appl., University of North Bengal, West Bengal, India.

Why Breast Cancer?
According to the ACS, there were 229,060 new breast cancer cases in the US alone in one year. Breast cancer ranks second as a cause of death in women, after lung cancer. One of the reasons for the survival of 2.5 million breast cancer patients in the US is early diagnosis.

Different methods for detecting breast cancer:
Biopsy: accuracy 100%
Mammography: accuracy 68% to 79%
FNAC: accuracy 65% to 98%

Points noted from previous studies
Much of the work reporting high classification accuracy is based on hybrid approaches.
Neural networks and SVMs in various combinations (e.g. LS, F-score, RS) lead to higher classification accuracy.
In some work it is not clear whether the reported accuracy is the best single simulation result or the average over several simulations.

Wisconsin Breast Cancer Database
Wisconsin Breast Cancer data description (attribute and domain):
1. Sample code number (id number)
2. Clump thickness (1-10)
3. Uniformity of cell size (1-10)
4. Uniformity of cell shape (1-10)
5. Marginal adhesion (1-10)
6. Single epithelial cell size (1-10)
7. Bare nuclei (1-10)
8. Bland chromatin (1-10)
9. Normal nucleoli (1-10)
10. Mitosis (1-10)
11. Class (2 for benign, 4 for malignant)
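To make the attribute list concrete, the following is a minimal pandas sketch for loading the original database; it assumes the UCI file breast-cancer-wisconsin.data has been downloaded locally, and the column names simply mirror the attribute list above ('?' marks the missing values in the bare nuclei column).

```python
import pandas as pd

# Attribute names as listed above (original Wisconsin Breast Cancer Database).
columns = [
    "sample_code_number", "clump_thickness", "uniformity_of_cell_size",
    "uniformity_of_cell_shape", "marginal_adhesion", "single_epithelial_cell_size",
    "bare_nuclei", "bland_chromatin", "normal_nucleoli", "mitosis", "class",
]

# The UCI file is comma-separated, with '?' marking missing values.
df = pd.read_csv("breast-cancer-wisconsin.data", names=columns, na_values="?")

print(df.shape)                    # 699 records before cleaning
print(df["class"].value_counts())  # 2 = benign, 4 = malignant
```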

Feature Extraction and Reduction
CFS (Correlation-based Feature Selection)
AR (Association Rules)
RS (Rough Sets)

Correlation-based feature subset selection
The merit of a subset of k features is

r_zc = k · r̄_zi / √(k + k(k−1) · r̄_ii)

where r_zc is the correlation between the summed features and the class variable, k is the number of features, r̄_zi is the average correlation between the features and the class variable, and r̄_ii is the average inter-correlation between the features.
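As a rough illustration of the merit measure, the sketch below scores a candidate subset of numeric features with Pearson correlations; note that the original CFS uses symmetrical uncertainty on discretised attributes, so this is only an approximation of the idea.

```python
import numpy as np

def cfs_merit(X, y):
    """Approximate CFS merit of the feature subset X (n_samples x k) w.r.t. class y."""
    k = X.shape[1]
    # Average absolute feature-class correlation (r_zi).
    r_zi = np.mean([abs(np.corrcoef(X[:, i], y)[0, 1]) for i in range(k)])
    # Average absolute feature-feature inter-correlation (r_ii).
    if k > 1:
        pairs = [abs(np.corrcoef(X[:, i], X[:, j])[0, 1])
                 for i in range(k) for j in range(i + 1, k)]
        r_ii = np.mean(pairs)
    else:
        r_ii = 0.0
    return k * r_zi / np.sqrt(k + k * (k - 1) * r_ii)
```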

Rough Set
Let I = (U, A) be an information system, where U is the universe of discourse (a nonempty set of objects) and A is a nonempty set of attributes. For any P ⊆ A, the equivalence relation IND(P) is called the P-indiscernibility relation. A target set X ⊆ U can be approximated by its P-lower approximation P_*(X) and its P-upper approximation P^*(X). The accuracy of the rough-set approximation is α_P(X) = |P_*(X)| / |P^*(X)|.
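A small sketch of these definitions on a toy information system; the object and attribute names used here are purely illustrative.

```python
from collections import defaultdict

def approximations(objects, P, X):
    """P-lower and P-upper approximations of target set X in information system (U, A).

    objects: dict mapping each object in U to a dict of attribute values.
    P: subset of attributes defining the indiscernibility relation; X: target subset of U.
    """
    # Group objects into equivalence classes of the P-indiscernibility relation.
    classes = defaultdict(set)
    for obj, values in objects.items():
        classes[tuple(values[a] for a in P)].add(obj)

    lower, upper = set(), set()
    for eq_class in classes.values():
        if eq_class <= X:   # wholly contained in X: goes into the lower approximation
            lower |= eq_class
        if eq_class & X:    # intersects X: goes into the upper approximation
            upper |= eq_class
    accuracy = len(lower) / len(upper) if upper else 1.0
    return lower, upper, accuracy

# Toy example: two attributes, three objects, X = the "malignant" cases.
U = {
    "p1": {"clump": "high", "bare": "high"},
    "p2": {"clump": "high", "bare": "high"},
    "p3": {"clump": "low", "bare": "low"},
}
print(approximations(U, ["clump", "bare"], {"p1", "p3"}))  # lower {p3}, upper {p1,p2,p3}, accuracy 1/3
```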

Artificial Neural Network (ANN)
McCulloch-Pitts model of a neuron (schematic): the feature vector as inputs, feature weights as synapses, a bias term, a summing part that produces the activation value, and an output function that produces the output.
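A minimal sketch of this neuron model, using the logistic output function adopted later in the network design; the feature vector, weights and bias shown are illustrative.

```python
import numpy as np

def logistic(a):
    """Logistic output function f(a) = 1 / (1 + e^-a)."""
    return 1.0 / (1.0 + np.exp(-a))

def neuron(x, w, b):
    """Single neuron: summing part (w.x + b) followed by the output function."""
    activation = np.dot(w, x) + b   # activation value
    return logistic(activation)     # output

# Illustrative feature vector, weights and bias.
x = np.array([5.0, 1.0, 3.0, 2.0, 1.0])
w = np.array([0.2, -0.5, 0.1, 0.4, 0.3])
print(neuron(x, w, b=-0.5))
```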

Modeling with ANN
Modeling with an ANN involves:
i) designing the network
ii) training the network
Design of the network involves:
i) fixing the number of layers
ii) fixing the number of neurons in each layer
iii) choosing the node function for each neuron
iv) choosing the form of the network, whether feed-forward or feedback type
v) choosing the connectivity pattern between the layers and neurons.
The training phase involves:
i) adjusting the weights as well as the threshold values from a set of training examples.

Levenberg-Marquardt (LM) algorithm
During the iterations, the new configuration of weights at step k+1 is

w_(k+1) = w_k − (JᵀJ + μI)⁻¹ Jᵀ e

where J is the Jacobian matrix, μ is the adjustable (damping) parameter, and e is the error vector.
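A sketch of a single LM weight update under the assumption that the Jacobian J of the error vector e with respect to the weights is already available; the damping parameter mu is kept fixed here, whereas a full implementation would adapt it between iterations.

```python
import numpy as np

def lm_step(w, J, e, mu):
    """One Levenberg-Marquardt update: w_new = w - (J^T J + mu I)^-1 J^T e.

    w: current weight vector, J: Jacobian of e w.r.t. w, e: error vector, mu: damping.
    """
    JTJ = J.T @ J
    g = J.T @ e                                               # gradient-like term
    delta = np.linalg.solve(JTJ + mu * np.eye(len(w)), g)     # damped Gauss-Newton step
    return w - delta
```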

Applications
The schematic view of our system:
Wisconsin Breast Cancer Database (original) → feature extraction and reduction using CFS and RS → classification using the two combinations → decision space: 2 = benign, 4 = malignant.

Data Preprocessing
We completely randomize the dataset after discarding the records with missing values. There are no outliers in our dataset. The dataset is partitioned into three sets:
1. Training set (68%)
2. Validation set (16%)
3. Test set (16%)
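A minimal sketch of this preprocessing with pandas, reusing the column names assumed earlier; the file name and random seed are illustrative.

```python
import pandas as pd

columns = ["sample_code_number", "clump_thickness", "uniformity_of_cell_size",
           "uniformity_of_cell_shape", "marginal_adhesion", "single_epithelial_cell_size",
           "bare_nuclei", "bland_chromatin", "normal_nucleoli", "mitosis", "class"]
df = pd.read_csv("breast-cancer-wisconsin.data", names=columns, na_values="?")

# Discard records with missing values, then completely randomize the order.
clean = df.dropna().sample(frac=1.0, random_state=42).reset_index(drop=True)

# 68% / 16% / 16% split into training, validation and test sets.
n = len(clean)
n_train, n_valid = int(0.68 * n), int(0.16 * n)
train = clean.iloc[:n_train]
valid = clean.iloc[n_train:n_train + n_valid]
test = clean.iloc[n_train + n_valid:]
print(len(train), len(valid), len(test))
```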

Feature Selection & Extraction
The reduced feature sets after applying CFS and RS:
Reduced attributes (CFS): clump thickness, uniformity of cell size, bare nuclei, bland chromatin, normal nucleoli.
Reduced attributes (RS): clump thickness, uniformity of cell shape, bare nuclei, marginal adhesion, mitosis.

Network Architecture
i) This work uses a logistic function of the form f(x) = 1/(1 + e^(−x)) in the hidden and output nodes.
ii) This work uses one input layer, one hidden layer and one output layer.
iii) The number of neurons in the hidden layer is evaluated from the formula proposed by Gao:
s = √(a1·m² + a2·mn + a3·n² + a4·m + a5·n + a6) + a7
where s is the number of hidden neurons, m the number of inputs, n the number of outputs, and a1 to a7 are undetermined coefficients. Using LMS, Huang derived the formula:
s = √(0.43mn + 0.12n² + 2.54m + 0.77n + 0.35) + 0.51
In this study m = 5 and n = 1, and hence s = 5 (after rounding off).
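A small sketch that evaluates s from the formula above for m = 5 and n = 1, and builds a one-hidden-layer network with logistic activations. scikit-learn is used here only for illustration: it has no Levenberg-Marquardt trainer, so lbfgs stands in for LM.

```python
import math
from sklearn.neural_network import MLPClassifier

def hidden_neurons(m, n):
    """Huang's formula for the number of hidden-layer neurons."""
    return round(math.sqrt(0.43 * m * n + 0.12 * n ** 2 + 2.54 * m + 0.77 * n + 0.35) + 0.51)

m, n = 5, 1                 # 5 reduced input features, 1 output (benign/malignant)
s = hidden_neurons(m, n)
print(s)                    # 5 after rounding, as in the slide

# One input layer, one hidden layer of s logistic neurons, one output node.
# Note: scikit-learn has no Levenberg-Marquardt solver, so 'lbfgs' is a stand-in.
net = MLPClassifier(hidden_layer_sizes=(s,), activation="logistic", solver="lbfgs",
                    max_iter=1000, random_state=0)
```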

Modeling Results
WEKA was used for feature set reduction using CFS, and RSES was used for feature set reduction using RS. The classifiers based on these two combinations were implemented in Alyuda NeuroIntelligence.
For both classifier types (CFS+LM and RS+LM) the network structure was 5-5-1 (input, hidden and output layers) with 10 retrains per run, and the patterns were divided into training, validation and test sets as described above.

Performance Evaluation Method
As performance measures we compute the correct classification rate (CCR), specificity, sensitivity and the area under the ROC curve (AUC).
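A sketch of how these measures can be computed from a test set's true class labels (2 = benign, 4 = malignant) and the network's predicted scores; the threshold and the score array are illustrative.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

def evaluate(y_true, y_score, threshold=0.5):
    """CCR, sensitivity, specificity and AUC, with 4 (malignant) as the positive class."""
    y_pred = np.where(np.asarray(y_score) >= threshold, 4, 2)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[2, 4]).ravel()
    ccr = (tp + tn) / (tp + tn + fp + fn) * 100       # correct classification rate, %
    sensitivity = tp / (tp + fn)                      # true positive rate
    specificity = tn / (tn + fp)                      # true negative rate
    auc = roc_auc_score((np.asarray(y_true) == 4).astype(int), y_score)
    return ccr, sensitivity, specificity, auc
```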

Experimental Results
The compiled results of 120 simulations report, for each method, the highest value (with its frequency), the lowest value (with its frequency) and the average of the test-set CCR (%), specificity, sensitivity and AUC.
CFS + LM: highest test-set CCR 100% (6 runs), lowest 94.29% (4 runs); highest specificity reached in 19 runs, lowest in 1 run; highest sensitivity reached in 38 runs, lowest in 1 run; highest AUC reached in 10 runs, lowest AUC 94 (1 run).
RS + LM: highest test-set CCR 100% (3 runs), lowest 94.33% (1 run); highest specificity reached in 16 runs, lowest in 1 run; highest sensitivity reached in 40 runs, lowest in 1 run; highest AUC reached in 5 runs, lowest AUC 93 (1 run), average AUC 99.11.

Observations
Of the two methods, CFS+LM shows better performance in terms of CCR, sensitivity, specificity and AUC. Our methods provide 100% CCR as the highest performance, which is comparable to other studies. The lowest CCR is 94.29%.

Conclusion
This work presents the highest, lowest and average behavior of the methods used. It provides better results than those obtained in much of the previous work. It is proposed that the CFS-derived feature set would be worthwhile when the final decision is made by doctors. Moreover, the highest, lowest and average performance of a decision support system (DSS) should be judged by a user of the system before it is put to use.

Thank You