Artificial Intelligence Project 1 Neural Networks Biointelligence Lab School of Computer Sci. & Eng. Seoul National University.


(C) SNU CSE BioIntelligence Lab

Outline: Classification Problems
- Task 1: Estimate several statistics on the Diabetes data set.
- Task 2: Given an unknown data set, achieve performance as good as you can. The test data is hidden.

Network Structure (1)
A network with two output units, one per class (positive, negative). If f_pos(x) > f_neg(x), then x is classified as positive.

Network Structure (2)
A network with a single output unit. If f(x) > thres, then x is classified as positive.
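The single-output decision rule (f(x) > thres implies positive) can be sketched as a tiny forward pass. The network shape and weights below are hypothetical, chosen only to illustrate the thresholding step, not the architecture you must use:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(x, w_hidden, w_out, thres=0.5):
    # Forward pass of a one-hidden-layer network with sigmoid units.
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(w, x))) for w in w_hidden]
    f_x = sigmoid(sum(wo * h for wo, h in zip(w_out, hidden)))
    # Decision rule from the slide: f(x) > thres -> positive (1), else negative (0).
    return 1 if f_x > thres else 0
```

In practice you would learn w_hidden and w_out by backpropagation; only the final thresholding line is the rule the slide describes.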

Medical Diagnosis: Diabetes

Pima Indian Diabetes Data (768 examples)
- 8 attributes:
  - Number of times pregnant
  - Plasma glucose concentration in an oral glucose tolerance test
  - Diastolic blood pressure (mm Hg)
  - Triceps skin fold thickness (mm)
  - 2-hour serum insulin (mu U/ml)
  - Body mass index (kg/m^2)
  - Diabetes pedigree function
  - Age (years)
- Positive: 268, negative: 500

Report (1/4): Number of Epochs

Report (2/4): Number of Hidden Units
- At least 10 runs for each setting.
- Report a table like the following:

  # Hidden Units | Train: Average ± SD | Best | Worst | Test: Average ± SD | Best | Worst
  Setting 1      |
  Setting 2      |
  Setting 3      |
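The "Average ± SD" columns can be filled by collecting the accuracy of each of the 10 runs and computing the mean and standard deviation. A minimal helper, assuming accuracies are plain floats and using the population SD (sample SD is an equally reasonable choice):

```python
def mean_sd(values):
    # Mean and population standard deviation over repeated runs,
    # for the "Average ± SD" columns of the report table.
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return mean, sd
```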

Report (3/4)

Report (4/4)
- Normalization method you applied.
- Other parameter settings:
  - Learning rates
  - Threshold value with which you predict an example as positive: if f(x) > thres, classify it as positive; otherwise, negative.
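As one example of a normalization method you might report, here is per-attribute min-max scaling to [0, 1]; the slides leave the choice open, so z-score standardization would be an equally valid alternative:

```python
def min_max_normalize(column):
    # Scale one attribute column to [0, 1].
    # One possible normalization; the assignment only asks you to
    # report whichever method you actually used.
    lo, hi = min(column), max(column)
    if hi == lo:
        return [0.0] * len(column)  # constant column carries no information
    return [(v - lo) / (hi - lo) for v in column]
```

Apply it column by column, since attributes such as age and serum insulin are on very different scales.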

Challenge (1): Unknown Data
- Data for you: 2000 examples (Pos: 1000, Neg: 1000)
- Test data: 600 examples (Pos: 300, Neg: 300); labels are HIDDEN!

Challenge (2): Data
- Train.data: 2000 x 500 (2000 examples, 500 dimensions)
- Train.labels: positive 1, negative 0
- Test.data: 600 x 500 (600 examples, 500 dimensions)
- Test.labels: not given to you.
- Verify your NN at

Challenge (3): K-fold Cross Validation
- The data set is randomly divided into k subsets.
- One of the k subsets is used as the test set, and the other k-1 subsets are put together to form a training set.
[Figure: the 2000 training examples split into 10 folds (D1 ... D10) of 200 examples each; each round holds out a different fold as the test set.]
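The splitting step described above can be sketched as follows; the fold size of 200 comes from the 2000-example training set with k = 10 shown in the slide's diagram:

```python
import random

def k_fold_splits(n_examples, k, seed=0):
    # Randomly partition example indices into k subsets; each fold uses
    # one subset as the test set and the remaining k-1 as training data.
    idx = list(range(n_examples))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    splits = []
    for i in range(k):
        test = folds[i]
        train = [j for f_i, f in enumerate(folds) if f_i != i for j in f]
        splits.append((train, test))
    return splits
```

Each of the k (train, test) index pairs trains a fresh network, and the k test accuracies are averaged to estimate generalization performance.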

Challenge (4)
Include the following in your report:
- The best performance you achieved.
- The spec of your NN when achieving that performance:
  - Structure of the NN
  - Learning epochs
  - Your techniques
  - Other remarks…
- Confusion matrix (rows: true class, columns: predicted class):

                  | Predicted Positive | Predicted Negative
  True Positive   |                    |
  True Negative   |                    |
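The confusion matrix cells, and the accuracy derived from them, can be computed with a small helper, assuming the label encoding from the data description (positive = 1, negative = 0):

```python
def confusion_matrix(y_true, y_pred):
    # Returns (TP, FN, FP, TN) with positive = 1, negative = 0,
    # matching the Train.labels encoding in the slides.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fn, fp, tn

def accuracy(y_true, y_pred):
    tp, fn, fp, tn = confusion_matrix(y_true, y_pred)
    return (tp + tn) / len(y_true)
```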

References
- Source codes
  - Free software
  - NN libraries (C, C++, Java, …)
  - MATLAB toolbox
  - Weka
- Web sites

Pay Attention!
Due (October 14, 2004): until 11:59 PM
Submission:
- Results obtained from your experiments
  - Compress the data
  - Via
- Report: hardcopy!!
  - Software used and running environments
  - Results from many experiments with various parameter settings
  - Analysis and explanation of the results in your own way

Optional Experiments
- Various learning rates
- Number of hidden layers
- Different k values
- Output encoding