PSMS for Neural Networks on the Agnostic vs Prior Knowledge Challenge. Hugo Jair Escalante, Manuel Montes and Enrique Sucar, Computer Science Department, INAOE.

Presentation transcript:

PSMS for Neural Networks on the Agnostic vs Prior Knowledge Challenge. Hugo Jair Escalante, Manuel Montes and Enrique Sucar. Computer Science Department, Instituto Nacional de Astrofísica, Óptica y Electrónica (INAOE), México. IJCNN-2007 ALvsPK Challenge, Orlando, Florida, August 17, 2007.

Outline
– Introduction
– Particle swarm optimization
– Particle swarm model selection
– Results
– Conclusions

Introduction: model selection
Agnostic learning
– General-purpose methods
– No knowledge of the task at hand or of machine learning is required
Prior knowledge
– Prior knowledge can increase a model's accuracy
– Domain expertise is needed

Introduction
Problem: given a set of preprocessing methods, feature selection methods, and learning algorithms (CLOP), select the best combination of them, together with their hyperparameters.
Solution: a bio-inspired search strategy (PSO), modeled on bird flocking and fish schooling.
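
Stated slightly more formally (our notation, not the slides'), full model selection searches for

$$m^{*} = \operatorname*{arg\,min}_{m \in \mathcal{M}} \widehat{\mathrm{BER}}_{\mathrm{CV}}(m)$$

where $\mathcal{M}$ is the space of all valid CLOP combinations (preprocessing chain, feature selection method, classifier) together with their hyperparameter settings, and $\widehat{\mathrm{BER}}_{\mathrm{CV}}(m)$ is the cross-validated balanced error rate of model $m$, defined below.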

Particle swarm optimization (PSO)
– A population of individuals (the swarm) is created
– Each individual (particle) represents a solution to the problem at hand
– Particles fly through the search space, guided by the best global and individual solutions found so far
– A fitness function is used for evaluating solutions

Particle swarm optimization (PSO)
Begin
  Initialize swarm
  Locate leader (p_g)
  it = 0
  While it < max_it
    For each particle
      Update position (Eq. 2)
      Evaluate fitness
      Update particle's best (p)
    EndFor
    Update leader (p_g)
    it = it + 1
  EndWhile
End
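
A minimal, runnable sketch of this loop in Python is below; p is each particle's personal best and g the leader (p_g) from the pseudocode. The inertia weight w and acceleration constants c1, c2 are common textbook defaults, not necessarily the paper's exact settings, and the velocity/position update stands in for "Eq. 2" of the original slides.

```python
import numpy as np

def pso(fitness, dim, n_particles=10, max_it=100,
        w=0.5, c1=2.0, c2=2.0, bounds=(-1.0, 1.0), seed=0):
    """Minimize `fitness` with a basic global-best PSO."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))  # positions
    v = np.zeros((n_particles, dim))                  # velocities
    p, p_fit = x.copy(), np.array([fitness(xi) for xi in x])  # personal bests
    g, g_fit = p[p_fit.argmin()].copy(), p_fit.min()          # leader (global best)
    for it in range(max_it):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x)  # velocity update
        x = np.clip(x + v, lo, hi)                         # position update
        f = np.array([fitness(xi) for xi in x])            # evaluate fitness
        better = f < p_fit                                 # update personal bests
        p[better], p_fit[better] = x[better], f[better]
        if p_fit.min() < g_fit:                            # update leader
            g, g_fit = p[p_fit.argmin()].copy(), p_fit.min()
    return g, g_fit
```

As a sanity check, pso(lambda x: float(np.sum(x**2)), dim=5) should drive the swarm toward the origin.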

PSO for model selection (PSMS)
– Each particle encodes a CLOP model
– The cross-validation balanced error rate (BER) is used as the fitness for evaluating models
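
For reference, the balanced error rate on a binary task is the average of the per-class error rates (the standard challenge definition, written here in our notation):

$$\mathrm{BER} = \frac{1}{2}\left(\frac{FP}{FP + TN} + \frac{FN}{FN + TP}\right)$$

PSMS averages this quantity over the cross-validation folds.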

Experimental settings
– Standard parameters for PSO
– 10 particles per swarm
– PSMS applied to ADA, GINA, HIVA and SYLVA
– 5-fold cross-validation was used
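
Since CLOP is a MATLAB package, the following is only an illustrative stand-in: a hypothetical Python/scikit-learn fitness function that decodes a two-dimensional particle into a small neural network (number of hidden units and a log-scaled L2 penalty) and scores it with 5-fold cross-validated BER, mirroring the settings above. The decoding scheme and parameter ranges are assumptions for the sketch, not PSMS's actual encoding.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

def cv_ber(particle, X, y, n_folds=5, seed=0):
    """Decode a particle in [-1, 1]^2 into a small neural net; return its CV BER."""
    u, a = particle
    n_hidden = 1 + int(round(4.5 * (u + 1.0)))  # map [-1, 1] -> 1..10 hidden units
    alpha = 10.0 ** (3.0 * a)                   # map [-1, 1] -> 1e-3..1e3 L2 penalty
    clf = MLPClassifier(hidden_layer_sizes=(n_hidden,), alpha=alpha,
                        max_iter=500, random_state=seed)
    folds = StratifiedKFold(n_splits=n_folds, shuffle=True, random_state=seed)
    bers = []
    for tr, te in folds.split(X, y):
        clf.fit(X[tr], y[tr])
        tn, fp, fn, tp = confusion_matrix(y[te], clf.predict(X[te])).ravel()
        bers.append(0.5 * (fp / (fp + tn) + fn / (fn + tp)))  # per-fold BER
    return float(np.mean(bers))
```

Combined with the pso sketch above, a 10-particle search over this toy space would be pso(lambda p: cv_ber(p, X, y), dim=2, n_particles=10) for some binary dataset (X, y).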

Results up to March 1st (entry Corrida_final)
– 500 iterations for ADA
– 100 iterations for HIVA and GINA
– 50 iterations for SYLVA
– Trial and error for NOVA

Results up to March 1st
[Table: agnostic-learning best-ranked entries as of March 1st, 2007]
– The best average BER was still held by the reference entry "the bad" (Gavin Cawley)
– The best entry for each dataset is not necessarily the best entry overall
– Some of the best agnostic entries on individual datasets were made as part of prior-knowledge entries (the bottom four); there is no corresponding overall agnostic ranking

Results up to March 1st
[Figures: results after 100 iterations, and after 500 iterations (ADA)]

Results up to August 1st
– ADA, entry Corrida_final_10CV: chain({standardize({'center=0'}), normalize({'center=1'}), shift_n_scale({'take_log=0'}), neural({'units=5', 'shrinkage=1.4323', 'balance=0', 'maxiter=257'}), bias})
– GINA, entry AdaBoost*: chain({normalize({'center=0'}), svc({'coef0=0.1', 'degree=5', 'gamma=0', 'shrinkage=0.01'}), bias})
– HIVA, entry Corrida_final: chain({standardize({'center=1'}), normalize({'center=0'}), neural({'units=5', 'shrinkage=3.028', 'balance=0', 'maxiter=448'}), bias})
– NOVA, entry AdaBoost*: chain({normalize({'center=0'}), gentleboost(neural({'units=1', 'shrinkage=0.2', 'balance=1', 'maxiter=50'}), {'units=10', 'rejNum=3'}), bias})
– SYLVA, entry PSMS_100_4all_NCV: chain({standardize({'center=0'}), normalize({'center=0'}), shift_n_scale({'center=1'}), neural({'units=8', 'shrinkage=1.2853', 'balance=0', 'maxiter=362'}), bias})
– Overall, entry PSMS_100_4all_NCV: same models as Corrida_final except for SYLVA's model
* Models selected by trial and error

Results up to August 1st
[Table: agnostic-learning best-ranked entries as of August 1st, 2007]
– The best average BER was still held by the reference entry "the bad" (Gavin Cawley)
– The best entry for each dataset is not necessarily the best entry overall
– The blue-shaded entries did not count towards the prize (participant was part of a group or did not wish to be identified)

Results up to August 1st
[Figures: test BER and test AUC]

Conclusions
– Competitive and simple models are obtained with PSMS
– No knowledge of the problem at hand nor of machine learning is required
– PSMS is easy to implement
– It suffers from the same limitations as other search algorithms, such as computational cost and the risk of overfitting the selection criterion