Robot Recognition of Complex Swarm Behaviors


Robot Recognition of Complex Swarm Behaviors — Aisha Walcott, MAS622J, Dec. 11, 2006

Introduction (Dispersion and Orbit videos courtesy of James McLurkin)
A swarm is a large collection of autonomous mobile robots with no centralized control. Group behaviors are produced from the local interactions of many individual robots. The goal is to develop a suite of primitive global behaviors that combine to form more complex group programs.

Speaker notes: The goal of the Swarm project is to develop techniques for programming a large swarm of autonomous mobile robots. Each robot is autonomous and there is no central controller; instead, group behaviors are produced from the local interactions of many individual robots. A programming system for the swarm has to reverse this process: given a desired group behavior, what combination of local behaviors is required? Our approach is to design primitive global behaviors that can be recombined to form more complex group programs.
Videos: http://people.csail.mit.edu/jamesm/presentations/SwarmInANutshell/swarm3%20disperse.mpg and http://people.csail.mit.edu/jamesm/presentations/SwarmInANutshell/swarm3%20orbit.mpg

Project Goal
Build multiple classifiers to classify complex swarm behaviors: Disperse, Orbit, Cluster, and Bubble Sort.
[Figure: example features, contrasting high-source and low-source readings; details are not recoverable from the transcript.]
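The slides do not spell out the eight features, so the sketch below only illustrates the kind of per-snapshot swarm features that could feed such classifiers. Every feature named here (centroid spread, nearest-neighbor statistics, convex-hull area) is an assumption for illustration, not the project's actual feature set.

```python
# Hypothetical feature extraction from one snapshot of robot positions.
# The real 8-D feature set used in the project is not given in the slides;
# these stand-in features are chosen only to illustrate the idea.
import numpy as np
from scipy.spatial import ConvexHull, distance_matrix

def swarm_features(positions: np.ndarray) -> np.ndarray:
    """positions: (N, 2) array of robot x/y coordinates at one timestep."""
    centroid = positions.mean(axis=0)
    radii = np.linalg.norm(positions - centroid, axis=1)

    dists = distance_matrix(positions, positions)
    np.fill_diagonal(dists, np.inf)
    nn = dists.min(axis=1)                      # nearest-neighbor distances

    hull_area = ConvexHull(positions).volume    # 2-D hull "volume" is area

    return np.array([
        radii.mean(), radii.std(),              # spread about the centroid
        nn.mean(), nn.std(),                    # local packing
        hull_area,                              # global footprint
        radii.max(),                            # farthest robot from centroid
        nn.min(), nn.max(),                     # tightest / loosest pair
    ])                                          # 8-D feature vector
```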

Approach
Collect raw behavior data sets; determine features (8-D); apply pattern recognition algorithms (KNN, neural nets, Bayes nets); analyze the results of each algorithm (a sketch of this comparison follows).
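As a concrete illustration of the pipeline, the following is a minimal sketch comparing the three classifier families on the same feature matrix. It assumes a scikit-learn workflow with hypothetical arrays X (n_samples x 8 features) and y (behavior labels); the original experiments were run in MATLAB.

```python
# Sketch of the "compare three classifiers" step, assuming features X and
# behavior labels y are already extracted; scikit-learn stands in for the
# original MATLAB code.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB

def compare_classifiers(X: np.ndarray, y: np.ndarray) -> dict:
    models = {
        "knn": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
        "neural_net": make_pipeline(StandardScaler(),
                                    MLPClassifier(hidden_layer_sizes=(60,),
                                                  max_iter=2000, random_state=0)),
        "bayes": make_pipeline(StandardScaler(), GaussianNB()),
    }
    # Mean cross-validated accuracy per model family.
    return {name: cross_val_score(model, X, y, cv=5).mean()
            for name, model in models.items()}
```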

KNN
Tested a range of values for k (1 through 11 nearest neighbors) with a random tie break.
Overall correct classification across the tested k values ranged from roughly 52.5% to 65%.
Average per-class classification: Cluster = 100%, Disperse = 12.5%, Clump = 50%, Orbit = 44%, Bubble Sort = 82%.
[Raw MATLAB output of per-k correct counts and per-class rates omitted.]
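A minimal sketch of the k-NN scan with random tie-breaking, assuming generic train/test arrays (this is not the original MATLAB code):

```python
# Scan k = 1..11 and break ties between equally common neighbor labels at random.
import numpy as np

def knn_predict(X_train, y_train, x, k, rng):
    idx = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
    labels, counts = np.unique(y_train[idx], return_counts=True)
    tied = labels[counts == counts.max()]
    return rng.choice(tied)            # random tie break

def scan_k(X_train, y_train, X_test, y_test, ks=range(1, 12), seed=0):
    rng = np.random.default_rng(seed)
    accuracy = {}
    for k in ks:
        preds = np.array([knn_predict(X_train, y_train, x, k, rng) for x in X_test])
        accuracy[k] = (preds == y_test).mean()
    return accuracy                    # overall correct classification per k
```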

Neural Nets
Single hidden layer: 50 to 70 nodes, logsig activation; maximum correct classification = 65%.
Two hidden layers: layer 1 with 50 to 70 nodes, layer 2 with 25 nodes; maximum correct classification = 65%.
Two-layer runs ranged from 25% to 65% correct over the 21 configurations tried.
[Raw MATLAB percentCorrect output omitted.]
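A sketch of the hidden-layer sweep, using scikit-learn's MLPClassifier with a logistic ("logsig") activation as a stand-in for the original MATLAB network; the data arrays are assumed to be prepared elsewhere.

```python
# Sweep single- and two-hidden-layer architectures and record test accuracy.
from sklearn.neural_network import MLPClassifier

def sweep_hidden_layers(X_train, y_train, X_test, y_test):
    results = {}
    # Single hidden layer: 50-70 nodes.
    for n1 in range(50, 71, 5):
        clf = MLPClassifier(hidden_layer_sizes=(n1,), activation="logistic",
                            max_iter=2000, random_state=0)
        clf.fit(X_train, y_train)
        results[(n1,)] = clf.score(X_test, y_test)
    # Two hidden layers: 50-70 nodes, then 25 nodes.
    for n1 in range(50, 71, 5):
        clf = MLPClassifier(hidden_layer_sizes=(n1, 25), activation="logistic",
                            max_iter=2000, random_state=0)
        clf.fit(X_train, y_train)
        results[(n1, 25)] = clf.score(X_test, y_test)
    return results   # percent correct per architecture
```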

Bayes Nets
Features are mapped to a discrete domain by applying k-means clustering to each feature.
Preliminary results: reliable classification of the Cluster behavior only; there is a possible bug in the code, and the discrete mapping should be modified.
Classes: Cluster, Disperse, Clump, Orbit, Bubble Sort.
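A minimal sketch of the per-feature k-means discretization step; a categorical naive Bayes model stands in here for the Bayes-net classifier actually used, and the arrays and bin count are assumptions.

```python
# Discretize each feature column with k-means, then train a simple
# categorical naive Bayes model on the discretized features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.naive_bayes import CategoricalNB

def discretize_per_feature(X_train, X_test, n_bins=5, seed=0):
    train_d = np.empty_like(X_train, dtype=int)
    test_d = np.empty_like(X_test, dtype=int)
    for j in range(X_train.shape[1]):
        km = KMeans(n_clusters=n_bins, n_init=10, random_state=seed)
        train_d[:, j] = km.fit_predict(X_train[:, [j]])   # cluster index per value
        test_d[:, j] = km.predict(X_test[:, [j]])
    return train_d, test_d

def bayes_classify(X_train, y_train, X_test, y_test, n_bins=5):
    train_d, test_d = discretize_per_feature(X_train, X_test, n_bins)
    clf = CategoricalNB(min_categories=n_bins).fit(train_d, y_train)
    return clf.score(test_d, y_test)
```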

Discussion
KNN and the neural net performed well. How the real-valued features are mapped to a discrete domain may be limiting the Bayes net classifier. Classification of the clustering behavior was high overall, likely because the features are tuned to that behavior and because there was not enough variety in the samples. More samples of varying behavior are needed.

Next Steps
Feature selection: determine which group of features works best for each classifier. Run additional experiments to determine why certain behaviors are classified much better than others. Future work: use temporal information to learn hidden emergent sub-behaviors (feature vectors that change in similar ways over time).

Thank You