Introductory Seminar on Research: Fall 2017


Introductory Seminar on Research: Fall 2017 Shayok Chakraborty Assistant Professor Department of Computer Science, FSU

What is Machine Learning?
Sensors and actuators in today's world generate humongous amounts of digital data:
- About 25-30 frames produced per second
- About 100 petabytes of data processed per day
- About 300 hours of video uploaded every minute

What is Machine Learning?
Machine Learning is used to develop predictive models, in order to:
- Organize, categorize, and make sense of large volumes of data
- Develop a model that is a good and useful approximation to the data
- Predict the future behavior of a system by making use of past data (e.g., weather forecasting)

A Typical Machine Learning System
TRAINING: collect unlabeled data, label the data (hand-labeled by human experts), and train the ML model.
TESTING: apply the trained model on unseen data to get predictions.

Learning with Weak Supervision: Face Recognition

Learning with Weak Supervision: Medical Image Classification
Data is hand-labeled by human experts: is this a cancer or non-cancer image? Experienced doctors are rare and busy.

Learning with Weak Supervision: Galaxy Classification
Data is hand-labeled by human experts.

Learning with Weak Supervision: Object Counting
Counting pedestrians, cars, and microscopic cells; data is hand-labeled by human experts.

ACTIVE LEARNING

Active vs. Passive Learning
- Passive learner: takes input from the world and outputs a model/classifier
- Active learner: issues queries to the world, receives responses, and outputs a model/classifier
(Tong and Koller, JMLR 2001)

Active Learning - Applications
- Text categorization
- Image retrieval
- Spam email filter design
- Face recognition
- Optical character recognition (OCR)
- Medical image classification
- Natural language parsing

Batch Mode Active Learning - Illustration
A classifier update module selects points from the unlabeled pool, a human annotator labels them, and the labeled points are added to the training set.

Active Learning
- Random sampling: select k samples at random from the unlabeled set.
- Clustering-based sampling: cluster the unlabeled data using k-means clustering and select the cluster centers for annotation.
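
The two baseline strategies above could be sketched with scikit-learn roughly as follows; the data and names (`X_pool`, `k`) are illustrative, not from the slides:

```python
# Sketch of two baseline query strategies, assuming an unlabeled pool
# X_pool (n_samples x n_features) and a query budget of k.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X_pool = rng.normal(size=(200, 2))   # synthetic unlabeled pool
k = 5

# Random sampling: pick k indices uniformly at random, without replacement.
random_idx = rng.choice(len(X_pool), size=k, replace=False)

# Clustering-based sampling: cluster with k-means and query the pool
# point nearest to each cluster center.
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X_pool)
cluster_idx = np.array([
    np.argmin(np.linalg.norm(X_pool - c, axis=1))
    for c in km.cluster_centers_
])
```

Both strategies ignore the current classifier; they are often used as baselines against the model-aware strategies on the following slides.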

Active Learning
- Margin-based sampling: train an SVM on the labeled data and query the unlabeled samples that are close to the decision boundary.
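
A minimal sketch of margin-based sampling, assuming synthetic labeled data (`X_lab`, `y_lab`) and an unlabeled pool `X_pool`:

```python
# Margin-based sampling sketch: train an SVM on the labeled data and
# query the unlabeled points closest to the decision boundary.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X_lab = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y_lab = np.array([0] * 20 + [1] * 20)
X_pool = rng.normal(0, 2, (100, 2))   # synthetic unlabeled pool

clf = SVC(kernel="linear").fit(X_lab, y_lab)

# |decision_function| is proportional to the distance from the hyperplane,
# so small values mean high uncertainty.
margin = np.abs(clf.decision_function(X_pool))
query_idx = np.argsort(margin)[:5]   # the 5 most uncertain points
```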

Active Learning
- Committee-based sampling: train a committee of classifiers (e.g., SVM, k-NN, neural network) on the labeled data, use them to predict the unlabeled samples, and query the samples where the committee members disagree on the class labels.
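
The committee idea could be sketched as follows, using the three classifier types named on the slide on synthetic data:

```python
# Committee-based sampling sketch: three different classifiers vote on
# each unlabeled point; points where the votes disagree are queried.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)
X_lab = np.vstack([rng.normal(-2, 1, (30, 2)), rng.normal(2, 1, (30, 2))])
y_lab = np.array([0] * 30 + [1] * 30)
X_pool = rng.normal(0, 2, (100, 2))   # synthetic unlabeled pool

committee = [
    SVC(),
    KNeighborsClassifier(n_neighbors=3),
    MLPClassifier(max_iter=500, random_state=0),
]
# One row of predicted labels per committee member.
votes = np.array([m.fit(X_lab, y_lab).predict(X_pool) for m in committee])

# A point is queried when not all members predict the same label.
disagree_idx = np.where(votes.min(axis=0) != votes.max(axis=0))[0]
```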

Active Learning
- Entropy-based sampling: suppose we are working on a binary classification problem. Train a classifier on the labeled data and apply the trained model to the unlabeled set to derive probabilities with respect to the two classes. Say Sample 1 has p1 = 0.1, p2 = 0.9 and Sample 2 has p1 = 0.5, p2 = 0.5. Which one do we query? The sample with high entropy, i.e., Sample 2.
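
The entropy computation for the two samples on the slide can be worked out directly:

```python
# Entropy-based sampling sketch for a binary problem: compute the entropy
# of each predicted class distribution and query the most uncertain sample.
import numpy as np

# Predicted class probabilities for the samples from the slide
# (plus one extra intermediate case).
probs = np.array([[0.1, 0.9],   # Sample 1: confident -> low entropy
                  [0.5, 0.5],   # Sample 2: maximally uncertain -> high entropy
                  [0.3, 0.7]])

entropy = -np.sum(probs * np.log2(probs), axis=1)
query_idx = int(np.argmax(entropy))   # selects Sample 2 (index 1)
```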

Active Learning: Performance on Face Recognition Datasets

TRANSFER LEARNING / DOMAIN ADAPTATION

Domain Adaptation Examples Source Domain Target Domain

Domain Adaptation Examples Training Set (Source) Testing Set (Target)

Transfer Learning / Domain Adaptation
A major assumption in traditional machine learning is that the training and test data come from the same domain and follow the same distribution. In a real-world setting, training and test data can come from different probability distributions.

Transfer Learning / Domain Adaptation
- A psychological point of view: the study of the dependency of human conduct, learning, or performance on prior experience (e.g., C++ -> Java, Maths/Physics -> Computer Science, driving a car -> driving a truck).
- A data mining point of view: the ability of a system to recognize and apply knowledge and skills learned in previous domains/tasks to novel tasks/domains that share some commonality.
- Given a target domain/task, how do we identify the commonality between it and previous domains/tasks, and transfer knowledge from the previous domains/tasks to the target one?

Transfer Learning / Domain Adaptation
Subspace Alignment (ICCV 2013): transform the source data (XS) so that it aligns well with the target data (XT). The error matrices before and after domain adaptation illustrate the improved alignment.
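
A minimal sketch in the spirit of subspace alignment (Fernando et al., ICCV 2013): learn d-dimensional PCA subspaces for source and target, then align the source basis to the target basis via the matrix M = Bs^T Bt. The data here is synthetic and the variable names are illustrative:

```python
# Subspace alignment sketch: project source data through an aligned
# basis so it lives in coordinates comparable to the target subspace.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
Xs = rng.normal(0, 1, (100, 5))       # synthetic source data
Xt = Xs @ rng.normal(0, 1, (5, 5))    # synthetic, shifted target data
d = 2                                  # subspace dimensionality

Bs = PCA(n_components=d).fit(Xs).components_.T   # (5, d) source basis
Bt = PCA(n_components=d).fit(Xt).components_.T   # (5, d) target basis
M = Bs.T @ Bt                                    # (d, d) alignment matrix

Xs_aligned = Xs @ Bs @ M   # source in target-aligned coordinates
Xt_proj = Xt @ Bt          # target in its own subspace
```

A classifier trained on `Xs_aligned` can then be applied to `Xt_proj`, since both now live in comparable coordinates.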

Transfer Learning / Domain Adaptation
Instance Re-weighting: assign weights to the source data (XS) so that the weighted source data has a distribution similar to the target data (XT). The Maximum Mean Discrepancy (MMD) is used to measure the difference between the source and target distributions.
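
As a simple illustration of the discrepancy measure, the MMD with a linear kernel reduces to the distance between the sample means of the two domains (a sketch, not the weighting scheme itself):

```python
# Linear-kernel Maximum Mean Discrepancy sketch: the squared MMD is the
# squared distance between the mean embeddings of the two samples.
import numpy as np

def mmd_linear(S, T):
    """Squared MMD with a linear kernel: ||mean(S) - mean(T)||^2."""
    diff = S.mean(axis=0) - T.mean(axis=0)
    return float(diff @ diff)

rng = np.random.default_rng(4)
S = rng.normal(0.0, 1.0, (500, 3))   # synthetic source samples
T = rng.normal(1.0, 1.0, (500, 3))   # synthetic target samples (shifted mean)

mmd = mmd_linear(S, T)   # close to ||(1,1,1)||^2 = 3 for large samples
```

Instance re-weighting methods choose source weights that minimize such a discrepancy between the weighted source and the target; richer kernels (e.g., RBF) capture differences beyond the means.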

Transfer Learning / Domain Adaptation
A Survey on Transfer Learning, Pan and Yang, IEEE TKDE 2010

DEEP LEARNING

Deep Learning
A feature extractor maps the data to a vector representation; samples are represented as points in a high-dimensional space. (Slide based on Andrea Vedaldi, 2014)

Deep Learning Histogram of Oriented Gradients (HoG) – Dalal and Triggs, 2005 Slide based on Andrea Vedaldi, 2014

Deep Learning http://yann.lecun.com/exdb/lenet/

Deep Learning: The Visual Chatbot Demo

Label Spaces: Multi-Class, Multi-Label, Hierarchical Label, Fuzzy Label

Thank You Shayok Chakraborty Assistant Professor Department of Computer Science Florida State University Lov 0264 shayok@cs.fsu.edu