CSCI 5822 Probabilistic Models of Human and Machine Learning


CSCI 5822 Probabilistic Models of Human and Machine Learning Mike Mozer Department of Computer Science and Institute of Cognitive Science University of Colorado at Boulder

Variational Inference
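The topic of this lecture, variational inference, can be illustrated with a minimal sketch: mean-field VI for a bivariate Gaussian target, where a factorized approximation q(x1)q(x2) is fit by coordinate updates. The target mean and precision matrix below are arbitrary illustrative values, not from the slides.

```python
import numpy as np

# Target distribution: bivariate Gaussian with mean mu and precision matrix Lam.
mu = np.array([0.0, 0.0])
Lam = np.array([[2.0, 1.2],
                [1.2, 2.0]])

# Mean-field VI: approximate the target by q(x1) q(x2), each Gaussian.
# The optimal coordinate updates set each variational mean conditioned on
# the other factor's current mean; variances are fixed at 1 / Lam[i, i].
m = np.array([1.0, -1.0])  # arbitrary initial variational means
for _ in range(50):
    m[0] = mu[0] - Lam[0, 1] / Lam[0, 0] * (m[1] - mu[1])
    m[1] = mu[1] - Lam[1, 0] / Lam[1, 1] * (m[0] - mu[0])

# The variational means converge to the target mean, while the factorized
# variances (1 / Lam[i, i]) underestimate the true marginal variances --
# the classic mean-field behavior.
```

Each update is a contraction (factor |Lam[0,1]| / Lam[0,0] = 0.6 per coordinate here), so the means converge geometrically to the target mean.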

Talk Today
WHO: Ali Mousavi, PhD Candidate, Rice University
WHERE: DLC 170
WHEN: Tuesday, March 20, 3:30-5:00 PM

ABSTRACT: Great progress has been made on sensing, perception, and signal processing over recent decades through the design of algorithms matched to the underlying physics and statistics of the task at hand. However, a host of difficult problems remain where the physics-based approach comes up short; for example, unrealistic image models stunt the performance of MRI and other computational imaging systems. Fortunately, the big-data age has enabled the development of new kinds of machine learning algorithms that augment our understanding of the physics with models learned from large amounts of training data. In this talk, I will overview three increasingly integrated physics+data algorithms for solving the kinds of inverse problems encountered in computational sensing. At the lowest level, data can be used to automatically tune the parameters of an optimization algorithm (e.g., the regularization parameter in the Lasso, or the tuning parameters in AMP), improving its inferential and computational performance. At the next level, data can be used to learn a more realistic signal model that boosts the performance of an iterative recovery algorithm (e.g., convergence time, error). At the highest level, data can be used to train a deep network that encapsulates the complete underlying physics of the sensing problem (i.e., not just the signal model but also the forward model that maps signals into measurements). As we will see, moving up the physics+data hierarchy increasingly exploits training data and boosts performance accordingly. I will illustrate with computational sensing applications in compressive image recovery.
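The lowest level of the physics+data hierarchy described in the abstract (using data to tune an optimization algorithm's parameters) can be sketched in a few lines: solve the Lasso with iterative soft-thresholding (ISTA) and pick the regularization parameter by held-out validation error. This is a generic illustration of the idea, not the speaker's actual method; the problem sizes and candidate parameters are arbitrary.

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding operator."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam, n_iters=300):
    """ISTA for the Lasso: minimize 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x = soft_threshold(x - A.T @ (A @ x - y) / L, lam / L)
    return x

# Synthetic sparse-recovery problem (illustrative sizes).
rng = np.random.default_rng(0)
n, d, k = 80, 40, 5
x_true = np.zeros(d)
x_true[:k] = rng.normal(size=k)
A_tr, A_val = rng.normal(size=(n, d)), rng.normal(size=(n, d))
y_tr = A_tr @ x_true + 0.05 * rng.normal(size=n)
y_val = A_val @ x_true + 0.05 * rng.normal(size=n)

# "Level 1" of the hierarchy: let held-out data choose the regularization
# parameter instead of hand-tuning it.
lams = [0.01, 0.1, 1.0, 10.0]
best_lam = min(
    lams,
    key=lambda lam: np.sum((A_val @ ista(A_tr, y_tr, lam) - y_val) ** 2),
)
x_hat = ista(A_tr, y_tr, best_lam)
```

The same pattern extends to the higher levels of the hierarchy: instead of tuning a scalar, learn the signal prior, or unroll the iterative solver into a trainable deep network.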