Bio-inspired Learning for a Colony of Microsystems Rama Chellappa UMD.

Who am I?
A computer vision researcher interested in representation and recognition.
Over the past 10 years, heavy involvement in understanding video sequences.
A participant in the ARO MURI on micro air vehicles.
Collaborations with Prof. M. Srinivasan.

Pattern Recognition and Bees
Behavior analysis of insects has led to advances in navigation, control systems, etc.
Goal: to automate the tracking and labeling of insect motion, i.e., track both the position and the behavior of insects.
Joint work with Prof. M. Srinivasan of ANU.
To appear in IEEE Trans. PAMI.

Anatomical Modeling
All insects have similar anatomy: a hard exoskeleton and a soft interior.
Three major body parts: head, thorax, and abdomen.
Each body part is modeled as an ellipse.
Anatomical modeling ensures that:
–Physical limits of body parts are consistent.
–Structural limitations are accounted for.
–Correlation among the orientations of body parts is accounted for.
–Insects move in the direction of their head.
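The three-ellipse body model can be sketched as below. The class names, fields, and the maximum bend angle are illustrative assumptions, not values from the slides; the idea is only that adjacent body parts must stay within a physically plausible relative orientation.

```python
from dataclasses import dataclass
import math

@dataclass
class Ellipse:
    cx: float          # center x
    cy: float          # center y
    major: float       # semi-major axis
    minor: float       # semi-minor axis
    theta: float       # orientation (radians)

@dataclass
class BeeModel:
    head: Ellipse
    thorax: Ellipse
    abdomen: Ellipse

    def joint_angles_valid(self, max_bend=math.pi / 3):
        """Check that adjacent body parts respect a (hypothetical)
        structural limit on their relative orientation."""
        def rel(a, b):
            # smallest absolute angular difference, wrapped to [0, pi]
            return abs((a - b + math.pi) % (2 * math.pi) - math.pi)
        return (rel(self.head.theta, self.thorax.theta) <= max_bend and
                rel(self.thorax.theta, self.abdomen.theta) <= max_bend)
```

A tracker can reject any pose hypothesis for which `joint_angles_valid` is false, which is how anatomical constraints prune the search space.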

Waggle Dance – 1
Orientation of waggle axis → direction of food source (with respect to the sun).
Intensity of waggle dance → sweetness of food source.
Frequency of waggle → distance of food source.
Parameters of interest in the waggle dance:
–Waggle axis: average orientation of the thorax during the waggle.
–Duration of waggle: number of frames of waggle in each segment of the dance.
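The direction decoding can be sketched as follows. The function name and the degree convention are assumptions for illustration: on a vertical comb, the waggle axis angle measured from "up" is taken to equal the food direction measured from the sun's azimuth.

```python
def food_bearing(waggle_axis_deg, sun_azimuth_deg):
    """Decode the dance direction (simplified convention): the food
    bearing is the sun's azimuth offset by the waggle-axis angle,
    both in degrees, wrapped to [0, 360)."""
    return (sun_azimuth_deg + waggle_axis_deg) % 360.0
```

For example, a waggle axis pointing straight up (0°) indicates food directly toward the sun, while a 30° axis indicates food 30° clockwise of the sun.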

Mixture Modeling for Behaviors
Low-level motion states: straight, motionless, waggle, turn.
Each behavior is a Markov model over these motion states.
Switching between behaviors is modeled as another Markov process.
Detect frames of the waggle dance by looking at:
–Rate of change of abdomen orientation.
–Average absolute motion of the center of the abdomen perpendicular to the axis of the bee.
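The two-level structure — a Markov chain over motion states inside each behavior, plus an outer Markov process that switches behaviors — can be sketched as a generative simulation. All transition probabilities and behavior names below are made-up placeholders, not the fitted models from the paper.

```python
import random

MOTION = ["straight", "motionless", "waggle", "turn"]

def row(**p):
    # fill unspecified transitions with probability 0
    return {m: p.get(m, 0.0) for m in MOTION}

# Illustrative per-behavior transition tables over low-level motion states.
MOTION_TRANS = {
    "waggle_dance": {
        "straight":   row(waggle=0.7, turn=0.3),
        "motionless": row(waggle=0.8, straight=0.2),
        "waggle":     row(waggle=0.8, turn=0.2),
        "turn":       row(waggle=0.6, straight=0.4),
    },
    "foraging": {
        "straight":   row(straight=0.7, turn=0.2, motionless=0.1),
        "motionless": row(motionless=0.5, straight=0.5),
        "waggle":     row(straight=1.0),
        "turn":       row(straight=0.6, turn=0.4),
    },
}

# Switching between behaviors is itself a Markov process.
BEHAVIOR_TRANS = {
    "waggle_dance": {"waggle_dance": 0.95, "foraging": 0.05},
    "foraging":     {"foraging": 0.95, "waggle_dance": 0.05},
}

def simulate(steps, seed=0):
    """Generate (behavior, motion state) pairs from the two-level model."""
    rng = random.Random(seed)
    behavior, state = "foraging", "straight"
    trace = []
    for _ in range(steps):
        b_opts = list(BEHAVIOR_TRANS[behavior])
        behavior = rng.choices(
            b_opts, weights=[BEHAVIOR_TRANS[behavior][b] for b in b_opts])[0]
        probs = MOTION_TRANS[behavior][state]
        state = rng.choices(MOTION, weights=[probs[m] for m in MOTION])[0]
        trace.append((behavior, state))
    return trace
```

Inference runs this model in reverse: given an observed motion-state sequence, the most likely behavior sequence can be decoded with the standard Viterbi recursion.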

Shape, Motion, and Behavior Encoded Particle Filter
Tracking using a particle filter.
A behavioral model in addition to the motion model of the standard particle filter framework.
Track both the position and orientation of the various body parts and the behavior exhibited by the bee.
Observation model:
–Mixture of Gaussians.
–5 exemplars for the appearance of the bee.
Maximum-likelihood estimate for both position and behavior.
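A minimal sketch of the idea: each particle carries a pose plus a discrete behavior label, the dynamics noise depends on the behavior, and the output is a weighted-mean pose together with a maximum-likelihood behavior. The dynamics, switching rate, and Gaussian likelihood below are stand-ins, not the paper's models.

```python
import math
import random

def behavior_particle_filter(observations, n=300, seed=1):
    """Track (x, y) plus a behavior label from noisy position
    observations. Hypothetical two-behavior model for illustration."""
    rng = random.Random(seed)
    behaviors = ("straight", "waggle")
    particles = [(0.0, 0.0, rng.choice(behaviors)) for _ in range(n)]
    estimates = []
    for ox, oy in observations:
        moved, weights = [], []
        for x, y, b in particles:
            if rng.random() < 0.05:                 # occasional behavior switch
                b = rng.choice(behaviors)
            sigma = 0.5 if b == "straight" else 1.5  # waggle is jitterier
            x += rng.gauss(0.0, sigma)
            y += rng.gauss(0.0, sigma)
            moved.append((x, y, b))
            # stand-in Gaussian observation likelihood
            weights.append(math.exp(-((x - ox) ** 2 + (y - oy) ** 2) / 2.0))
        total = sum(weights)
        if total == 0.0:                             # degenerate weights
            weights, total, particles = [1.0] * n, float(n), moved
        else:
            particles = rng.choices(moved, weights=weights, k=n)
        mx = sum(w * x for (x, _, _), w in zip(moved, weights)) / total
        my = sum(w * y for (_, y, _), w in zip(moved, weights)) / total
        score = {}
        for (_, _, b), w in zip(moved, weights):
            score[b] = score.get(b, 0.0) + w
        estimates.append((mx, my, max(score, key=score.get)))
    return estimates
```

The real tracker augments this with per-part orientations and the exemplar-based mixture-of-Gaussians appearance model, but the state augmentation and the joint ML estimate follow the same pattern.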

Result

The Grand Challenge – 1
More and more MAVs and microrobots will be employed for a variety of applications.
The microsystems/pupil ratio will increase at an alarming rate.
We need to figure out how a colony of such systems (fewer than 100) can organize itself to carry out a few well-defined tasks:
–Landmark-based navigation
–Terminal guidance
–Perching
–Surveillance
–Looking for anomalies
Much is known about how honeybees carry out tasks related to navigation, perching, hunting for honey, etc. (Prof. Srinivasan and several others).

The Grand Challenge – 2
Examples of problems to be studied:
Sensors for MAVs and microrobots.
How to keep track of other microsystems in the colony:
–Tracking/tagging a large number of movers in a restricted space.
–Bees are always on the move; microsystems need to move only when needed.
–Keeping an account of who went out and who came in.
–How to organize a sub-group of microsystems to carry out a specific task?
–Activity recognition.
–Waggle dancing by MAVs!
How to nourish microsystems?
–Power and communication issues.
–Self-evaluation of their well-being.
–Calls for help.
How will humans interact/interface with the colony?