1
Multi-Person Multi-Camera Tracking for EasyLiving
John Krumm, Steve Harris, Brian Meyers, Barry Brumitt, Michael Hale, Steve Shafer
Vision Technology Research Group, Microsoft Research, Redmond, WA USA
2
What Is EasyLiving? EasyLiving is a prototype architecture and technologies for building intelligent environments that facilitate the unencumbered interaction of people with other people, with computers, and with devices.
3
Example Behaviors
Adjust lights as you move around a space
Route video to the best display
Move your Windows session as you move
Deliver e-messages to where you are
Monitor a young child or old person
4
EasyLiving Demo (7 min.)
5
Self-Aware Space EasyLiving must know about people, computers, software, devices, and geometry to work right.
6
Who’s Where?
7
Person-Tracking System
5 Triclops stereo cameras
5 PCs running "Stereo Module" (and Microsoft Windows 2000)
1 PC running "Person Tracker"
(only U.S. $319) (includes Internet Explorer) (as part of the OS) (for a limited time only?)
8
Triclops Cameras
Now superseded by the "Digiclops" digital IEEE-1394 version
9
Typical Images
(Figures: color image from Triclops; disparity image from Triclops)
10
Requirements
To work in a real-life intelligent environment, our tracking system must ...
1. Maintain location & identity of people
2. Run at reasonable speeds (we get 3.5 Hz)
3. Work with multiple people (we handle up to three)
4. Create and delete people instances
5. Work with multiple cameras (we're up to five)
6. Use cameras in the room
7. Work for extended periods
8. Tolerate partial occlusions and variable postures
11
Other Systems
Non-Vision:
Olivetti Research ('92) & Xerox PARC ('93) – IR badges
AT&T Laboratories (Cambridge) ('97) – ultrasonic badges
PinPoint, Ascension, Polhemus – commercial RF badges
Vision (for multiple people):
Haritaoglu & Davis ('98-'99), Darrell et al. ('98), Orwell et al. ('99), Collins et al. ('99), Rosales & Sclaroff ('99), Kettnaker & Zabih ('99), Intille et al. ('95, '97), Rehg et al. ('97), Boult et al. ('99), Stiefelhagen et al. ('99), MacCormick & Blake ('99), Cai & Aggarwal ('98), Halevi & Weinshall ('97), Gavrila & Davis ('96)
"I see by the current issue of 'Lab News,' Ridgeway, that you've been working for the last twenty years on the same problem I've been working on for the last twenty years."
12
Why Use Vision?
Alternative sensors:
Active badges
Pressure-sensitive floors
Motion sensors
Localized sensors, e.g. on door, chair
But ...
Cameras are getting cheap
Cameras are easy to install
Cameras give location and identity
Cameras can find other objects, e.g. video screens
Cameras can be used to model room geometry
13
Person Detection Steps
1. Background subtraction
2. Blob clustering
3. Histogram identification
(Supported by camera calibration and background modeling)
14
Camera Calibration
All tracking done in the ground plane
Record the path of a single person walking around the room
Compute the ground-plane transform (x, y, θ) that best aligns the paths
Requires robust alignment to deal with outliers
(Figure: paths before calibration vs. paths after calibration)
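The slide doesn't spell out the robust alignment method, so the following is a minimal sketch: it assumes timestamp-matched ground-plane points of the walking person as seen from two cameras, and uses a trimmed least-squares (Procrustes) fit as a stand-in for whatever robust estimator was actually used.

```python
import numpy as np

def rigid_fit(a, b):
    """Least-squares 2D rotation + translation mapping points a -> b (both Nx2)."""
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    H = (a - ca).T @ (b - cb)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

def robust_align(a, b, iters=5, keep=0.8):
    """Trimmed fit: repeatedly refit on the best-matching fraction of points to reject outliers."""
    idx = np.arange(len(a))
    for _ in range(iters):
        R, t = rigid_fit(a[idx], b[idx])
        resid = np.linalg.norm((a @ R.T + t) - b, axis=1)
        idx = np.argsort(resid)[: int(keep * len(a))]
    return R, t

# theta = atan2(R[1, 0], R[0, 0]) gives the rotation angle; t gives the (x, y) offset.
```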
15
Background Modeling
(Figures: view of the space; combined color & disparity background image)
16
Background Subtraction
Foreground if:
  valid depth over invalid depth, OR
  depth difference > T_d, OR
  any (R,G,B) difference > T_c
Color takes over when a person sinks into the couch cushions
Potential problem when a person walks in front of moving video (thus turn on the moving video when acquiring the background)
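A minimal per-pixel version of the rule above, assuming the background model stores one color image and one disparity image per camera; the threshold values T_d and T_c here are placeholders, not the values used in the system.

```python
import numpy as np

T_D = 4.0     # disparity-difference threshold (placeholder value)
T_C = 30.0    # per-channel color-difference threshold (placeholder value)

def foreground_mask(color, disp, bg_color, bg_disp):
    """color: HxWx3, disp: HxW (NaN where stereo failed); bg_*: background model."""
    valid_now = ~np.isnan(disp)
    valid_bg = ~np.isnan(bg_disp)

    # 1. Valid depth where the background had none.
    new_depth = valid_now & ~valid_bg
    # 2. Depth differs from the background by more than T_d.
    depth_diff = valid_now & valid_bg & (np.abs(disp - bg_disp) > T_D)
    # 3. Any color channel differs by more than T_c (lets color take over on the couch).
    color_diff = np.any(np.abs(color.astype(float) - bg_color.astype(float)) > T_C, axis=2)

    return new_depth | depth_diff | color_diff
```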
17
Person Detection
Region-growing on foreground pixels gives fragmented blobs
Group blobs into people-shaped clusters
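The slides don't specify the region-growing implementation; a standard connected-components pass (here via scipy.ndimage.label, used purely as a stand-in) yields the fragmented blobs that the clustering step then merges.

```python
import numpy as np
from scipy import ndimage

def extract_blobs(mask, min_pixels=20):
    """Label 8-connected foreground regions and return the pixel coordinates of each blob."""
    labels, n = ndimage.label(mask, structure=np.ones((3, 3)))   # 8-connectivity
    blobs = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if len(ys) >= min_pixels:            # drop tiny speckle blobs
            blobs.append(np.column_stack([xs, ys]))
    return blobs
```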
18
Blob Clustering
Build a minimum spanning tree over the blobs
Break really long links
Find the five remaining longest links
Break all combinations of these five links (1 = link broken):

Combination   Link 1  Link 2  Link 3  Link 4  Link 5
1             0       0       0       0       0
2             0       0       0       0       1
3             0       0       0       1       0
...
30            1       1       1       0       1
31            1       1       1       1       0
32            1       1       1       1       1

For each combination: compute covariance matrices of the 3D coordinates of the linked blobs, take the eigenvalues of those covariance matrices, and compare the eigenvalues to a person model
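The person model is not spelled out on the slide; the sketch below assumes it is simply a range check on the sorted eigenvalues of the 3D covariance of a candidate cluster (roughly shoulder width vs. head-to-toe extent). The bounds are illustrative assumptions, not the paper's values.

```python
import numpy as np

# Hypothetical person-model bounds on the sorted covariance eigenvalues (meters^2).
EIG_MIN = np.array([0.005, 0.01, 0.05])
EIG_MAX = np.array([0.10, 0.20, 0.60])

def looks_like_person(cluster_points_3d):
    """cluster_points_3d: Nx3 world coordinates of all pixels in the linked blobs."""
    cov = np.cov(cluster_points_3d.T)              # 3x3 covariance of the cluster
    eig = np.sort(np.linalg.eigvalsh(cov))         # ascending eigenvalues
    return bool(np.all(eig >= EIG_MIN) and np.all(eig <= EIG_MAX))
```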
19
Color Histograms
Identify people with 16x16x16 RGB color histograms
Each camera PC maintains its own histograms
Space-variant histograms are built as a person moves around the room (e.g., a bluish tint near the window vs. regular color elsewhere)
Person tracker uses the histograms to resolve ambiguities
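A minimal sketch of building the 16x16x16 RGB histogram and comparing two of them; the slides don't say which similarity measure the tracker uses, so histogram intersection here is an assumption.

```python
import numpy as np

def rgb_histogram(pixels):
    """pixels: Nx3 uint8 RGB values for one person; returns a normalized 16x16x16 histogram."""
    bins = (pixels // 16).astype(int)                       # 256 levels -> 16 bins per channel
    hist = np.zeros((16, 16, 16))
    np.add.at(hist, (bins[:, 0], bins[:, 1], bins[:, 2]), 1)
    return hist / max(len(pixels), 1)

def similarity(h1, h2):
    """Histogram intersection in [0, 1]; higher means more likely the same person."""
    return float(np.minimum(h1, h2).sum())
```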
20
So Far
Camera calibration, background modeling
Background subtraction (color & depth)
Blob clustering
Histogram maintenance
21
Person Tracking
Takes reports from the stereo modules
Transforms them to a common coordinate frame
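Each stereo module reports positions in its own camera's ground-plane frame; the sketch below maps one such report into the shared room frame using that camera's calibrated (x, y, θ). The function and parameter names are illustrative, not from the paper.

```python
import math

def to_room_frame(report_xy, cam_x, cam_y, cam_theta):
    """Map an (x, y) ground-plane report from one camera into the common room frame."""
    rx, ry = report_xy
    cos_t, sin_t = math.cos(cam_theta), math.sin(cam_theta)
    return (cam_x + cos_t * rx - sin_t * ry,
            cam_y + sin_t * rx + cos_t * ry)
```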
22
Person Tracking – Steady State
One "track" for each person
Predicted location
Resolve with color histograms
Feed back results to stereo modules for histogram updating
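A minimal sketch of the steady-state update, assuming constant-velocity prediction, nearest-predicted-position gating, and the histogram score as the tie-breaker; the Track class, the gating radius, and the greedy matching are illustrative assumptions rather than the paper's exact procedure.

```python
import math

MATCH_RADIUS = 0.75   # meters; illustrative gating distance

class Track:
    def __init__(self, x, y, hist):
        self.x, self.y = x, y
        self.vx, self.vy = 0.0, 0.0
        self.hist = hist                     # reference color histogram for this person

    def predict(self, dt):
        return self.x + self.vx * dt, self.y + self.vy * dt

    def update(self, x, y, dt):
        self.vx, self.vy = (x - self.x) / dt, (y - self.y) / dt
        self.x, self.y = x, y

def associate(tracks, reports, dt, hist_similarity):
    """Greedily match reports to tracks by predicted distance; histograms break ties."""
    assignments = {}
    for track in tracks:
        px, py = track.predict(dt)
        candidates = [r for r in reports
                      if math.hypot(r.x - px, r.y - py) < MATCH_RADIUS
                      and id(r) not in assignments.values()]
        if not candidates:
            continue
        best = max(candidates, key=lambda r: hist_similarity(track.hist, r.hist))
        assignments[id(track)] = id(best)
    return assignments
```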
23
Person Tracking – Bad Data
Measurement noise: computed position based on predicted position and many reports
Occlusions: multiple cameras, long timeout on unsupported tracks
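A minimal sketch of the long-timeout rule for occlusions, assuming each track records when it was last supported by any camera report; the timeout value is a placeholder.

```python
import time

TRACK_TIMEOUT_S = 10.0   # placeholder; a "long" timeout so brief occlusions don't kill tracks

def prune_tracks(tracks, last_support_time, now=None):
    """Drop tracks that no camera has supported for longer than the timeout."""
    now = time.time() if now is None else now
    return [t for t in tracks if now - last_support_time[id(t)] <= TRACK_TIMEOUT_S]
```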
24
Person Creation Zone
Tracks begin and end here
Initial tracks are provisional
Makes the remainder of the room more robust
25
Summary
Live demos, 20 minutes long
Person tracker runs at 3.5 Hz
Up to three people in the room
People can: enter, leave, walk around, stop moving, sit, collide
26
Recent Efforts
Stop breaking the vision system!
  Moved chairs & changing lights → bad background model
  Special behavior, e.g. move slowly through the person creation zone
  Lots of people, e.g. around a conference table
Find other objects to enable interesting behaviors, e.g. "Where's that book?"
Easier method to model room geometry
27
Workshop on Multi-Object Tracking