A Distributed Cooperative Framework for Continuous Multi-Projector Pose Estimation. IEEE VR 2009, March 16, 2009. Tyler Johnson, Greg Welch, Henry Fuchs, Eric La Force, and Herman Towles.

Presentation transcript:

Slide 1: A Distributed Cooperative Framework for Continuous Multi-Projector Pose Estimation
IEEE VR 2009, March 16, 2009
Tyler Johnson, Greg Welch, Henry Fuchs, Eric La Force, and Herman Towles
Department of Computer Science, University of North Carolina at Chapel Hill

Slide 2: Funding
- ONR: Behavior Analysis and Synthesis for Intelligent Training (BASE-IT), Dr. Roy Stripling, Program Manager
- ONR: Virtual Technologies and Environments Program (VIRTE), CDR Dylan Schmorrow, Program Manager
- ONR-NAVAIR: Deployable Intelligent Projection Systems for Training, SBIR contract with Renaissance Sciences Corporation
- IARPA: Mockup Future Analyst Workspace (A-Desk), Dr. Jeff Morrison, Program Manager
- NSF: Integrated Projector-Camera Modules for the Capture and Creation of Wide-Area Immersive Experiences, CRI:IAD grant

Slide 3: Adaptive Multi-Projector Displays
An Intelligent Projector Unit (IPU)

Slide 4: Challenges
Geometric
- Compensating for display surface shape
- Co-registration of projection images
Photometric
- Intensity blending in image overlaps (see the blending sketch below)
- Matching colors between projectors
[Comparison images: no compensation vs. compensation.]
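As one way to make the intensity-blending challenge concrete, the sketch below computes feathered blend weights from per-projector coverage masks. This is a generic feathering technique sketched under assumed inputs, not necessarily the method used in this work.

```python
# Hedged sketch of intensity blending ("feathering") in projector overlap
# regions, assuming each projector's footprint is given as a binary mask in a
# common reference image. Generic technique, not the talk's implementation.
import numpy as np
from scipy.ndimage import distance_transform_edt

def blend_weights(masks):
    """masks: list of HxW boolean arrays, one per projector.
    Returns HxW float weights per projector that sum to 1 where covered."""
    # Distance to the edge of each footprint: pixels deep inside a footprint
    # get large values, pixels near its boundary fade smoothly to zero.
    dists = [distance_transform_edt(m.astype(np.uint8)) for m in masks]
    total = np.sum(dists, axis=0)
    total[total == 0] = 1.0          # avoid division by zero outside coverage
    return [d / total for d in dists]
```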

Slide 5: Geometric Calibration
Before display use: project structured light to estimate the display surface and calibrate the projectors.
During display use: render imagery with geometric image correction (a simplified correction sketch follows below).
Continuous calibration: estimate the display surface and calibrate the projectors concurrently with display use.
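To illustrate what geometric image correction involves, the sketch below handles the special case of a planar display surface, where the correction reduces to warping the desired image by the plane-induced homography between an ideal viewpoint and the projector. The camera matrices, plane parameters, and OpenCV usage are illustrative assumptions, not the system described in the talk, which supports general surface shapes.

```python
# Hedged sketch: geometric image correction for a *planar* display surface via
# the plane-induced homography H = K_proj (R - t n^T / d) K_view^{-1}.
# All inputs are illustrative placeholders, not values from the paper.
import numpy as np
import cv2

def plane_homography(K_view, K_proj, R, t, n, d):
    """Map pixels in the desired (viewer) image to projector pixels.
    (R, t): projector pose relative to the viewer.
    n, d: display plane n^T X + d = 0 in the viewer frame."""
    H = K_proj @ (R - np.outer(t, n) / d) @ np.linalg.inv(K_view)
    return H / H[2, 2]

def correct_image(desired, H, proj_size):
    # Warp the desired imagery into the projector's frame so that, once
    # projected onto the plane, it appears undistorted from the ideal viewpoint.
    return cv2.warpPerspective(desired, H, proj_size)
```

For a non-planar surface the same per-pixel mapping would be computed from the full display surface model rather than a single homography, which is why the pipeline above first estimates the surface.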

Slide 6: Continuous Calibration
[Photos: a calibrated two-projector display; the projectors moved or bumped; the recalibrated two-projector display.]

Slide 7: Cooperative Calibration
We propose a distributed, Kalman filter-based approach to continuous calibration in which intelligent projector units interact as peers to cooperatively estimate the poses (orientation and position) of all projectors during actual display use.

Slide 8: Related Work
Continuous calibration
- Active (calibration patterns): imperceptible structured light [Cotting04,05]
- Passive (application imagery): continuous display surface estimation [Yang&Welch01], single-projector pose estimation [Johnson&Fuchs07], multi-projector pose estimation [Zhou08]
- Hybrid: automatic switch from passive to active [Zollmann06]
Distributed
- Upfront calibration [Bhasker06]

Slide 9: Contributions
Our Kalman filter-based distributed cooperative framework provides:
- Continuous pose estimation for multiple projectors
- Compatibility with both active and passive feature collection
- Support for all projectors moving simultaneously
- Temporal filtering
- Fault tolerance and scalability

Slide 10: Cooperative Calibration
Peer-to-peer, single-program-multiple-data (SPMD) design:
- Each IPU considers itself the "local" IPU; other IPUs are "remote" IPUs.
- Each IPU is responsible for calculating its own pose using local and remote correspondences (see the estimation-loop sketch below).
Assumptions:
- The internal calibration of each IPU is fixed and known.
- The geometry of the display surface is static and known.
- Projectors remain mostly stationary, but they may drift over time or occasionally be moved by the user.
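Putting the peer-to-peer design and the assumptions together, the per-IPU control flow might look like the structural sketch below. Every helper name (capture_frames, detect_correspondences, request_frame_and_state, the kalman_* methods) is a hypothetical placeholder used only to show the shape of the loop, not the authors' API.

```python
# Hedged structural sketch of the SPMD loop each IPU might run; all helpers
# are hypothetical placeholders, shown only to convey the control flow.
def ipu_main_loop(local, peers, surface_model):
    while True:
        frames = local.capture_frames()          # primary + secondary camera images
        # Local correspondences: between this IPU's own primary and secondary cameras.
        local_matches = detect_correspondences(frames.primary, frames.secondary)
        remote_matches, peer_states = [], []
        for peer in peers:
            # Remote correspondences: between the local primary camera and a peer's
            # primary camera image captured at (approximately) the same time.
            peer_frame, peer_state = peer.request_frame_and_state()
            remote_matches.append(detect_correspondences(frames.primary, peer_frame))
            peer_states.append(peer_state)       # peer pose, intrinsics, covariances
        # Kalman filter update of the *local* pose only, using both kinds of
        # measurement and the known display surface model.
        local.kalman_predict()
        local.kalman_correct(local_matches, remote_matches, peer_states, surface_model)
        local.render_with_corrected_pose()
```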

Slide 11: Local Correspondences
- Measured for each IPU between its primary and secondary cameras
- Provide an estimate of the IPU's pose

Slide 12: Remote Correspondences
- Measured between the primary camera of the local IPU and a remote IPU
- The remote IPU acts as a reference in estimating the pose of the local IPU (a predicted-correspondence sketch follows below)
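To show concretely how a correspondence constrains pose when the display surface is known, the sketch below back-projects a pixel from a reference camera onto a locally planar surface patch and reprojects the resulting 3D point into the local IPU's camera. The planar-patch assumption and the pinhole parameters are illustrative, not the paper's exact formulation.

```python
# Hedged sketch: predicted image location of a correspondence, assuming the
# display surface is locally planar (n^T X + d = 0 in world coordinates).
# Cameras are simple pinhole models K [R | t]; all values are illustrative.
import numpy as np

def backproject_to_plane(K, R, t, pixel, n, d):
    """Intersect the viewing ray of `pixel` in camera (K, R, t) with the plane."""
    cam_center = -R.T @ t                              # camera center in world coords
    ray_dir = R.T @ np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    s = -(n @ cam_center + d) / (n @ ray_dir)          # ray parameter at intersection
    return cam_center + s * ray_dir

def project(K, R, t, X):
    x = K @ (R @ X + t)
    return x[:2] / x[2]

def predicted_correspondence(ref_cam, local_cam, pixel, n, d):
    """Where a feature seen at `pixel` by the reference camera should appear
    in the local camera, given the current local pose estimate."""
    X = backproject_to_plane(*ref_cam, pixel, n, d)
    return project(*local_cam, X)
```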

Slide 13: Collection of Measurements
[Diagram: IPUs collecting measurements across the display surface.]

Slide 14: Kalman Filter Measurement Function
[Diagram: given the pose of the local IPU, the pose and intrinsics of a remote IPU, and the display surface model, the measurement function predicts the measurements in the cameras; the local pose is estimated using the measured and predicted measurements.]
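In standard EKF notation, the measurement function on this slide can be written roughly as below. The symbols are a reconstruction rather than a copy of the slide's equations (which are images), with the pose parameterization taken from the result plots on slide 19.

```latex
% Assumed notation, reconstructed rather than copied from the slide:
% predicted measurements in the local IPU's cameras, given its pose, the
% remote IPU's calibration, and the display surface model S.
\hat{\mathbf{z}}_t = h\!\left(\mathbf{x}_t;\; \mathbf{x}^{\mathrm{remote}},\,
    K^{\mathrm{local}},\, K^{\mathrm{remote}},\, S\right),
\qquad
\mathbf{x}_t = \begin{bmatrix} x & y & z & \psi & \theta & \phi \end{bmatrix}^{\top}
```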

Slide 15: Kalman Filter
[Diagram: the filter state is the pose of the local IPU, together with its error covariance.]

Slide 16: Filter Update
Time update (predict): IPUs are assumed to remain stationary; additional uncertainty is added via the process noise.
Measurement update (correct): the state is corrected based on the measurement residual, using the measurement Jacobian and the measurement noise at time t. (An EKF sketch follows below.)
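A minimal extended Kalman filter update consistent with this slide is sketched below. The identity motion model (stationary projectors plus process noise Q) follows the slide, while the code itself, including the generic h and H arguments, is an illustrative assumption rather than the paper's implementation.

```python
# Hedged EKF sketch for one IPU's 6-DOF pose (x, y, z, psi, theta, phi).
# The identity motion model mirrors the slide; h is whatever measurement
# function maps pose to predicted correspondences, H its Jacobian at x.
import numpy as np

def time_update(x, P, Q):
    # Predict: pose unchanged (projectors assumed stationary), uncertainty grows.
    return x, P + Q

def measurement_update(x, P, z, h, H, R):
    """z: stacked measured correspondences; h(x): predicted measurements;
    H: measurement Jacobian dh/dx at x; R: measurement noise covariance."""
    residual = z - h(x)
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x + K @ residual
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```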

Slide 17: Distributed Operation
Each IPU has local access to:
- Its own intrinsic calibration and pose
- Its own camera images
- The display surface model
The Kalman filter update requires remote access to:
- The intrinsic calibration and poses of remote IPUs
- The error and process noise covariances of remote IPUs
- Images from the primary cameras of remote IPUs, captured at the current time step

Slide 18: Distributed Operation
A request/response mechanism is used for exchanging camera images, pose information, etc. (a message sketch follows below).
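The talk does not specify a wire format, but the remote state listed on slide 17 suggests what a request/response pair might carry. The field names and types below are assumptions for illustration only.

```python
# Hedged sketch of the peer-to-peer exchange implied by slides 17-18.
# Message fields and transport are assumptions; the paper does not
# prescribe a wire format.
from dataclasses import dataclass
import numpy as np

@dataclass
class StateRequest:
    requester_id: int
    timestamp: float            # time of the frame the requester is processing

@dataclass
class StateResponse:
    ipu_id: int
    intrinsics: np.ndarray      # 3x3 K of the remote primary camera
    pose: np.ndarray            # 6-vector (x, y, z, psi, theta, phi)
    error_cov: np.ndarray       # 6x6 error covariance
    process_noise: np.ndarray   # 6x6 process noise covariance
    primary_image: np.ndarray   # camera image captured at (or near) the timestamp
```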

Slide 19: Results
[Plots of the estimated pose parameters over time: position (x, y, z, in mm) and orientation (ψ, θ, φ, in rad), shown before, during, and after projector movement.]

Slide 20: Video
Distributed cooperative pose estimation in a two-IPU display.

Slide 21: Discussion
Observability and drift
- The surface geometry may not fully constrain the pose
- It is possible to "anchor" the solution in unobservable directions
Cooperative estimation
- Not required for a working system
- Ensures imagery is registered between projectors, especially when the pose may be unobservable

Slide 22: Future Work
- Continuous calibration of the display surface
- Dynamic projector refocusing
- Dynamic photometric blending
- Improved scalability

Slide 23: Future Applications

Slide 24: In Conclusion