Precise Georeferencing of Long Strips of QuickBird Imagery
Dr Mehdi Ravanbakhsh, Dr Clive Fraser
WALIS Forum 2013, Perth, 7-8 November

Outline
- Motivation for long-strip adjustment and previous experience with the ALOS PRISM sensor
- Sensor orientation models: 'physical' and RPCs
- Evaluation of adjustments of long QuickBird strips up to 240 km
- Concluding remarks

Motivation for HRSI Strip Adjustment
- Previous work: Geoscience Australia (GA) generated an orthorectified ALOS PRISM mosaic of Australia
- Reduction of GCPs by >95% without significant loss of accuracy, i.e. 500 GCPs vs 30,000+ GCPs

Motivation for HRSI Strip Adjustment
- Previous solution: processing of individual scenes
- Main bottleneck in the production line: determination of GCPs
- A similar problem exists in mapping from HRSI in remote or hostile regions, where GCPs are not easy to establish
- Hence the need for strip adjustment of HRSI imagery

Rational Functions for Sensor Orientation of HRSI
- The object-to-image RPC transformation maps offset-normalised latitude, longitude and height to offset-normalised line and sample coordinates
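As a concrete sketch of this mapping, the following shows the standard RPC form: normalised ground coordinates mapped to normalised line/sample as ratios of two cubic polynomials, then de-normalised to pixel units. The dictionary layout and term ordering here are illustrative assumptions (loosely following the common RPC00B convention), not any particular vendor's file format.

```python
# Sketch of the object-to-image RPC transformation: normalise, evaluate
# two ratios of 20-term cubic polynomials, then de-normalise.
# Coefficient ordering below is an assumption (RPC00B-like), not vendor code.

def normalise(value, offset, scale):
    """Apply the RPC offset/scale normalisation."""
    return (value - offset) / scale

def cubic_poly(coeffs, P, L, H):
    """Evaluate a 20-term cubic polynomial in normalised (P, L, H)."""
    terms = [1.0, L, P, H, L*P, L*H, P*H, L*L, P*P, H*H,
             P*L*H, L**3, L*P*P, L*H*H, L*L*P, P**3, P*H*H,
             L*L*H, P*P*H, H**3]
    return sum(c * t for c, t in zip(coeffs, terms))

def rpc_ground_to_image(lat, lon, h, rpc):
    """Project ground (lat, lon, h) to image (line, sample) via RPCs.
    rpc: dict with offsets/scales and four 20-term coefficient lists."""
    P = normalise(lat, rpc["lat_off"], rpc["lat_scale"])
    L = normalise(lon, rpc["lon_off"], rpc["lon_scale"])
    H = normalise(h, rpc["h_off"], rpc["h_scale"])
    line_n = cubic_poly(rpc["line_num"], P, L, H) / cubic_poly(rpc["line_den"], P, L, H)
    samp_n = cubic_poly(rpc["samp_num"], P, L, H) / cubic_poly(rpc["samp_den"], P, L, H)
    # De-normalise back to pixel units
    line = line_n * rpc["line_scale"] + rpc["line_off"]
    samp = samp_n * rpc["samp_scale"] + rpc["samp_off"]
    return line, samp
```

With a trivial coefficient set (numerator equal to one normalised coordinate, denominator equal to 1), the projection reduces to a simple scale and offset, which makes the form easy to verify.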

Practical Motivation for a Generic, Rigorous Sensor Model
- RPCs are not always available or economically accessible
- Task: implement sensor models for various pushbroom scanners, e.g. SPOT 5, QuickBird, WorldView and ALOS PRISM
- Vendors use different models for the sensor geometry, even though the geometry is largely the same
- Vendors often provide high-quality metadata describing orbital position and interior orientation parameters
- RPCs are not appropriate for long image strips
- Hence: development of a generic sensor model for pushbroom scanners, incorporating a mapping of vendor-specific definitions to the model
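One way to picture the vendor-mapping idea is an adapter layer over a single rigorous core. The sketch below is purely illustrative, with hypothetical class and field names, not code from the talk.

```python
# Hypothetical adapter sketch: one generic pushbroom core, with a
# vendor-specific reader mapping each metadata format onto it.
from abc import ABC, abstractmethod

class GenericPushbroomModel:
    """Rigorous core: orbit S(t), attitudes R_P(t), camera mounting."""
    def __init__(self, orbit, attitudes, mounting):
        self.orbit = orbit
        self.attitudes = attitudes
        self.mounting = mounting

class VendorAdapter(ABC):
    @abstractmethod
    def to_generic(self, metadata: dict) -> GenericPushbroomModel:
        """Map vendor-specific metadata onto the generic model."""

class QuickBirdAdapter(VendorAdapter):
    def to_generic(self, metadata):
        # QuickBird delivers attitudes as quaternions; a real adapter
        # would convert them to roll/pitch/yaw before use.
        return GenericPushbroomModel(metadata["ephemeris"],
                                     metadata["attitude_quaternions"],
                                     metadata["camera_mounting"])
```

The design point is that only the adapters vary per sensor; the adjustment machinery sees one model.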

Rigorous Sensor Model: Coordinate Systems & Transformations
- Object coordinate system [X_ECS]: Earth-centred, Cartesian system
- Orbital coordinate system [X_O]:
  - Origin is the satellite position S(t)
  - X_O is nearly parallel to the velocity v(t_c)
  - Y_O is parallel to S(t_c) x v(t_c)
  - Z_O is parallel to S(t_c)
- Platform coordinate system [X_P]:
  - Fixed relative to the satellite
  - Time-dependent rotations roll(t), pitch(t), yaw(t)
- Transformations:
  P_ECS = S(t) + R_O(t_c) · P_O
  P_O = R_P(t) · P_P
(Figure: satellite orbit with the ECS, orbital and platform coordinate axes.)
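A minimal numeric sketch of the two-stage transformation above, platform to orbital via R_P(t) and orbital to ECS via R_O(t_c) and S(t), assuming toy rotation matrices and positions rather than real QuickBird geometry:

```python
# Toy composition of the slide's two transforms:
# P_ECS = S(t) + R_O(t_c) · R_P(t) · P_P  (all values illustrative)
import math

def rot_z(a):
    """Rotation matrix about the Z axis by angle a (radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mat_vec(M, v):
    """3x3 matrix times 3-vector."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def platform_to_ecs(p_platform, R_P, R_O, S):
    """Compose: platform -> orbital -> Earth-centred coordinates."""
    p_orbital = mat_vec(R_P, p_platform)
    return [S[i] + x for i, x in enumerate(mat_vec(R_O, p_orbital))]
```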

Rigorous Sensor Model: Coordinate Systems & Transformations
- Camera coordinate system [X_C]:
  - Accounts for the camera mount in the platform system
  - Origin is the projection centre C
  - X_C is parallel to the CCD array, Y_C to the focal plane
- Framelet coordinate system [X_F]:
  - Origin is the "leftmost" pixel of the CCD array
  - Shifted by the inner orientation (X_F0, Y_F0, f)
- Image file coordinate system [X_I]:
  - X_I = X_F; Y_I is the row index; t = t(0) + Δt · Y_I
- Transformations:
  P_P = C_M + R_M · P_C
  P_C = λ · (p_F − c_F + Δx)
  p_I(X_I, Y_I, 0) → p_F(X_F, 0, 0) and time t
(Figure: framelet, camera and platform coordinate axes.)
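The image-file to framelet mapping is simple enough to state directly; this tiny sketch (hypothetical function name) applies X_F = X_I and t = t(0) + Δt · Y_I:

```python
# Image-file pixel -> framelet coordinate and scan time: the column index
# carries over directly, and the row index times the line sampling
# interval Δt gives the acquisition time of that scan line.
def image_to_framelet(x_i, y_i, t0, dt):
    """Return (framelet X coordinate, scan time) for pixel (x_i, y_i)."""
    return x_i, t0 + dt * y_i
```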

Rigorous Sensor Model: Coordinate Systems & Transformations
Final transformation between the framelet system and ECS:
  P_ECS = S(t) + R_O · R_P(t) · [C_M + λ · R_M · (p_F − c_F + Δx)]
Correction of systematic errors in the orbit S(t) and attitudes R_P(t):
- S(t) and the attitudes φ_P(t), with R_P(φ_P(t)), modelled by cubic splines
- Image observations relate to the "real" orbit S(t) and attitudes φ_P(t)
- Observed orbit points are direct observations for S(t) and φ_P(t)
- The unknown correction ΔS is modelled as an offset or a time-dependent term (for both path and attitudes)
- Bundle adjustment using ground control points and vendor-provided observations determines the corrections
(Figure: "real" vs. approximated orbit through orbit points S_i, S_j, S_k.)
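To illustrate the "offset or time-dependent term" for the correction ΔS, here is a hedged one-dimensional sketch that estimates an offset plus linear drift by least squares from control-point discrepancies. The actual adjustment described above uses cubic splines inside a full bundle adjustment; this toy covers one coordinate only.

```python
# Least-squares fit of a discrepancy model d(t) ≈ a + b·t, i.e. an
# offset a plus time-dependent drift b, from (time, discrepancy) pairs.
# This is a simplified stand-in for the spline-based correction ΔS.

def fit_offset_and_drift(times, discrepancies):
    """Return (a, b) minimising sum of (d_i - a - b·t_i)^2."""
    n = len(times)
    sum_t = sum(times)
    sum_d = sum(discrepancies)
    sum_tt = sum(t * t for t in times)
    sum_td = sum(t * d for t, d in zip(times, discrepancies))
    denom = n * sum_tt - sum_t * sum_t
    b = (n * sum_td - sum_t * sum_d) / denom   # drift (per unit time)
    a = (sum_d - b * sum_t) / n                # constant offset
    return a, b
```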

Vendor-Specific Adjustments: QuickBird
Transformation between the framelet system and ECS:
  P_ECS = S(t) + R_Q^T(t) · [C_MQ + λ · R_MQ^T · (p_F + p0_F + Δx)]
- The matrix R_Q^T transforms directly from ECS to the platform system, with the Z-axis already pointing towards the target
- Position p0_F of the framelet coordinate system and corrections dx are given in the camera system
- Substitution yields:
  R_M = R_O · R_Q^T(t_c) · R_MQ^T
  R_P(t) = R_O^T · R_Q^T(t) · R_Q^T(t_c) · R_O
  C_M = R_O^T · R_Q^T(t_c) · C_MQ
- Transforming every discrete quaternion tuple for R_Q^T(t) to R_P(t) delivers roll(t), pitch(t) and yaw(t)
- This now fits the form of the generic model (after Z-rotations are applied to p0_F and dx):
  P_ECS = S(t) + R_O · R_P(t) · [C_M + λ · R_M · (p_F − c_F + Δx)]
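The quaternion-to-angles step can be sketched as follows. Quaternion ordering and rotation-sequence conventions differ between vendors, so this assumes a unit quaternion (w, x, y, z) and the common R = Rz(yaw) · Ry(pitch) · Rx(roll) factorisation, not QuickBird's exact convention.

```python
# Convert one attitude quaternion to a rotation matrix, then extract
# roll/pitch/yaw. Conventions here are assumptions (see lead-in above).
import math

def quat_to_matrix(w, x, y, z):
    """Rotation matrix of the unit quaternion (w, x, y, z)."""
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]

def matrix_to_rpy(R):
    """Extract (roll, pitch, yaw) from R = Rz(yaw)·Ry(pitch)·Rx(roll)."""
    pitch = -math.asin(R[2][0])
    roll = math.atan2(R[2][1], R[2][2])
    yaw = math.atan2(R[1][0], R[0][0])
    return roll, pitch, yaw
```

Applying this to each discrete quaternion sample yields the roll(t), pitch(t), yaw(t) series that the generic model expects.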

HRSI Strip Adjustment
- Adjustment of orbit and attitude data, essentially via 'resection'
- Use of 4-8 GCPs at strip ends only
- No photogrammetric tie points used in non-stereo strips
- Adjustment over the full strip length; bias-corrected RPCs then generated for single scenes
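As a sketch of what "bias-corrected RPCs" means in image space: a constant shift is estimated from the discrepancies between RPC-projected and observed GCP positions, then added to the RPC projection. The talk derives the biases from the strip adjustment itself; the values below are purely illustrative.

```python
# Estimate a constant image-space bias from GCP discrepancies and apply
# it to RPC-projected coordinates (a common bias-compensation scheme).

def estimate_bias(projected, observed):
    """Mean (line, sample) discrepancy between RPC-projected GCPs and
    their observed image positions: the constant bias term."""
    n = len(projected)
    d_line = sum(o[0] - p[0] for p, o in zip(projected, observed)) / n
    d_samp = sum(o[1] - p[1] for p, o in zip(projected, observed)) / n
    return d_line, d_samp

def apply_bias(line, sample, bias):
    """Bias-corrected projection: add the estimated shift."""
    return line + bias[0], sample + bias[1]
```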

Strip Adjustment: Merged Camera and Orbit Data
- Modularised sensor model: each pushbroom scene carries its own orbit path, orbit attitudes, camera mounting and camera components
- Individual scenes can share components:
  - Internal camera parameters
  - Exterior orientation
- Strip adjustment:
  - Merged orbit path and merged orbit attitudes across all scenes
  - One set of EO parameters and one set of bias-correction parameters per strip
  - Bridging of scenes without ground control
(Figure: component blocks of pushbroom scanner scenes 1 and 2 merged into a shared orbit path and attitudes.)
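The parameter-sharing idea can be sketched with hypothetical names: every scene model in a strip references the same merged parameter blocks, so adjusting the strip updates all scenes at once.

```python
# Illustrative sketch of the modular strip model: scenes hold references
# to shared blocks rather than private copies. Names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ParameterBlock:
    name: str
    values: list = field(default_factory=list)

@dataclass
class SceneModel:
    orbit_path: ParameterBlock
    orbit_attitudes: ParameterBlock
    camera: ParameterBlock

def build_strip(n_scenes):
    """All scenes in the strip share the same merged parameter blocks."""
    path = ParameterBlock("merged orbit path")
    att = ParameterBlock("merged orbit attitudes")
    cam = ParameterBlock("camera")
    return [SceneModel(path, att, cam) for _ in range(n_scenes)]
```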

Long-Strip Adjustment of QuickBird Imagery
- Strip 1: 16 scenes, 240 km, north-south, constant sensor azimuth; 143 GPS ground survey points
- Strip 2: 13 scenes, 188 km, NE-SW, varying sensor azimuth, scanned from NE to SW; 87 GPS ground survey points
(Figure: map of Strip 1 and Strip 2 near Melbourne.)

QuickBird: Strip 1 Results (16 scenes)
Geopositioning accuracy to 1.2 pixels cross-track and 2.3 pixels along-track with only 4 endpoint GCPs.
(Table: number of checkpoints, RMS(E), RMS(N), RMS of residuals, Max(E) and Max(N), in metres, for nine GCP configurations: Sets 1-2 with 4 GCPs; Sets 3-5 with 6 GCPs; Sets 6-7 with 6 GCPs placed 2 x top, middle, bottom; Sets 8-9 with 20 GCPs along the strip.)

QuickBird: Orbital Translation (Bias)
Estimated shifts for the different GCP configurations, plus a per-scene bias computation.
(Tables: Shift-X, Shift-Y and Shift-Z in metres for GCP Sets 1-9, and per-scene Shift-X, Shift-Y and number of GCPs for each scene of the strip.)

QuickBird: Strip 2 Results (13 scenes)
Maximum ΔE and ΔN residuals of 13 pixels for the 4- and 6-GCP sets and 4 pixels for the 20-GCP sets.
(Table: number of checkpoints, RMS(E) and RMS(N), in metres, for eleven GCP configurations: Sets 1-2 with 4 GCPs; Sets 3-7 with 6 GCPs, Set 7 with RPCs; Sets 8-9 with 6 GCPs placed 2 x top, middle, bottom; Sets 10-11 with 20 GCPs along the strip.)


QuickBird: Strip 2 Results
Estimated shifts for the different GCP configurations, plus a per-scene bias computation.
(Tables: Shift-X, Shift-Y and Shift-Z in metres for GCP Sets 1-11, and per-scene Shift-X, Shift-Y and number of GCPs for each of the 13 scenes.)

Concluding Remarks on Long Strip Adjustment
- Strip adjustment can achieve a reduction of GCPs by >95% without significant loss of accuracy
- Concept proven at Geoscience Australia and led to the automated production of AGRI
- Results thus far are better for non-agile satellites than for steerable HRSI systems

Thank you