Satellite Photogrammetry
By Vishal Mishra, Geomatics Engineering, Department of Civil Engineering, IIT Roorkee
WHAT?
Introduction
Photogrammetry is classified on the basis of sensor platform as:
Terrestrial or close range
Aerial
Satellite or space
If the sensing system is spaceborne, the discipline is called space photogrammetry, satellite photogrammetry, or extraterrestrial photogrammetry.
Introduction
Satellite photogrammetry differs slightly from photogrammetric applications based on aerial frame cameras. The images are taken with high-resolution CCD cameras coupled with large lenses, photographing the ground directly below the satellite as it passes over. The amount of data involved is very large, since the satellites image very large scenes.
Introduction
These satellites are capable of obtaining and relaying very large volumes of imagery data. The satellite data used for photogrammetric purposes generally comes from sun-synchronous satellites.
How is it different from Remote Sensing?
Twin branches
Space photogrammetry is metric in nature, whereas remote sensing is thematic in nature.
Interpretative photogrammetry forms the basis of remote sensing.
Examples: Extraterrestrial image; IKONOS image
WHY?
Advantages: Satellite Images
• The imaging of the land surface is continuous, with a repeat period of 4 days (for QuickBird), so the most appropriate image can be chosen.
• The formalities of arranging aerial photography and flights are avoided.
• Satellite images are considerably less expensive than aerial photographs.
Advantages: Satellite Platform
High altitude with attendant wide coverage.
Freedom from the aerodynamic motions that attend heavier-than-air aircraft.
Weightlessness, which permits large, rigid orbiting cameras to be constructed with less mass than would be required in a conventional aircraft.
Advantages: Satellite Platform
The resulting opportunity to use cameras which can be unfolded or extended to large sizes with long focal lengths.
The opportunity to photograph areas of the Earth that are reached only with difficulty by conventional aircraft.
Disadvantages : Satellite Platform
The necessity of operating the camera in the space environment (e.g. vacuum, temperature, radiation, micrometeorite hazards).
The necessity to telemeter the photographic information to the ground.
Problems of image motion compensation because of the high speed of the satellite.
Inertial disturbances of the orientation and stability of the camera platform, caused by uncompensated motions of mechanical parts in the camera-satellite system.
HOW ?
General Workflow
DATA ACQUISITION
Sensor Types
Sensor Types
A push broom scanner (along-track scanner) is a technology for obtaining images with spectroscopic sensors. It is regularly used for passive remote sensing from space and in spectral analysis on production lines, for example with near-infrared spectroscopy.
A whisk broom or spotlight sensor (across-track scanner) is a technology for obtaining satellite images with optical cameras. In a whisk broom sensor, a mirror scans across the satellite's path (ground track), reflecting light into a single detector which collects data one pixel at a time.
Sensor Types
The advantage of along-track stereo images, compared with images taken from adjacent orbits (across-track), is that they are acquired under almost the same ground and atmospheric conditions.
Satellites
The SPOT satellite carries two high resolution visible (HRV) sensors, each of which is a pushbroom scanner. The focal length of the camera optic is mm; the length of the camera is 78 mm. The Instantaneous Field of View (IFOV) is 4.1 degrees. The satellite orbit is circular, north-south and south-north, about 830 km above the Earth, and sun-synchronous. A sun-synchronous orbit is one whose plane precesses so that the satellite passes over each point on the Earth at the same local solar time. The resolution of the images is 10 m panchromatic and 20 m multispectral.
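As a quick plausibility check on these numbers, a flat-Earth nadir approximation relates swath width to altitude and IFOV; the roughly 60 km SPOT swath follows from the 830 km altitude and 4.1° IFOV. The function below is an illustrative sketch, not part of any SPOT software.

```python
import math

def swath_width_km(altitude_km, ifov_deg):
    """Approximate ground swath of a nadir-pointing scanner,
    assuming a flat Earth (adequate for a sanity check)."""
    return 2.0 * altitude_km * math.tan(math.radians(ifov_deg / 2.0))

# SPOT HRV: 830 km orbit, 4.1 degree IFOV -> about 59 km
print(round(swath_width_km(830, 4.1), 1))
```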
Satellites
The IRS-1C satellite has a pushbroom sensor consisting of three individual CCDs. The ground resolution of the imagery ranges from 5 to 6 meters. The focal length of the optic is approximately 982 mm, and the pixel size of the CCD is 7 microns. The images captured by the three CCDs are processed independently or merged into one image, and system corrected to account for the systematic errors associated with the sensor.
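The 5-6 m figure is consistent with the simple pinhole relation GSD = altitude × pixel size / focal length. The sketch below assumes an IRS-1C orbital altitude of roughly 817 km, which is not stated in the slide.

```python
def ground_sample_distance(altitude_m, pixel_size_m, focal_length_m):
    """Ground footprint of one detector element at nadir (pinhole model)."""
    return altitude_m * pixel_size_m / focal_length_m

# IRS-1C: ~817 km altitude (assumed), 7 micron pixels, ~0.982 m focal length
print(round(ground_sample_distance(817e3, 7e-6, 0.982), 2))
```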
Image acquisition methodology
The satellites collect images by scanning along a line called the scan line. For each line scanned by the sensors there is a unique perspective center and a unique set of rotation angles. The location of the perspective center relative to the scan line is constant for each line, as the interior orientation parameters and focal length are constant for a given scan line. Since the motion of the satellite is smooth and practically linear over the length of a scene, the perspective centers of all scan lines of a scene are assumed to lie along a smooth line.
Rotation angles
Perspective Centre
Satellite Scene
The satellite exposure station is defined as the perspective center, in ground coordinates, for the center scan line. The image captured by the satellite is called a scene. A SPOT Pan 1A scene is composed of 6,000 lines, each consisting of 6,000 pixels. Each line is exposed for 1.5 milliseconds, so it takes 9 seconds to scan the entire scene. A single pixel in the image records the light detected by one of the 6,000 light-sensitive elements in the camera; the physical dimension of a single CCD element is 13 × 13 microns. Each pixel is defined by file coordinates: a column number and a row number. The center of the scene is the center pixel of the center scan line, and it is the origin of the image coordinate system. The following figure depicts this:
Satellite Scene
A = origin of file coordinates
A-XF, A-YF = file coordinate axes
C = origin of image coordinates (center of scene)
C-x, C-y = image coordinate axes
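The shift between file coordinates and image coordinates described above can be sketched as follows. The sign conventions (y increasing upward, rows counting downward) are assumed, since the slide's figure is not reproduced here.

```python
def file_to_image(col, row, n_cols=6000, n_rows=6000):
    """Convert file coordinates (origin A at a corner, row axis pointing
    down) to image coordinates (origin C at the scene center)."""
    x = col - (n_cols - 1) / 2.0
    y = (n_rows - 1) / 2.0 - row  # assumed: y points up, rows count down
    return x, y
```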
Satellite Data
The header of the data file of a SPOT scene contains ephemeris data, which provides information about the recording of the data and the satellite orbit. The data provided are:
Position of the satellite in geocentric coordinates (with the origin at the center of the Earth), to the nearest second
Velocity vector of the camera
Rotational velocity of the camera
Attitude changes of the camera
Exact time of exposure of the center scan line of the scene
These data are converted to a local ground system for the triangulation.
Orientation Angle and Velocity Vector
The orientation angle of a satellite scene is the angle between a perpendicular to the center scan line and the north direction.
Velocity vector: The spatial motion of the satellite is described by the velocity vector. The real motion of the satellite above the ground is further distorted by the Earth's rotation. The velocity vector of a satellite is the satellite's velocity measured as a vector through a point on the spheroid. It provides a way to represent the satellite's speed as if the imaged area were flat instead of a curved surface.
Orientation Angle and Velocity Vector
The adjacent diagram depicts the relation between the orientation angle and the velocity vector of a single scene.
O = orientation angle
C = center of the scene
V = velocity vector
Satellite topographic mapping
Stereo satellite images are captured either consecutively by a single satellite along the same orbit within a few seconds (along-track imaging) or by the same satellite (or different satellites) from different orbits on different dates (across-track imaging). The base-to-height (B/H) ratio should be close to 1 for a high-quality stereo model with high elevation accuracy.
Satellites: Cartosat-1, CHRIS/PROBA, EROS-A, IRS, IKONOS, MOMS-02, SPOT, and Terra ASTER
Satellite topographic mapping
Stereo data can be collected on the same orbit or on different orbits (beware of changes between acquisition dates). The satellite may have to be rotated to point the sensor correctly. The optimum base-to-height ratio is 0.6 to 1.0. Atmospheric effects (refraction, optical thickness) become more significant at higher look angles.
Satellite topographic mapping
Light rays in a bundle defined by the SPOT sensor are almost parallel, lessening the importance of the satellite's position; the inclination angles (incidence angles) of the cameras onboard the satellite become the critical data. Inclination is the angle between a vertical on the ground at the center of the scene and a light ray from the exposure station; it defines the degree of off-nadir viewing when the scene was recorded. The cameras can be tilted in increments of 0.6 degrees, up to a maximum of 27 degrees, to the east (negative inclination) or west (positive inclination). A stereo scene is achieved when two images of the same area are acquired on different days from different orbits, one taken east of the other; for this, there must be a significant difference in the inclination angles.
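For an across-track stereo pair, the B/H ratio follows directly from the two inclination angles. The sketch below assumes symmetric viewing geometry over flat terrain.

```python
import math

def base_to_height_ratio(incl_east_deg, incl_west_deg):
    """B/H for an across-track stereo pair built from two off-nadir scenes,
    assuming flat terrain and symmetric viewing geometry."""
    return (math.tan(math.radians(incl_east_deg))
            + math.tan(math.radians(incl_west_deg)))

# Two SPOT scenes tilted 27 degrees east and west give B/H close to 1
print(round(base_to_height_ratio(27, 27), 2))
```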
Inclination Angle of a Stereoscene
C = center of the scene
I- = eastward inclination
I+ = westward inclination
O1, O2 = exposure stations (perspective centers of the imagery)
Nadir and Off-Nadir
The scanner can produce a nadir view. Nadir is the point directly below the camera. SPOT has off-nadir viewing capability. Off-nadir refers to any point that is not directly beneath the satellite but off at an angle (that is, east or west of the nadir), as shown in the figure.
Tri-stereo Imagery
The Pleiades-1A and Pleiades-1B satellite sensors can be programmed to collect tri-stereo imagery for the production of high-quality 1 m-2 m DEMs for 3D urban and terrain modeling. Tri-stereo acquisitions reveal elevations that would otherwise remain hidden in steep terrain or in the urban canyons of dense built-up areas.
DATA PROCESSING
MODELLING SATELLITE SENSOR ORIENTATION
Defining the camera or sensor model involves establishing the geometry of the camera/sensor as it existed at the time of image acquisition. Modelling satellite sensor motion and orientation in space is one of the preliminary tasks that must be performed before satellite image data can be used for any application. The orientation of the images is a fundamental step, and its accuracy is a crucial issue in the evaluation of the entire system. For pushbroom sensors, triangulation and photogrammetric point determination differ considerably from the standard approaches and require special investigation of the sensor geometry and the acquisition mode. For geo-referencing of imagery acquired by pushbroom sensors, several geometric models have been developed. The general mathematical models used for satellite sensor modelling are:
Rigorous or physical sensor model
Rational Function Model (RFM)
Direct Linear Transformation (DLT)
3D polynomial model
3D affine model
MODELLING SATELLITE SENSOR ORIENTATION
The physical sensor model aims to describe the relationship between image and ground coordinates according to the physical properties of the image acquisition. The physical (rigorous) sensor model can be formulated from the collinearity equations, which describe the relationship between a point on the ground and its corresponding location on the image. With linear array sensors (as on the IKONOS, QuickBird, IRS, and SPOT satellites), the collinearity equations must be written for every scanned line of the image.
The Rational Function Model (RFM) is an empirical mathematical model developed to approximate the relationship between image and object space. A number of GCPs are normally used to improve the accuracy obtained by the RFM.
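To make the RFM idea concrete: each image coordinate is a ratio of two polynomials in the ground coordinates. The sketch below uses first-order polynomials only; operational RPC files carry third-order polynomials in normalized coordinates.

```python
def rfm_coordinate(lat, lon, h, num, den):
    """Evaluate one rational-function coordinate as a ratio of polynomials.
    `num` and `den` hold first-order coefficients (a0..a3) only, as an
    illustration; real RPCs use third-order, normalized polynomials."""
    def poly(c, X, Y, Z):
        return c[0] + c[1] * X + c[2] * Y + c[3] * Z
    return poly(num, lat, lon, h) / poly(den, lat, lon, h)
```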
MODELLING SATELLITE SENSOR ORIENTATION
The 3D polynomial model can also be used to model the relationship between image and object space. Results show that the choice of polynomial order depends on the type of terrain, the available number of GCPs, and the stability of the satellite sensor in space. The 3D affine model is obtained by limiting the polynomial model to the first order. It represents the relationship between image and object space with high integrity, especially when applied to data from highly stable satellite sensors.
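A first-order (affine) model can be estimated from ground control points by ordinary least squares. The sketch below is a minimal illustration of that fitting step, not a production implementation.

```python
import numpy as np

def fit_3d_affine(gcps_xyz, img_xy):
    """Fit x = a0 + a1*X + a2*Y + a3*Z (and likewise for y) to ground
    control points by least squares; needs at least 4 well-spread GCPs.
    Returns a (4, 2) array: columns are the x and y coefficient sets."""
    A = np.hstack([np.ones((len(gcps_xyz), 1)), np.asarray(gcps_xyz, float)])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(img_xy, float), rcond=None)
    return coeffs
```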
Comparison of models
MODELLING SATELLITE SENSOR ORIENTATION
Rigorous modeling is the most accurate of all because it takes into consideration the actual physical process of image capture. It requires both interior orientation and exterior orientation parameters. Interior orientation parameters are generally available through a calibration process.
Interior Orientation
Interior orientation refers to the calibration of the sensor elements and of the system behind the image plane. Satellite sensors such as SPOT, IRS-1C, and other generic pushbroom sensors use a separate perspective center for each scan line, so the process is referred to as internal sensor modeling. In a satellite image the interior orientation parameters are:
Principal point of the image
Focal length of the camera
Optics parameters
The transformation between file coordinates and image coordinates is constant.
Interior Orientation
For each scan line, a separate bundle of light rays is defined, where
Pk = image point
xk = x value of image coordinates for scan line k
f = focal length of the camera
Ok = perspective center for scan line k, aligned along the orbit
PPk = principal point for scan line k
lk = light rays for scan line k, bundled at perspective center Ok
Exterior Orientation
The exterior orientation describes the location and orientation of the bundle of rays in the object coordinate system with 6 parameters: the projection center coordinates (X0, Y0, Z0) and the rotations about the three axes (roll, pitch, and yaw). Exterior orientation thus comprises position and attitude. On-board GPS receivers determine the satellite ephemeris, i.e. the camera position as a function of time; star trackers and gyros determine the camera attitude as a function of time.
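The three attitude angles define a rotation matrix for the bundle of rays. The axis order and sign conventions below are one common choice and are assumed, since the slide does not fix them.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """R = Rz(yaw) @ Ry(pitch) @ Rx(roll), angles in radians.
    Photogrammetric texts differ on axis order and sign; this is
    one common convention, shown for illustration."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    Rx = [[1, 0, 0], [0, cr, -sr], [0, sr, cr]]
    Ry = [[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]
    Rz = [[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]]
    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(3))
                 for j in range(3)] for i in range(3)]
    return matmul(Rz, matmul(Ry, Rx))
```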
Exterior Orientation Exterior orientation parameters are:
Perspective center of the center scan line
Change of the perspective centers along the orbit
Rotation of the center scan line: roll, pitch, and yaw
Change of these angles along the orbit
Triangulation
Satellite block triangulation provides a model for calculating the spatial relationship between a satellite sensor and the ground coordinate system for each line of data. This relationship is expressed as the exterior orientation. In addition to fitting the bundle of light rays to the known points, satellite block triangulation also accounts for the motion of the satellite: once the exterior orientation of the center scan line is determined, the exterior orientation of any other scan line is calculated from the distance of that scan line from the center and the changes of the perspective center location and rotation angles.
Triangulation
Modified collinearity equations are used to compute the exterior orientation parameters associated with the respective scan lines in the satellite scenes. Each scan line has a unique perspective center and individual rotation angles; when the satellite moves from one scan line to the next, these parameters change. Owing to the smooth motion of the satellite in orbit, the changes are small and can be modeled by low-order polynomial functions. Both GCPs and tie points can be used for satellite block triangulation of a stereo scene. For triangulating a single scene, only GCPs are used; in this case, space resection techniques are used to compute the exterior orientation parameters associated with the satellite. A minimum of six GCPs is necessary, and ten or more GCPs are recommended to obtain a good triangulation result. The effects of the Earth's curvature are significant and are removed during the block triangulation procedure.
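The low-order polynomial modeling of the per-line changes can be sketched as follows. A first-order (linear) variation is shown for brevity; real adjustments may include quadratic terms.

```python
def exterior_orientation(line, center_params, rates):
    """Exterior orientation of scan line `line` (center line = 0), modeled
    as a first-order polynomial in the distance from the center line.
    `center_params` and `rates` are dicts over X0, Y0, Z0, roll, pitch, yaw;
    the parameter names here are illustrative."""
    return {k: center_params[k] + rates[k] * line for k in center_params}
```

For example, `exterior_orientation(10, {"Z0": 830000.0}, {"Z0": -0.5})` evaluates the altitude 10 lines away from the scene center.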
Triangulation
Ideal point distribution over a satellite scene for triangulation
Orthorectification
Orthorectification is the process of reducing the geometric errors inherent in photography and imagery. General sources of geometric error:
• camera and sensor orientation
• systematic errors of the camera/sensor
• topographic relief displacement
• Earth curvature
Least squares adjustment techniques applied during block triangulation minimize the errors associated with camera or sensor instability.
Orthorectification
Additionally, the use of self-calibrating bundle adjustment (SCBA) techniques along with additional parameter (AP) modeling accounts for the systematic errors associated with the camera's interior geometry. The effects of topographic relief displacement are accounted for by using a DEM during the orthorectification procedure.
Orthorectification
The orthorectification process takes the raw digital imagery and applies a DEM and the triangulation results to create an orthorectified image. Once an orthorectified image is created, each pixel within the image possesses geometric fidelity: measurements taken off an orthorectified image represent the corresponding measurements as if they were taken on the Earth's surface. The resulting image is known as a digital orthoimage.
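The per-pixel logic can be sketched as a backward (indirect) resampling loop. Here `ground_to_image` stands in for the sensor model obtained from triangulation, ground coordinates are taken equal to output grid indices for brevity, and nearest-neighbour resampling is used; all of these are simplifying assumptions.

```python
def orthorectify(dem, ground_to_image, image, out_shape):
    """Backward orthorectification sketch: for every output cell, look up
    the terrain height in the DEM, project the ground point into the raw
    image with the sensor model, and resample (nearest neighbour).
    ground_to_image(X, Y, Z) -> (col, row) stands in for the sensor model."""
    ortho = [[0] * out_shape[1] for _ in range(out_shape[0])]
    for i in range(out_shape[0]):
        for j in range(out_shape[1]):
            z = dem[i][j]
            col, row = ground_to_image(j, i, z)
            c, r = int(round(col)), int(round(row))
            if 0 <= r < len(image) and 0 <= c < len(image[0]):
                ortho[i][j] = image[r][c]
    return ortho
```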
DATA REPRESENTATION Digital Elevation Model Orthophoto
Digital Elevation Model
A digital representation of the elevations in a region is commonly referred to as a digital elevation model (DEM). When the elevations refer to the earth's terrain, it is appropriately called a digital terrain model (DTM). When it includes the elevations of surfaces at or above the terrain (tree crowns, rooftops, etc.), it is referred to as a digital surface model (DSM).
Digital Elevation Model
The procedure for DEM generation from stereoscopic views can be summarized as follows (Shin et al., 2003):
Feature selection in one scene of a stereo-pair: selected features should correspond to an interesting phenomenon in the scene and/or the object space.
Identification of the conjugate feature in the other scene: this is known as the matching/correspondence problem in the photogrammetric and computer vision communities.
Intersection: matched points in the stereo-scenes undergo an intersection procedure to produce the ground coordinates of the corresponding object points; the intersection process involves the mathematical model relating the scene and ground coordinates.
Point densification: high-density elevation data are generated within the area under consideration by interpolating between the points derived in the previous step.
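The intersection step above can be sketched as finding the midpoint of the shortest segment between the two reconstructed rays, a simple stand-in for a full least-squares solution. Ray origins and directions would come from the exterior orientation of each scene.

```python
def intersect_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two image rays: the
    'intersection' that turns a matched point pair into ground coordinates.
    o1, o2 are ray origins (perspective centers); d1, d2 are directions."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    w = sub(o2, o1)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    e, f = dot(d1, w), dot(d2, w)
    denom = a * c - b * b          # zero only for parallel rays
    t1 = (e * c - b * f) / denom
    t2 = (b * e - a * f) / denom
    p1 = [o + t1 * d for o, d in zip(o1, d1)]
    p2 = [o + t2 * d for o, d in zip(o2, d2)]
    return [(u + v) / 2 for u, v in zip(p1, p2)]
```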
Use of DEM Common uses of Elevation Models include:
Extracting terrain parameters
Volume calculations
Modelling water flow or mass movement (for example, landslides)
Creation of relief maps
Rendering of 3D visualizations
Creation of physical models (including raised-relief maps)
Orthorectification
Reduction (terrain correction) of gravity measurements
Terrain analysis in geomorphology and physical geography