Photogrammetry & Remote Sensing

1 Photogrammetry & Remote Sensing
Faculty of Applied Engineering and Urban Planning, Civil Engineering Department. Photogrammetry & Remote Sensing, 2nd Semester 2008/2009

2 Introduction

3 What is Remote Sensing (RS)?
Formal and comprehensive definition “The acquisition and measurement of data/information on some property(ies) of a phenomenon, object, or material by a recording device not in physical, intimate contact with the feature(s) under surveillance; techniques involve amassing knowledge pertinent to environments by measuring force fields, electromagnetic radiation, or acoustic energy employing cameras, radiometers and scanners, lasers, radio frequency receivers, radar systems, sonar, thermal devices, seismographs, magnetometers, gravimeters, and other instruments. ”

4 What is Remote Sensing (RS)?
Remote Sensing involves gathering data and information about the physical "world" by detecting and measuring radiation, particles, and fields associated with objects located beyond the immediate vicinity of the sensor device(s).

5 What is Remote Sensing (RS)?
Remote Sensing is a technology for sampling electromagnetic radiation to acquire and interpret non-immediate geospatial data from which to extract information about features, objects, and classes on the Earth's land surface, oceans, and atmosphere (and, where applicable, on the exteriors of other bodies in the solar system, or, in the broadest framework, celestial bodies such as stars and galaxies).

6 What is Photogrammetry
”The science or art of obtaining reliable measurements by means of photographs.” ”Photogrammetry is the art, science, and technology of obtaining reliable information about physical objects and the environment through the processes of recording, measuring, and interpreting photographic images and patterns of electromagnetic radiant energy and other phenomena.” (ASPRS, 1980)

7 What is Photogrammetry
Definitions (2). Analog Photogrammetry: uses optical, mechanical, and electronic components, and the images are hardcopies; it re-creates a 3D model for measurements in 3D space. Analytical Photogrammetry: the 3D model is mathematical (not physically re-created), and measurements are made in the 2D images. Digital (or Softcopy) Photogrammetry: analytical solutions applied to digital images; it can also incorporate computer vision and digital image processing techniques. "Softcopy" refers to the display of a digital image, as opposed to a "hardcopy" (a physical, tangible print).

8 Photograph vs. Image
Photograph: a scene detected and recorded on film. Chemical reactions on a light-sensitive film detect the intensity of the incoming energy. Simple, cheap, well known, with a high degree of spatial detail; but it can sense only in the 0.3 – 0.9 μm wavelengths and requires manual interpretation.
Image: a scene detected electronically. The sensor generates an electrical signal proportional to the incoming energy. It can sense in many wavelengths, and the data can be easily converted into digital form for automated processing; but the sensors are complex and expensive.

9 Relationships of the Mapping Sciences as they relate to Mathematics and Logic, and the Physical, Biological, and Social Sciences. Photogrammetry is a subset of a much larger discipline called remote sensing.

10 A Brief History of Photogrammetry & RS
1851: Only a decade after the invention of the daguerreotype by Daguerre and Niépce, the French officer Aimé Laussedat develops the first photogrammetric devices and methods. He is regarded as the initiator of photogrammetry. 1858: The German architect A. Meydenbauer develops photogrammetric techniques for the documentation of buildings and founds the first photogrammetric institute in 1885 (Royal Prussian Photogrammetric Institute). 1866: The Viennese physicist Ernst Mach publishes the idea of using the stereoscope to estimate volumetric measures. 1885: The ancient ruins of Persepolis become the first archaeological object recorded photogrammetrically. 1889: The first German manual of photogrammetry is published by C. Koppe. 1896: Édouard Gaston Daniel Deville presents the first stereoscopic instrument for vectorized mapping. 1897/98: Theodor Scheimpflug invents the double projection. 1901: Pulfrich creates the first stereocomparator and revolutionizes mapping from stereopairs. 1903: Theodor Scheimpflug invents the Perspektograph, an instrument for optical rectification.

11 A Brief History of Photogrammetry & RS
1910: The ISP (International Society for Photogrammetry), now ISPRS, is founded by E. Dolezal in Austria. 1911: The Austrian Th. Scheimpflug finds a way to create rectified photographs. He is considered the initiator of aerial photogrammetry, since he was the first to succeed in applying photogrammetric principles to aerial photographs. 1913: The first congress of the ISP is held in Vienna. Until 1945: development and improvement of measuring (metric) cameras and analogue plotters. 1964: First architectural tests with the new stereometric camera system invented by Carl Zeiss, Oberkochen, and Hans Foramitti, Vienna. 1964: Charte de Venise (Venice Charter). 1968: The first international symposium on photogrammetric applications to historical monuments is held in Paris (Saint-Mandé).

12 A Brief History of Photogrammetry & RS
1970: Constitution of CIPA (Comité International de Photogrammétrie Architecturale) as one of the international specialized committees of ICOMOS (International Council on Monuments and Sites), in cooperation with ISPRS. The two main activists were Maurice Carbonnell, France, and Hans Foramitti, Austria. 1970s: Analytical plotters, first used by U. Helava in 1957, revolutionize photogrammetry. They allow more complex methods to be applied: aerotriangulation, bundle adjustment, the use of amateur cameras, etc. 1980s: Due to improvements in computer hardware and software, digital photogrammetry gains more and more importance. 1996: 83 years after its first congress, the ISPRS returns to Vienna, the city where it was founded. 1999: The film Fight Club (starring Brad Pitt) is an excellent example of the use of photogrammetry in film, where it is used to combine live action with computer-generated imagery in movie post-production. 2005: The Topcon PI-3000 Image Station is launched.

13 Short History (1) Analogue Photogrammetry
(A purely optical-mechanical approach): the large, complicated, and expensive instruments could be handled only by experienced photogrammetric operators. Stereoscopic transfer instruments: stereoplotters that use diapositives.

14 Short History (2) Analytical Photogrammetry
The orientation is reconstructed algorithmically rather than by analogue means. The equipment became significantly smaller, cheaper, and easier to handle, with servo motors providing the ability to position the photos directly by computer. Automated (analytical) stereoplotter.

15 Short History (3) Digital Photogrammetry
(1980 – now): uses digital photos; the work is done directly on the computer.

16 Photogrammetry: Aerial Photogrammetry, Terrestrial Photogrammetry

17 Chapter (1) Concept and Fundamentals of Remote Sensing

18 Part (1)

19

20 Fundamental Principles of Electromagnetic Radiation
Wave Theory: c = νλ, where: c = speed of light = 3 × 10⁸ m s⁻¹; ν = frequency (s⁻¹, cycles/s, or Hz); λ = wavelength (m)

21 Finding Frequency from Wavelength
Given: λ = 0.55 μm [green light]. Find: ν. Solution: c = νλ, so ν = c / λ = (3 × 10⁸ m s⁻¹) / (0.55 × 10⁻⁶ m) = 5.45 × 10¹⁴ s⁻¹

22 Finding Wavelength from Frequency
Given: ν = 6000 MHz = 6000 × 10⁶ s⁻¹. Find: λ. Solution: c = νλ, so λ = c / ν = (3 × 10⁸ m s⁻¹) / (6 × 10⁹ s⁻¹) = 0.05 m = 5 cm [microwave, or radar wavelength]
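The two worked examples above can be checked with a few lines of code. This is an illustrative sketch (not part of the slides), using the same rounded value c = 3 × 10⁸ m/s:

```python
# Wavelength/frequency conversions via c = nu * lambda.
C = 3e8  # speed of light, m/s (rounded, as in the slides)

def frequency(wavelength_m):
    """Frequency nu (Hz) from wavelength lambda (m)."""
    return C / wavelength_m

def wavelength(frequency_hz):
    """Wavelength lambda (m) from frequency nu (Hz)."""
    return C / frequency_hz

nu_green = frequency(0.55e-6)   # green light -> ~5.45e14 Hz
lam_radar = wavelength(6000e6)  # 6000 MHz -> 0.05 m = 5 cm
```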

23 Particle Theory Q = hν, where: Q = energy of a quantum, joules (J);
h = Planck's constant, 6.626 × 10⁻³⁴ J s; ν = frequency (s⁻¹)

24 Relating Wave and Particle Theory
Q = hν and c = νλ, so ν = c / λ; therefore, Q = hc / λ. The longer the wavelength, the lower the energy content.
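The relation Q = hc/λ can be put into numbers (a sketch for illustration; the wavelengths chosen are examples, not slide data):

```python
# Photon energy Q = h * c / lambda (particle theory).
H = 6.626e-34  # Planck's constant, J s
C = 3e8        # speed of light, m/s (rounded, as in the slides)

def photon_energy(wavelength_m):
    """Energy of one quantum (J) at the given wavelength (m)."""
    return H * C / wavelength_m

q_blue = photon_energy(0.45e-6)   # visible blue light
q_thermal = photon_energy(10e-6)  # thermal infrared
# Longer wavelength -> lower energy per quantum:
assert q_blue > q_thermal
```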

25 The Electromagnetic Spectrum
near-IR = 0.7 – 1.3 μm; mid-IR = 1.3 – 3.0 μm; thermal IR = 3.0 – 14 μm. Earth during daytime: more reflected sunlight at the shorter wavelengths, more emitted energy in the thermal IR. Figure 1.3 The electromagnetic spectrum.

26 Visible Light

27 Nominal Regions of the Spectrum
Ultraviolet: 0.3 – 0.4 μm Visible: 0.4 – 0.7 μm Blue: 0.4 – 0.5 μm Green: 0.5 – 0.6 μm Red: 0.6 – 0.7 μm Near Infrared: 0.7 – 1.3 μm Photographic Infrared: 0.7 – 0.9 μm Mid Infrared: 1.3 – 3.0 μm Thermal Infrared: 3.0 – 14 μm Microwave (Radar): 1 mm – 1 m

28 Unit Prefix Notation. Multiplier, prefix, example: 10³ = kilo (k), kilometer; 10⁻³ = milli (m), millimeter; 10⁻⁶ = micro (μ), micrometer; 10⁻⁹ = nano (n), nanometer. See also inside back cover of textbook. Note: one "micron" = one "micrometer"

29 Sources of Electromagnetic Radiation
Figure 1.4 Spectral distribution of energy radiated from blackbodies of various temperatures.

30 The Stefan-Boltzmann Law
M = σT⁴, where: M = total radiant exitance, or emitted energy (W m⁻²); σ = Stefan-Boltzmann constant (5.6697 × 10⁻⁸ W m⁻² K⁻⁴); T = temperature (K). Note: All temperatures must be expressed in kelvins: K = °C + 273
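As a quick numeric illustration of the law (a sketch, not from the slides; the two temperatures are typical round values for the earth's surface and the sun):

```python
# Total radiant exitance M = sigma * T**4 for a blackbody.
SIGMA = 5.6697e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(temp_k):
    """Emitted energy M (W/m^2) of a blackbody at temp_k kelvins."""
    return SIGMA * temp_k ** 4

m_earth = radiant_exitance(300)  # ambient earth, ~459 W/m^2
m_sun = radiant_exitance(6000)   # sun, ~7.3e7 W/m^2
```

Note the fourth-power dependence: a 20-fold increase in temperature raises the exitance by 20⁴ = 160,000 times.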

31 Wien’s Displacement Law
The dominant wavelength, i.e. the wavelength at which a blackbody radiation curve reaches its maximum, is related to its temperature by Wien's displacement law: λM = A / T, where: λM = wavelength of maximum radiant exitance (μm); A = constant (2898 μm K); T = temperature (K). Note: All temperatures must be expressed in kelvins: K = °C + 273
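Wien's law can be evaluated directly; this sketch (assumed example, not course code) uses the same temperatures as the Stefan-Boltzmann discussion:

```python
# Wien's displacement law: lambda_max = A / T.
A = 2898.0  # constant, um K

def peak_wavelength_um(temp_k):
    """Wavelength of maximum radiant exitance, in micrometers."""
    return A / temp_k

lam_sun = peak_wavelength_um(6000)   # ~0.48 um, in the visible
lam_earth = peak_wavelength_um(300)  # ~9.7 um, in the thermal IR
```

The two results match the next slides: the sun's peak falls near 0.5 μm, where our eyes are sensitive, while ambient earth features peak in the thermal infrared.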

32 Figure 1.4 Spectral distribution of energy radiated from blackbodies of various temperatures.

33 By comparison, the sun has a much higher energy peak, occurring at about 0.5 μm; our eye is sensitive to this magnitude and wavelength. When the sun is present, we can observe earth features by virtue of reflected solar energy. The longer wavelengths emitted by ambient earth features can be observed only with a non-photographic sensing system. The general dividing line between reflected and emitted IR wavelengths is approximately 3 μm: below this wavelength reflected energy predominates; above it, emitted energy prevails. Certain sensors, such as radar systems, supply their own source of energy to illuminate features of interest. These systems are termed active systems, in contrast to passive systems that sense naturally available energy. A very common example of an active system is a camera using a flash; the same camera used in sunlight becomes a passive sensor.

34 Atmospheric Effects on Electromagnetic Radiation

35 Figure 1.1 Electromagnetic remote sensing of earth resources.

Because of the varied nature of atmospheric effects, we treat this subject on a sensor-by-sensor basis in other chapters. Here we introduce how the atmosphere affects the intensity and spectral composition of the radiation available to any sensing system. These effects are caused principally through the mechanisms of atmospheric scattering and absorption.

37 Atmospheric Effects
Scattering: • Rayleigh scatter • Mie scatter • Nonselective scatter
Absorption

38 Rayleigh Scatter Scattering by tiny particles (e.g., atmospheric molecules) with diameters much smaller than the wavelength involved. Effects are inversely proportional to the fourth power of wavelength (scatter proportional to 1/λ⁴). Thus, blue wavelengths (0.4 to 0.5 μm) are scattered more than near-infrared wavelengths (0.7 to 0.9 μm). It occurs even on a "clear day" and is the primary cause of "haze" in aerial photographs.
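The 1/λ⁴ dependence is easy to quantify; this is an illustrative sketch (the two wavelengths are picked from the ranges quoted above):

```python
# Relative Rayleigh scattering, proportional to 1 / lambda**4.
def relative_rayleigh(wavelength_um):
    return 1.0 / wavelength_um ** 4

blue = relative_rayleigh(0.4)  # blue light
nir = relative_rayleigh(0.8)   # near-infrared
ratio = blue / nir             # (0.8 / 0.4)**4 = 16
```

Doubling the wavelength cuts Rayleigh scatter by a factor of 16, which is consistent with the slide's note that this scatter is the primary cause of haze at the blue end of the spectrum.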

39 Mie Scatter Scattering by particles with diameters approximately equal to the wavelength being sensed. Water vapor and dust are major causes of Mie scatter. This type tends to influence longer wavelengths than Rayleigh scatter does. One normally avoids taking aerial photographs when there is significant Mie scatter, although Mie scatter can be significant even under slightly overcast skies.

40 Nonselective Scatter Scattering by particles whose diameters are much larger than the wavelengths being sensed. Called "nonselective" because it does not vary with wavelength: in the visible range, nearly equal quantities of blue, green, and red light are scattered, which is why fog and clouds appear white. Water droplets (clouds) cause nonselective scatter. Normally avoid taking aerial photographs when there is significant nonselective scatter.

41

42 Atmospheric Absorption
Atmospheric absorption results in the effective loss of energy to atmospheric constituents: absorption of energy at specific wavelengths. The primary absorbers are water vapor, carbon dioxide, and ozone. Wavelength ranges in which the atmosphere is particularly transmissive are referred to as "atmospheric windows".

43 Atmospheric Absorption

44 Figure 1.5 Spectral characteristics of (a) energy sources, (b) atmospheric transmittance, and (c) common remote sensing systems.

45 Electromagnetic Radiation Interactions with Surface Features
Figure 1.1 Electromagnetic remote sensing of earth resources.

46 Figure 1.6 Basic interactions between electromagnetic energy and an earth surface feature.

47 Energy Balance: Incident energy = reflected energy + absorbed energy + transmitted energy
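The balance above can be sketched as a check (hypothetical values in arbitrary energy units, for illustration only):

```python
# Energy balance at a surface feature:
# incident = reflected + absorbed + transmitted.
def transmitted(incident, reflected, absorbed):
    """Transmitted component implied by the energy balance."""
    return incident - reflected - absorbed

t = transmitted(100.0, 30.0, 50.0)  # remaining transmitted energy
reflectance = 30.0 / 100.0          # fraction of incident energy reflected
```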

48 Energy Balance: Reflectance, Absorptance, Transmittance

49 Reflectance

50 Types of Reflectors Figure 1.7 Specular versus diffuse reflectance.

51 Spectral Response Patterns

52 Part (2)

