
1 Human Role in Lunar Landing
Charles M. Oman, Ph.D., Director, Man Vehicle Laboratory, Massachusetts Institute of Technology; Sensorimotor Adaptation Research Team Leader, National Space Biomedical Research Institute
Go for Lunar Landing: From Terminal Descent to Touchdown, Tempe, Arizona, March 2008

2 Human Role in Lunar Landing
Technology has improved since Apollo, but the human brain has not. What is the proper allocation of tasks between human and machine?
– Apollo 11 workload was “13 on a scale of 10”.
– Crew needs support for autonomous, time-critical decision making, e.g. landing site re-designation, abort, failure diagnosis.
– Should the pilot have final control authority, or just a vote?
– Will pilots trust automation?
– Automation surprise: “Why did it do that?…”
– Are modes intuitive? “What’s it doing now? What next?”
– Is the automation clumsy to re-program?
– Can the crew gracefully revert to lower automation levels, or must they revert to full manual?

3 Fully automatic landings to within 10 m of a touchdown point at well-surveyed sites will be technically possible. Safety and certification cost considerations dictate a “capability for manual control of flight path and attitude” (NASA HRR A:34495). Crew will visually evaluate the touchdown area, then approve, re-designate, or fly manually. Manual flight will likely remain the operational baseline.
– As on Shuttle: “Train as you fly; fly as you train.”
– Astronauts are pilots and explorers, not cargo.
Crew will have HUDs plus cockpit map, profile, and perspective terrain displays of 3-D DEM terrain data and trajectory to enhance situation awareness and prevent geographic disorientation. Flight path predictors, recommended touchdown zones, and fuel range circles will aid decision making.

4 0.5 m terrain features aren’t visible to the naked eye until ~4000 feet away (5 arc min vs. .5 for lidar). Humans have difficulty judging surface slope, smoothness, shape, and size:
– Shape-from-shading “crater illusion”: craters seem convex rather than concave when viewed looking down-sun.
– Shadow size depends on sun elevation.
– Surface slope may be misjudged due to non-Lambertian regolith reflectance and weak gravitational “down” cues.
– Crater, boulder, and rock size is ambiguous – they occur in all sizes.
– Distance is harder to judge due to the lack of atmospheric light scattering. Distant objects have unnaturally high contrast and darker color.
– Does Earthshine alone provide sufficient illumination?
– Sensor images may be difficult to interpret.
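The visual-angle arithmetic behind the first point above can be sketched in a few lines. The ~1 arc-min acuity limit and the 5 arc-min recognition threshold used here are illustrative textbook-style assumptions, not the slide's exact model:

```python
import math

ARCMIN_PER_RAD = 60 * 180 / math.pi   # ≈ 3437.75 arc minutes per radian
FT_PER_M = 3.28084

def subtended_arcmin(size_m: float, distance_m: float) -> float:
    """Visual angle, in arc minutes, of a feature of given size and distance."""
    return math.atan2(size_m, distance_m) * ARCMIN_PER_RAD

def range_for_threshold_ft(size_m: float, threshold_arcmin: float) -> float:
    """Distance (feet) at which the feature shrinks to a given angular threshold."""
    distance_m = size_m / math.tan(threshold_arcmin / ARCMIN_PER_RAD)
    return distance_m * FT_PER_M

# A 0.5 m rock seen from 4000 ft subtends only a couple of arc minutes,
# near the eye's classical ~1 arc-min acuity limit:
print(round(subtended_arcmin(0.5, 4000 / FT_PER_M), 2))
# Range at which the same rock subtends an assumed 5 arc-min recognition threshold:
print(round(range_for_threshold_ft(0.5, 5.0)))
```

At descent speeds this leaves the crew little time between a hazard becoming resolvable and touchdown, which is the point of the slide.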

5 Handling qualities will be superior to the Apollo LM. Translations require large vehicle rotations. When visual or vestibular cues are ambiguous, “sensory” PIOs are possible. During final descent:
– Crew cannot see terrain directly beneath. Apollo-style slow forward translation keeps the touchdown point in view.
– Dust grayout below ’ reduces visual altitude, attitude, and translation cues (e.g. Apollo 12 and 15; helicopter “brownout” accidents).
– Streaming dust can create a visual illusion of backward movement.
– The LLTV did not model the lunar visual environment.
Conclusions:
– Early human-in-the-loop simulation is critical for automation development. It reduces the subsequent need to “train around” design deficiencies!
– Simulators must accurately model both visual and motion cues. Research is needed on e.g. simulator dust models, motion washout, and sensory PIOs.
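The “motion washout” mentioned in the last bullet is the standard trick for cueing motion on a travel-limited platform: reproduce acceleration onsets, then wash the sustained component out so the platform drifts back to neutral. A minimal one-channel sketch with illustrative parameters (a real classical-washout algorithm adds tilt coordination and rate limits):

```python
def washout_highpass(signal, dt=0.01, tau=2.0):
    """First-order high-pass 'washout' filter: passes acceleration onsets,
    attenuates sustained acceleration toward zero (illustrative parameters)."""
    a = tau / (tau + dt)
    out, prev_x, prev_y = [], 0.0, 0.0
    for x in signal:
        y = a * (prev_y + x - prev_x)
        out.append(y)
        prev_x, prev_y = x, y
    return out

# Sustained 1 m/s^2 step in commanded acceleration:
step = [0.0] * 10 + [1.0] * 990
cue = washout_highpass(step)
print(round(cue[10], 3))   # onset is passed nearly intact
print(round(cue[-1], 3))   # sustained portion washes out toward zero
```

The mismatch between washed-out platform motion and sustained visual motion is exactly the kind of cue conflict the slide flags as a research topic for sensory PIOs.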

6

7 Backups

8 Distance and Slope Judgment

9 Shape from Shading
Which “crater” appears concave? Shape is inferred from shading employing a “light from above” assumption – even in orbital flight. (Oman, 2003)
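The ambiguity is not just perceptual; it is built into the physics. For a Lambertian surface, inverting the relief and mirroring the light direction yields an identical image, so shading alone cannot distinguish a crater from a mound. A minimal 1-D numerical sketch (assumed Lambertian surface, distant sun; names are illustrative):

```python
import numpy as np

def lambertian_shading(height, light):
    """Image intensity of a 1-D Lambertian surface h(x) under a distant light.
    light = (lx, lz) points from the surface toward the sun."""
    slope = np.gradient(height)
    normals = np.stack([-slope, np.ones_like(slope)])   # (2, N) unnormalized normals
    normals /= np.linalg.norm(normals, axis=0)
    l = np.asarray(light, float)
    l /= np.linalg.norm(l)
    return np.clip(normals.T @ l, 0.0, None)            # n·l, clipped at zero

x = np.linspace(-1.0, 1.0, 101)
crater = -np.exp(-8 * x**2)   # concave depression
bump = -crater                # convex mound with the same footprint

# A crater lit from the left shades identically to a bump lit from the right,
# so the brain must pick an interpretation -- typically "light from above":
same = np.allclose(lambertian_shading(crater, (-0.5, 1.0)),
                   lambertian_shading(bump, (0.5, 1.0)))
print(same)   # True
```

Looking down-sun on the Moon puts the light behind the observer, which is exactly the geometry where the “light from above” prior picks the wrong (convex) interpretation.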

10 Geographic Disorientation – Apollo 15
The A15 crew realized they weren’t heading for the planned spot, and didn’t know exactly where they were relative to any familiar landmarks. So they picked a smooth area nearby and headed for it. (Mindell, 2007)
“The problem was, when we pitched over and began to look out the window, there was nothing there!”
“I was very surprised that the general terrain was as smooth and flat as it was… there were very few craters that had any shadow at all, and very little definition.” (Dave Scott)

11 Landing Zone Assessment
Apollo goal: touchdown on < 5 deg slope, < 2 ft variations.
Perceptual limitations:
– The cognitive map includes only large landmarks. Fractal terrain is difficult to remember/recognize.
– 0.5 m landmarks become visible at ~4000 feet.
– Regolith reflectance is not Earthlike (non-Lambertian).
– Slope is difficult to judge at steep visual angles.
– Shading elevation cues are ambiguous. Light from behind/below can make craters appear convex. (Apollo 15)

12 Dust Grayout
Grayout at < ’ causes progressive loss of horizon, altitude, and position cues.
“I couldn’t tell what was underneath me; I knew it was a generally good area and I was just going to have to bite the bullet and land, because I couldn’t tell whether there was a crater down there or not. It turned out there were more craters there than we realized, either because we didn’t look before the dust started or because the dust obscured them.” – Pete Conrad, Apollo 12

13

14 Touchdown Terrain Awareness
Crew cannot see below and behind, yet must remain aware of the terrain beneath during descent. Apollo 15’s landing gear overlapped the edge of a small crater, and the descent engine bell was damaged by the crater rim.

15 STS-3 unexplained PIO

16 Manual vs. Automatic
The Constellation Lunar Lander will have autonomous landing capability – needed for uncrewed operations. NASA Spacecraft Human Rating Requirements require the capability for manual control of flight path and attitude. Apollo LMs had autoland capability, though it was never used. Why?
John Young: “Because the place we were landing was saturated in craters and the automatic system didn’t know where the heck the craters were, and I could look out the window and see them. Why trust the automation anyways? You’re responsible for the landing. You know where you want to land when you look out the window and why don’t you make sure you land there?” (Cummings et al., 2005)

17 Semi-Autonomous Option
Crew visually confirms the site and either:
– approves,
– redesignates, or
– flies manually.
ALHAT method (in development): lidar scans the terrain at 7000’, automation suggests a landing spot. (Forest et al., 2007; Brady et al., 2007)
[Figure: Apollo-style descent profile annotated “too high for window view,” “too shallow for terrain sensors,” “too far for terrain sensors,” and “too far for human eye,” with marks at 7000’ and 4000’.]
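The approve / redesignate / fly-manually choice above can be sketched as a decision loop. Everything here is a hypothetical illustration (the function names, inputs, and logic are not an ALHAT interface); the slope and roughness limits reuse the Apollo-era touchdown criteria quoted on slide 11:

```python
from dataclasses import dataclass
from enum import Enum, auto

class CrewDecision(Enum):
    APPROVE = auto()       # accept the automation's suggested site
    REDESIGNATE = auto()   # pick a different site, stay in automatic mode
    MANUAL = auto()        # take over flight path control

@dataclass
class SiteAssessment:
    slope_deg: float       # sensor-estimated surface slope
    roughness_ft: float    # sensor-estimated surface variation
    crew_confirms: bool    # crew's out-the-window visual confirmation

def crew_decision(site: SiteAssessment,
                  max_slope_deg: float = 5.0,
                  max_roughness_ft: float = 2.0) -> CrewDecision:
    """Hypothetical semi-autonomous decision logic (illustrative only)."""
    within_limits = (site.slope_deg < max_slope_deg
                     and site.roughness_ft < max_roughness_ft)
    if within_limits and site.crew_confirms:
        return CrewDecision.APPROVE
    if within_limits:
        return CrewDecision.MANUAL       # sensors and eyes disagree: pilot takes over
    return CrewDecision.REDESIGNATE      # suggested site out of limits: pick another

print(crew_decision(SiteAssessment(3.0, 1.0, True)))   # CrewDecision.APPROVE
```

The interesting design question the slide raises is exactly the disagreement branch: when the automation's site passes its own checks but the crew's view out the window says otherwise, who wins?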