Vision-based automated steering
- Model-based automated steering: control-theoretic formulation
- Vision for car following: motion and stereo cues for tracking the vehicle ahead; throttle and brake control
Vision as a sensor
Both in human and artificial vision systems:
- measurements at the look-ahead
- presence of delay in vision processing
- large field of view
- computational issues
Related Work
Driving applications:
- Prometheus project, Dickmanns
- Alvinn-RALPH, CMU (neural network based)
- Raviv, Herman, NIST (kinematic setting)
- PATH project, Ohio State
Control and Vision
Previously:
- effect of delays was not considered
- low-speed control laws are not easily transferable to high speeds
- monocular vision cues
- neural network-based approaches
Now:
- Control: explicit model of delay, model of vehicle dynamics, model of the curvature disturbance, tracking performance depending on the curvature of the road
- Vision: monocular vision for steering (model based); stereo and motion cues for car following (model free)
Car dynamics
Dynamic model of the vehicle
(Figure: front/rear tire forces Ff, Fr; distances lf, lr from the center of gravity to the axles; longitudinal and lateral velocities vx, vy; steering angle d.)
States: vy - lateral velocity, yaw rate
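The dynamic model referenced here is commonly written as a linear single-track ("bicycle") model; the sketch below gives its state-space form with state [vy, yaw rate] and front steering as input. The cornering-stiffness symbols Cf, Cr and all numeric parameter values are illustrative assumptions, not values from the slides.

```python
import numpy as np

def bicycle_model(vx, m=1500.0, Iz=2500.0, lf=1.2, lr=1.5, Cf=8.0e4, Cr=8.0e4):
    """Linear single-track ("bicycle") model with state x = [vy, r]
    (lateral velocity, yaw rate) and input u = front steering angle.
    Parameter values (mass m, yaw inertia Iz, cornering stiffnesses Cf, Cr,
    axle distances lf, lr) are illustrative, not taken from the slides."""
    A = np.array([
        [-(Cf + Cr) / (m * vx),           (lr * Cr - lf * Cf) / (m * vx) - vx],
        [(lr * Cr - lf * Cf) / (Iz * vx), -(lf**2 * Cf + lr**2 * Cr) / (Iz * vx)],
    ])
    B = np.array([[Cf / m],
                  [lf * Cf / Iz]])
    return A, B

A, B = bicycle_model(vx=30.0)   # ~30 m/s, roughly highway speed
```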
Vision for lateral control
Vision dynamics at the look-ahead (figure: look-ahead distance L, reference road radius Rref):
- yL - offset at the look-ahead
- e - angle at the look-ahead
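A minimal sketch of how the look-ahead measurements evolve, under one common small-angle formulation; the sign convention, the curvature symbol K_ref = 1/Rref, and the function name are assumptions, not taken from the slide.

```python
def lookahead_rates(y_L, eps, vy, yaw_rate, vx, L, K_ref):
    """Small-angle rate equations for the vision measurements, under one common
    convention (an assumption; the slides' own convention may differ):
      y_L   - offset of the look-ahead point from the lane center
      eps   - heading error (vehicle heading minus road tangent)
      K_ref - road curvature 1/Rref, entering as an external disturbance
    """
    eps_dot = yaw_rate - vx * K_ref        # heading error grows with yaw, shrinks as the road turns
    y_L_dot = vy + vx * eps + L * eps_dot  # CG lateral drift plus the heading error projected ahead by L
    return y_L_dot, eps_dot
```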
Control design motivation
Human driving performance:
- lane keeping with small errors (in heavy traffic), road following
- human reaction time (~1 Hz), vision delay of ~0.73 s
- preview information for high speeds (Land 96)
Control performance specification:
- desirable tracking error (~0.1 m); in extreme situations, a maximal allowable error of 0.4 m
- steady-state lateral acceleration (0.2 g) - see the check below
- passenger comfort (bandwidth limits, Hz)
- sensor noise considerations
- robustness with respect to the variety of road conditions and car parameters (mass, friction, cornering stiffness)
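As a quick arithmetic check of the 0.2 g steady-state specification against the experimental conditions quoted later (up to 100 mph on 1200 m curves), assuming the 1200 m figure is a radius of curvature:

```python
# Sanity check of the steady-state lateral-acceleration spec (0.2 g) against the
# Demo'97 conditions quoted later -- assuming "1200 m" is the radius of curvature.
g = 9.81                      # m/s^2
v = 100 * 0.44704             # 100 mph in m/s
R = 1200.0                    # assumed radius of curvature, m
a_lat = v**2 / R              # steady-state lateral acceleration on a constant-radius curve
print(f"a_lat = {a_lat:.2f} m/s^2 = {a_lat / g:.2f} g")   # ~0.17 g, within the 0.2 g spec
```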
Vision in lateral control loop
(Block diagram: road curvature disturbance acting on the Vehicle; the Vision System introduces a Delay; Pose estimation feeds the Controller, which closes the loop.)
- control and measurements at the look-ahead
- presence of the delay in vision processing
- performance specification - tracking error, maximum error, passenger comfort
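One simple way to make the vision-processing delay explicit in a sampled simulation of this loop is a fixed-length measurement buffer. This is only an illustration of the idea, not the delay model used in the project; the 100 Hz loop rate in the usage comment is assumed.

```python
from collections import deque

def delayed_measurements(measurements, delay_steps):
    """Feed the controller measurements that are `delay_steps` samples old,
    emulating a constant vision-processing delay in a sampled control loop."""
    buf = deque([measurements[0]] * delay_steps, maxlen=delay_steps + 1)
    delayed = []
    for m in measurements:
        buf.append(m)                 # newest vision result enters the pipeline
        delayed.append(buf[0])        # oldest entry is what the controller sees now
    return delayed

# e.g. a 30 ms vision delay in an assumed 100 Hz control loop is delay_steps = 3
```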
Controller and Observer
Controller design:
- look-ahead (10 m) guarantees stability in the presence of a 30 ms delay
- design for the highest intended speed, velocity scheduling
- lead-lag feedback control law (a generic sketch follows below)
- state feedback, feedback linearization
Observer design:
- estimation of the curvature of the road, adaptation for passenger comfort
- state augmented with curvature
- outputs from different sensors with different sampling rates (yaw-rate gyro)
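A generic sketch of the lead-lag control law with velocity scheduling mentioned above. The compensator structure is the textbook one; the gains, corner frequencies, and the toy gain schedule are assumptions, not the values designed for the project.

```python
import numpy as np

def lead_lag(k, z_lead, p_lead, z_lag, p_lag):
    """Continuous-time lead-lag compensator
        C(s) = k * (s + z_lead)/(s + p_lead) * (s + z_lag)/(s + p_lag)
    returned as (num, den) coefficients in descending powers of s.
    The lead section (z_lead < p_lead) adds phase margin to tolerate the vision delay;
    the lag section (z_lag > p_lag) boosts low-frequency gain for tracking."""
    num = k * np.polymul([1.0, z_lead], [1.0, z_lag])
    den = np.polymul([1.0, p_lead], [1.0, p_lag])
    return num, den

def scheduled_gain(vx, k0=1.0, v0=30.0):
    """Toy velocity schedule: reduce loop gain at higher speed (illustrative only)."""
    return k0 * v0 / max(vx, 1.0)

num, den = lead_lag(scheduled_gain(vx=35.0), z_lead=0.5, p_lead=5.0, z_lag=2.0, p_lag=0.2)
```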
Lane change
Feed-forward control law:
- tracking changes in curvature
- improved transient behavior in transitions between curves
(Block diagram: road curvature feeds a feed-forward term FF alongside the feedback controller C(s) acting on the vehicle V(s), with vision measurements closing the loop.)
Lane change maneuver:
- open-loop design (parametrized by speed); a toy profile is sketched below
- satisfies constraints on lateral acceleration
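The toy profile referenced above: a sinusoidal lateral-acceleration lane-change profile that moves the car one lane width while respecting an acceleration bound (at speed vx the maneuver then spans roughly vx*T metres longitudinally). The shape and numbers are illustrative assumptions; the project's open-loop profiles parametrized by speed may differ.

```python
import numpy as np

def lane_change_profile(lane_width=3.6, a_max=0.2 * 9.81, n=200):
    """Toy open-loop lane-change profile (an illustration, not the project's design):
    lateral acceleration a(t) = a_max * sin(2*pi*t/T) over one period T moves the car
    laterally by lane_width while keeping |a| <= a_max and ending with zero lateral
    velocity. Net displacement of this profile is a_max * T**2 / (2*pi), which fixes T."""
    T = np.sqrt(2 * np.pi * lane_width / a_max)      # maneuver duration from the accel bound
    t = np.linspace(0.0, T, n)
    a = a_max * np.sin(2 * np.pi * t / T)            # lateral acceleration command
    v = a_max * T / (2 * np.pi) * (1 - np.cos(2 * np.pi * t / T))                    # integral of a
    y = a_max * T / (2 * np.pi) * (t - T / (2 * np.pi) * np.sin(2 * np.pi * t / T))  # integral of v
    return t, a, v, y     # y[-1] ~= lane_width, v[-1] ~= 0
```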
Experiments - NAHSC Demo'97
Mojave desert (up to 100 mph, 1200 m road curvature)
(Plots: velocity, curvature estimates, lateral acceleration and offset from center, with and without feed-forward.)
Car following (with P. McLaughlan)
Pipeline:
1. Affine reconstruction
2. Motion computation
3. Triangulation
(Images: planar scene and 3D affine reconstruction; recovered quantities: angle and distance to the vehicle ahead.)
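For intuition about the "angle and distance" outputs, here is plain calibrated stereo triangulation of a tracked point on the lead vehicle. This is a simplified stand-in, not the affine-reconstruction pipeline on the slide; the focal length, baseline, and principal point are assumed calibration values.

```python
import math

def stereo_range_bearing(x_left, x_right, f_px, baseline_m, cx):
    """Distance and bearing to a tracked point on the lead vehicle from a rectified
    stereo pair. Plain calibrated triangulation, used here as a simplified stand-in;
    f_px (focal length in pixels), baseline_m and cx (principal point) are assumed."""
    disparity = x_left - x_right                 # pixels; > 0 for points in front of the rig
    Z = f_px * baseline_m / disparity            # depth along the optical axis
    X = (x_left - cx) * Z / f_px                 # lateral offset of the point
    distance = math.hypot(X, Z)
    bearing = math.atan2(X, Z)                   # angle to the vehicle ahead, radians
    return distance, bearing
```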
Scenario
(Diagram: vision sensing supports obstacle detection, lane detection and car detection; the steering and throttle actuators execute lane following, lane change, velocity tracking and car-ahead following.)
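A toy supervisory state machine suggesting how the behaviors in this scenario could be sequenced. The mode names and transition rules are hypothetical, chosen only to illustrate switching between the listed behaviors.

```python
from enum import Enum, auto

class Mode(Enum):
    LANE_FOLLOWING = auto()
    CAR_FOLLOWING = auto()
    LANE_CHANGE = auto()

def next_mode(mode, lane_ok, car_ahead, obstacle):
    """Hypothetical supervisory logic for the scenario slide."""
    if not lane_ok:
        return mode                      # keep the current behavior until the lane is re-acquired
    if obstacle:
        return Mode.LANE_CHANGE          # obstacle detection triggers a lane change
    if car_ahead:
        return Mode.CAR_FOLLOWING        # track the vehicle ahead (throttle/brake control)
    return Mode.LANE_FOLLOWING           # default: lane keeping with velocity tracking
```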
Results
- large look-ahead guarantees stability in the presence of delays
- lead-lag feedback control law satisfies the passenger comfort constraints
- feed-forward control law for lane-change maneuvers and for tracking high-curvature roads
- NAHSC Demo'97 (1200 rides)
Direct Visual Servoing
Direct visual servoing:
- process data in the sensor frame
- control in the sensor frame
Look-and-move:
- pose estimation, transform to the scene/agent frame
- control in the scene/agent frame
(Block diagram: the Vision System produces the error e(t); the Controller issues u(t) to the Agent with state x(t) and output y(t).)
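A schematic contrast of the two loops in code: direct visual servoing computes the control directly from the image-frame error, while look-and-move first reconstructs a pose error and controls in the scene/agent frame. The proportional laws, gains, and the estimate_pose callback are illustrative assumptions.

```python
import numpy as np

def direct_visual_servo_step(feature_err_px, gain=0.5):
    """Direct visual servoing: the control is computed straight from the error
    measured in the image (sensor) frame, with no explicit pose reconstruction.
    The proportional law and gain are illustrative, not a specific published controller."""
    return -gain * np.asarray(feature_err_px)

def look_and_move_step(feature_err_px, estimate_pose, gain=0.5):
    """Look-and-move: first estimate the pose error in the scene/agent frame from the
    image measurements (estimate_pose is a user-supplied estimator, hypothetical here),
    then control in that frame."""
    pose_err = estimate_pose(feature_err_px)
    return -gain * np.asarray(pose_err)
```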
Sensing and Control hierarchies
Vision and control:
- robust elementary strategies supported by solid control-theoretic analysis
- situation assessment / warning systems
- image-based control techniques in unstructured environments
- map building
Modeling and analysis of complex systems:
- (probabilistic) analysis of hybrid systems
- safety and reachability properties
- multi-agent coordination
- decision-making under uncertainty (learning)
Programming languages, simulation environments