Localization of a Self-Driving Vehicle by Extended Kalman Filtering with Fixed Gains

Jungsu Choi, Seokki Song, and Kyoungchul Kong*
Department of Mechanical Engineering, Sogang University, Seoul, Korea
{zz1zzy, songsk, kckong}@sogang.ac.kr

Abstract. In this paper, a sensor fusion method for self-driving vehicles is introduced. The proposed sensor system consists of a GPS; an IMU with three-axis accelerometers, three-axis gyroscopes, and three-axis magnetometers; an encoder at the wheels; and a potentiometer that measures the steering angle. For the complete estimation of the absolute position and orientation, the proposed method estimates the driving velocity using a kinematic Kalman filter, and then the orientation angle and absolute position are calculated by recursive least squares. The proposed method is verified by experimental results in this paper.

Keywords: Sensor fusion, Kalman filter, Random process

1 Introduction

In the field of automotive engineering, many researchers are focusing on the development of self-driving technologies. Many vehicle manufacturers, such as BMW, have actively investigated self-driving technology and started implementing it in commercialized vehicles [1]. For example, a self-parking system automatically controls the driving and steering actuators for safe parking [2], and an automatic braking system automatically reacts to an emergency situation [3]. Various key technologies are integrated into a self-driving system, e.g., driving motor control, real-time path planning, and so on. For precise and safe control of the vehicle's location, the localization and motion control of the vehicle are important issues. Most localization systems rely on a global positioning system (GPS). The GPS, however, is subject to errors due to climate conditions (e.g., cloud coverage and humidity) and structural obstacles (e.g., buildings and mountains). An inertial measurement unit (IMU) can be used to compensate for these GPS errors. The IMU contains accelerometers, magnetometers, and gyroscopes to measure the object's acceleration and tilt. The IMU, however, suffers from drift due to integration. Therefore, the vehicle's location should be calculated using a combination of multiple sensors.

* Corresponding author.

This paper verifies the sensor fusion algorithm presented in [4] with experimental results. An electric cart was driven on a road while data were obtained from the IMU, the encoder, the potentiometer, and the GPS, and the absolute position estimated by the proposed sensor fusion algorithm was verified by comparing it with the true absolute position.

2 Kinematic Analysis of a Vehicle

In general, the absolute position and orientation of an object in three-dimensional space have six degrees of freedom (DOFs) due to the three translational movements and the three rotations. A vehicle, however, can be analyzed with three DOFs, because its motion is constrained to a plane (i.e., the road). Figure 1a shows the kinematics of a vehicle on a plane. In the figure, $l_{\mathrm{vehicle}}$ represents the distance between the front wheels and the rear wheels, $v(k)$ is the travel velocity, $a(k)$ is the frontal acceleration, which is perpendicular to the shaft of the rear wheels, $\rho(k)$ is the radius of curvature, and $\theta(k)$ represents the steering angle. It is assumed that the tire friction is large enough that the wheels do not slip in any case.

Fig. 1: Kinematic analysis of a vehicle. (a) Two-dimensional plane motion of a vehicle. (b) Position change of the vehicle during one sample period.

Let the absolute position (i.e., the latitude and longitude) and orientation of a vehicle be $x$, $y$, and $\phi$, respectively. The orientation is defined as the angle of the vehicle with respect to the absolute coordinate system. If the steering angle, $\theta(k)$, is not zero, as shown in Fig. 1a, the vehicle follows a circular trajectory, where the radius of curvature is

$$\rho(k) = \frac{l_{\mathrm{vehicle}}}{\tan\theta(k)}. \qquad (1)$$

In the proposed method, the absolute position and orientation (i.e., $x$, $y$, and $\phi$) are updated (i.e., estimated) at every sampling instance. Figure 1b shows a case where the vehicle follows a circular trajectory with a constant steering angle and velocity. Given the current position and orientation (i.e., $x(k)$, $y(k)$, and $\phi(k)$), the radius of curvature can also be defined as

$$\rho(k) = \frac{v(k)\,T}{\alpha(k)}, \qquad (2)$$

where $v(k)$ is the travel velocity of the vehicle, $T$ is the sampling period, and $\alpha(k)$ is the change of orientation between the $k$th and $(k+1)$th locations. Namely, $\alpha(k)$ is the difference between $\phi(k+1)$ and $\phi(k)$, i.e.,

$$\phi(k+1) = \phi(k) + \alpha(k). \qquad (3)$$

Substitution of Eqs. (1) and (2) into (3) yields

$$\phi(k+1) = \phi(k) + \frac{v(k)\,T}{l_{\mathrm{vehicle}}}\tan\theta(k). \qquad (4)$$

Notice that the orientation can be updated from the current speed and the steering angle, as in Eq. (4). From the geometrical representation shown in Fig. 1b, the absolute position at the $(k+1)$th step can be defined as

$$p(k+1) = p(k) + R(k)\begin{bmatrix}\rho(k)\sin\alpha(k)\\ \rho(k)\bigl(1-\cos\alpha(k)\bigr)\end{bmatrix}, \qquad (5)$$

where

$$p(k) = \begin{bmatrix}x(k)\\ y(k)\end{bmatrix} \quad\text{and}\quad R(k) = \begin{bmatrix}\cos\phi(k) & -\sin\phi(k)\\ \sin\phi(k) & \cos\phi(k)\end{bmatrix}, \qquad (6)$$

and $\alpha(k)$ and $\phi(k)$ are as defined in Eqs. (3) and (4). The state space equations in Eqs. (4) and (5) are the foundations of the proposed localization algorithm.
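To make the motion model concrete, the following Python sketch implements one dead-reckoning step based on Eqs. (1) and (4)-(6). It is a minimal illustration under the paper's assumptions (planar motion, no wheel slip); the function name, the sampling period, and the numbers in the usage line are illustrative choices, not values taken from the paper (only the 2.15 m wheelbase is reported later, in Section 3).

```python
import numpy as np

L_VEHICLE = 2.15   # wheelbase [m]; the value reported for the cart in Section 3
T = 0.01           # sampling period [s]; assumed for illustration, not given in the paper

def pose_update(x, y, phi, v, theta, T=T, L=L_VEHICLE):
    """One dead-reckoning step of Eqs. (4)-(6).

    x, y  : absolute position at step k
    phi   : orientation at step k [rad]
    v     : travel velocity v(k) [m/s]
    theta : steering angle theta(k) [rad]
    Returns the pose (x, y, phi) at step k+1.
    """
    # Orientation increment over one sample, Eq. (4): alpha = (v*T/L) * tan(theta).
    alpha = v * T / L * np.tan(theta)
    phi_next = phi + alpha

    if abs(theta) > 1e-9:
        rho = L / np.tan(theta)                            # radius of curvature, Eq. (1)
        d_local = np.array([rho * np.sin(alpha),           # displacement along the arc,
                            rho * (1.0 - np.cos(alpha))])  # bracketed term of Eq. (5)
    else:
        # Straight-driving limit (theta -> 0): the arc degenerates to v*T forward.
        d_local = np.array([v * T, 0.0])

    # Rotate the local displacement into the absolute frame with R(k), Eq. (6).
    R = np.array([[np.cos(phi), -np.sin(phi)],
                  [np.sin(phi),  np.cos(phi)]])
    p_next = np.array([x, y]) + R @ d_local
    return p_next[0], p_next[1], phi_next

# Illustrative usage: cart at the origin heading along +x, 2 m/s, 5 degrees of steering.
print(pose_update(0.0, 0.0, 0.0, v=2.0, theta=np.radians(5.0)))
```

The zero-steering branch avoids the division by $\tan\theta(k)$ in Eq. (1), where the radius of curvature becomes infinite.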

3 Experimental Verification

In this section, we verify the sensor fusion algorithm proposed in [4] with experimental results. The experimental vehicle is shown in Fig. 2a. It is an electric cart powered by a battery. The cart provides an automatic mode and a manual mode. In the manual mode, the vehicle is driven with the steering wheel, accelerator pedal, and brake; in the automatic mode, the vehicle is controlled through LabVIEW and FPGA boards. The distance between the front wheels and the rear wheels, $l_{\mathrm{vehicle}}$, is 2.15 m, and the radius of the wheels, $R$, is 0.25 m. An encoder is installed on the rear wheel, and a potentiometer is installed on the shaft of the steering wheel. An IMU is placed


the measured frontal acceleration becomes negative, as shown in Fig. 3a. The proposed sensor fusion algorithm, however, does not consider the slope of the road, and thus an additional filter is needed for accurate localization. Figure 3b shows the estimated speed of the cart. The speed estimated from the encoder shows noise of about ±1 m/s. On the other hand, the speed estimated by the Kalman filter is free from this noise and is close to the mean of the encoder-based estimate. Therefore, the speed estimated by the Kalman filter can be regarded as the true speed of the cart.

Fig. 4: Steering angle and orientation. (a) Steering angle. (b) Estimate of the orientation.

Figure 4a shows the steering angle of the cart. The steering angle is positive when the front wheels turn to the right-hand side. Figure 4b shows the orientation of the cart. The orientation is the angle of the central axis of the cart with respect to the absolute longitudinal axis. In Fig. 2b, the initial direction of the cart is toward the south-west, which is about 3.5 rad from the longitudinal axis, as shown in Fig. 4b. Therefore, the result in Fig. 4b is consistent with Figs. 2b and 4a. Also, the true orientation and the estimated orientation show the same trend.

3.2 Estimated Absolute Position

Fig. 5: Vehicle trajectory and driving distance. (a) Vehicle trajectory. (b) Driving distance.

Figure 5a shows the absolute position of the cart. Although the absolute position estimated by the proposed sensor fusion algorithm shows a discrepancy from the true trajectory, the magnitude of the error is small. This error is caused by the GPS sensor and the slope of the road. The accuracy of the GPS sensor used in the experiment was remarkably low, which deteriorated the accuracy of the estimated absolute position significantly. Also, the update gain of the sensor fusion algorithm is inversely proportional to the accuracy of the GPS. Another cause of the error is the slope of the road. Since the length on the map shown in Fig. 2b is projected onto the two-dimensional plane, which does not account for the slope of the road, the estimated distance and the distance on the map show a discrepancy.
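The excerpt does not list the filter equations used for this position estimate, but the title of the paper and the remark above that the update gain is inversely proportional to the GPS accuracy suggest a constant-gain correction of the dead-reckoned pose whenever a GPS fix arrives. The sketch below shows one plausible form of such a correction; the gain values, the function name, and the numbers in the usage line are assumptions for illustration only, not values from the paper.

```python
import numpy as np

# Fixed correction gains (illustrative values only; the paper does not report them).
# A gain near 1 trusts the GPS, a gain near 0 trusts the kinematic prediction, so a
# low-accuracy GPS corresponds to a small gain, as noted in Section 3.2.
K_POS = 0.05   # gain applied to the GPS position residual
K_PHI = 0.02   # gain applied to the heading residual

def fixed_gain_correction(pose_pred, gps_xy, gps_heading=None):
    """Correct a dead-reckoned pose with a GPS fix using constant gains.

    pose_pred   : (x, y, phi) predicted by the kinematic model of Eqs. (4)-(6)
    gps_xy      : (x, y) measured by the GPS
    gps_heading : optional heading measurement [rad], e.g. derived from two GPS fixes
    Returns the corrected pose (x, y, phi).
    """
    x, y, phi = pose_pred

    # Position residual and fixed-gain update.
    x += K_POS * (gps_xy[0] - x)
    y += K_POS * (gps_xy[1] - y)

    # Heading update; wrap the residual into [-pi, pi) before applying the gain.
    if gps_heading is not None:
        r_phi = (gps_heading - phi + np.pi) % (2.0 * np.pi) - np.pi
        phi += K_PHI * r_phi

    return x, y, phi

# Illustrative usage: the prediction has drifted about 0.4 m from the GPS fix.
print(fixed_gain_correction((10.0, 5.0, 1.2), gps_xy=(10.3, 5.3), gps_heading=1.25))
```

Between GPS fixes, only the kinematic prediction of Section 2 would run; a correction of this form would be applied at the (typically slower) GPS rate.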

4 Summary

This paper verifies the sensor fusion algorithm of [4] with experimental results. The estimated absolute position was similar to the true absolute position. The proposed sensor fusion algorithm is expected to be applicable to self-driving vehicles.

Acknowledgments. This research was supported in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (2012R1A1A ), and in part by the Sogang Research Fund ( ).

References

1. BMWBLOG, bmw-makes-self-drive-car-with-active-cruise-control/
2. Cabrera-Cosetl, R.; Mora-Alvarez, M.Z.; Alejos-Palomares, R.; Gavrila, D.M.: Self-Parking System Based in a Fuzzy Logic Approach: International Conference on Electrical, Communications, and Computers, CONIELECOMP 2009 (2009)
3. Keller, C.G.; Thao Dang; Fritz, H.; Joos, A.; Rabe, C.; Gavrila, D.M.: Active Pedestrian Safety by Automatic Braking and Evasive Steering: IEEE Transactions on Intelligent Transportation Systems, vol. 12, no. 4 (2011)
4. Choi, J.; Kong, K.: Time-Varying Extended Kalman Filtering for Localization of a Self-Driving Vehicle: International Conference on Computer, Networks, Systems (Submitted)