TAXISAT PROJECT: Low-Cost GNSS and Computer Vision Based Data Fusion Solution for Driverless Vehicles
Marc POLLINA, pollina@m3systems.net

Outline
- Importance of ITS
- In-vehicle systems: Future Technologies
- System Architecture
- Results Analysis
- Conclusions

Importance of ITS
The global market for ITS technologies is estimated to grow to €50BN by 2020, and the automotive industry is one of the most innovative sectors. In-vehicle safety systems fall into three categories:
- Active: continuously monitor an aspect of the user, vehicle, environment or transport network, and either alert the user to potential danger or intervene in the driving task to avoid it.
- Passive: crash mitigation or minimisation technologies that enhance the safety of the driver or other road users by reducing the severity of a crash.
- Combined active and passive systems (CAPS): monitor the environment, vehicle or driver for potential danger and apply passive safety measures if a crash is deemed unavoidable.

GNSS Sensor in Urban Area
Example of test case (GUIDE Laboratory, Toulouse). Figure: blue = GNSS position, green = reference trajectory (PPK + high-grade IMU).

Future Technologies
Sensor fusion is essential: no single positioning sensor covers all requirements and constraints. The combination of computer vision, 3D maps and GNSS technologies is fostering new solutions, not only for driving assistance but also for unmanned vehicles. Historically, "integrated navigation" has typically meant the combination of two systems, such as GNSS and inertial navigation, or occasionally three, such as GNSS, odometry and map matching. Future integrated navigation systems, however, are likely to have many more components.

Future Technologies
- GNSS: new constellations and new frequencies. New GNSS satellite constellations, signals and the associated frequency diversity are stimulating innovations in user-equipment design, leading to improved positioning capabilities.
- 3D maps: city mapping. 3D city mapping has the potential to revolutionize positioning in challenging urban areas; adding height information to street maps can aid GNSS positioning for land-vehicle and pedestrian navigation.
- Computer vision: the intelligent camera. The major new navigation sensor of the next decade could well be the camera; visual odometry is a form of dead reckoning.
(Slide diagram: Position computed from GNSS, Computer Vision, 3D maps and other sensors.)

Architecture

Architecture: Traditional Sensors (cost/accuracy trade-off)
- Odometers for wheel speed and front-axle orientation
- Gyro: optical or MEMS

Architecture: Position Sensors (cost/accuracy trade-off)
- Trimble Bullet III compact antenna: low cost and good gain
- LEA-6T GPS/EGNOS receiver: accurate, reliable

Architecture: Computer Vision (cost/accuracy trade-off)
Point Grey Flea3 cameras, mounted as a stereo pair.
SLAM:
- Enhances the performance level compared to a usual INS
- Provides transversal displacements and estimates of velocity and orientation
- Matches a live map of the scene structure against each newly acquired image
Follow the Lane:
- Improves safety, reliability and the possibility of 24/7 operation
- Extra feature derived from ADAS that continuously assists the car's control loops

Architecture: EDAS Connection Module
- Local server: hosts the EDAS client software (connection to the EDAS server) and a filtering routine
- 3G communication: carries data between the local server and the vehicle (see the relay sketch below)
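
The slide above describes the local server as an EDAS client that relays correction data to the vehicle over 3G. The following is a minimal, illustrative sketch of that relay pattern, assuming an NTRIP-style stream is used on the EDAS side; the host names, port numbers, mount point and credentials are placeholders, not real EDAS access details, and the function names are hypothetical.

```python
# Minimal sketch of the "local server" role described above: connect to an
# NTRIP-style correction stream and forward the raw messages to the vehicle
# over a plain TCP link. Host, mount point, credentials and ports are
# placeholders, not real EDAS access details.
import base64
import socket

def open_ntrip_stream(host, port, mountpoint, user, password):
    """Open an NTRIP v1 stream and return the connected socket."""
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    request = (
        f"GET /{mountpoint} HTTP/1.0\r\n"
        f"User-Agent: NTRIP taxisat-sketch\r\n"
        f"Authorization: Basic {creds}\r\n\r\n"
    )
    sock = socket.create_connection((host, port), timeout=10)
    sock.sendall(request.encode())
    header = sock.recv(1024)
    if b"200" not in header.splitlines()[0]:
        raise ConnectionError(f"Caster refused the request: {header!r}")
    return sock

def forward_corrections(ntrip_sock, vehicle_host, vehicle_port):
    """Relay correction bytes from the caster to the vehicle (e.g. over 3G)."""
    with socket.create_connection((vehicle_host, vehicle_port)) as vehicle:
        while True:
            data = ntrip_sock.recv(4096)
            if not data:
                break
            # A real deployment would filter/decode messages here before relaying.
            vehicle.sendall(data)

# Example usage (placeholder endpoints):
# forward_corrections(open_ntrip_stream("caster.example.net", 2101,
#                                       "MOUNTPOINT", "user", "pass"),
#                     "vehicle.example.net", 5000)
```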

Architecture: Tight Hybridization Module
Composed of:
- An Inertial Navigation System (INS) that integrates the gyro/odometer data at 100 Hz
- A navigation filter that updates and corrects the INS with measurements from the Vision or GNSS modules whenever they are available and valid (see the sketch below)
Because three platforms are involved, time synchronisation of the measurements is required.
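
To make the propagate/update pattern concrete, here is a minimal sketch of such a navigation filter, not the TAXISAT implementation: a simple 2D model in which the INS state is propagated at 100 Hz from odometer speed and gyro yaw rate, and a Kalman-style correction is applied whenever a GNSS or vision position fix is available and passes a validity gate. All names and noise values are illustrative assumptions.

```python
# Minimal sketch of the propagate/update pattern of an INS + GNSS/vision
# navigation filter. Hypothetical 2D example: state = [east, north, heading],
# propagated from odometer speed and gyro yaw rate, corrected by position fixes.
import numpy as np

class NavFilter:
    def __init__(self):
        self.x = np.zeros(3)                    # [east, north, heading]
        self.P = np.eye(3) * 10.0               # state covariance
        self.Q = np.diag([0.05, 0.05, 0.01])    # process noise (per second)

    def propagate(self, speed, yaw_rate, dt):
        """INS mechanisation step from odometer speed and gyro yaw rate."""
        e, n, psi = self.x
        self.x = np.array([e + speed * dt * np.sin(psi),
                           n + speed * dt * np.cos(psi),
                           psi + yaw_rate * dt])
        F = np.array([[1, 0,  speed * dt * np.cos(psi)],
                      [0, 1, -speed * dt * np.sin(psi)],
                      [0, 0, 1]])
        self.P = F @ self.P @ F.T + self.Q * dt

    def update_position(self, meas, meas_cov):
        """Correct the INS with a GNSS or vision position fix (east, north)."""
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
        innov = meas - H @ self.x
        S = H @ self.P @ H.T + meas_cov
        # Validity gate: reject fixes with an implausible innovation
        # (chi-square test, 2 degrees of freedom, 99% threshold).
        if innov @ np.linalg.solve(S, innov) > 9.21:
            return False
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innov
        self.P = (np.eye(3) - K @ H) @ self.P
        return True

# Usage: propagate at 100 Hz, update whenever a valid fix arrives.
nav = NavFilter()
for _ in range(100):                            # one second of dead reckoning
    nav.propagate(speed=2.0, yaw_rate=0.0, dt=0.01)
nav.update_position(np.array([0.1, 2.05]), np.diag([1.0, 1.0]))
```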

Real-Time Geo-Positioning of Image Content
(Slide diagram.) In an a-priori unknown scenario with no geo-referenced information, the measured GNSS position and sensor-derived orientation give the camera/vehicle pose in real time. This pose maps real-world information (locations in latitude/longitude, real distances) onto the captured 2D image, so each pixel (x, y) can be related to a location (lat, lon) through a known relation and, with depth information, to a precise map. This enables future GIS hybridization, precise map building and usable predictive information for the control loops (see the projection sketch below).
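
As a concrete illustration of the pixel-to-location mapping sketched above, here is a minimal pinhole-camera projection example, assuming a flat-earth local ENU frame; the camera intrinsics, pose and coordinates are hypothetical values, not TAXISAT calibration data.

```python
# Minimal sketch of projecting a geo-referenced point into image pixels,
# assuming the camera/vehicle pose is known in a local ENU frame.
import numpy as np

def latlon_to_enu(lat, lon, lat0, lon0, R_earth=6_378_137.0):
    """Flat-earth approximation: (lat, lon) to local east/north metres."""
    d_lat = np.radians(lat - lat0)
    d_lon = np.radians(lon - lon0)
    east = d_lon * R_earth * np.cos(np.radians(lat0))
    north = d_lat * R_earth
    return np.array([east, north, 0.0])

def project_to_pixel(p_world, cam_pos, R_world_to_cam, K):
    """Project a 3D world point into pixel coordinates (pinhole model)."""
    p_cam = R_world_to_cam @ (p_world - cam_pos)
    if p_cam[2] <= 0:
        return None                              # point is behind the camera
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

# Hypothetical camera intrinsics and pose (1280x720 image, 1.5 m camera height).
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
cam_pos = np.array([0.0, 0.0, 1.5])
# Camera looking north: camera x = east, camera y = down, camera z = north.
R_world_to_cam = np.array([[1.0, 0.0, 0.0],
                           [0.0, 0.0, -1.0],
                           [0.0, 1.0, 0.0]])

# A geo-referenced point roughly 20 m north of the vehicle, at ground level.
p = latlon_to_enu(43.60018, 1.44400, 43.60000, 1.44400)
print(project_to_pixel(p, cam_pos, R_world_to_cam, K))
```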

Vision Sensor: Follow-the-Lane (FtL) Results
Estimated state:
- Tx: lateral translation in x
- Vx: linear velocity in x
- Wx: width of the lane
- dWx: rate of change of the lane width
Includes self-assessment and active control of light conditions (see the control-loop sketch below).
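
The sketch below illustrates, under stated assumptions, how the FtL outputs could feed the vehicle's lateral control loop: a simple proportional-derivative law on the lateral offset, with a plausibility check on the lane width acting as the self-assessment. Gains, limits and thresholds are illustrative; this is not the TAXISAT controller.

```python
# Minimal sketch of how the Follow-the-Lane outputs (Tx, Vx, Wx, dWx) could
# feed a lateral control loop. Gains and checks are illustrative assumptions.
def lane_keeping_command(Tx, Vx, Wx, dWx,
                         kp=0.8, kd=0.3, max_steer=0.35,
                         min_width=2.5, max_width=4.5):
    """Return a steering command (rad) from the lane-tracking state,
    or None if the self-assessment rejects the measurement."""
    # Self-assessment: an implausible lane width (or width changing too fast)
    # suggests a bad detection, e.g. under difficult light conditions.
    if not (min_width <= Wx <= max_width) or abs(dWx) > 1.0:
        return None
    # PD law on the lateral offset: steer back towards the lane centre.
    steer = -(kp * Tx + kd * Vx)
    return max(-max_steer, min(max_steer, steer))

print(lane_keeping_command(Tx=0.20, Vx=0.05, Wx=3.5, dWx=0.0))  # small correction
print(lane_keeping_command(Tx=0.20, Vx=0.05, Wx=9.0, dWx=0.0))  # rejected: implausible width
```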

Vision Sensor: SLAM
SLAM (Simultaneous Localization and Mapping) = visual odometry + mapping:
- Visual odometry: estimation of the ego-motion (6-DOF camera/vehicle pose) in real time (see the sketch below)
- Real-time generation of a 3D map of the scene
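
As an illustration of the visual-odometry step only (not the full SLAM pipeline with map building and re-use), the following sketch uses OpenCV to match ORB features between two consecutive frames and recover the relative camera motion; the intrinsic matrix and file names in the usage comment are assumptions.

```python
# Minimal frame-to-frame visual-odometry sketch with OpenCV: match features
# between consecutive frames and recover the relative camera motion.
import cv2
import numpy as np

def relative_pose(img_prev, img_curr, K):
    """Estimate the rotation R and (unit-scale) translation t between two frames."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    # Monocular VO recovers translation only up to scale; a stereo pair (as used
    # in the project) or the odometers provide the metric scale.
    return R, t

# Usage (hypothetical file names and intrinsics):
# K = np.array([[800., 0., 640.], [0., 800., 360.], [0., 0., 1.]])
# R, t = relative_pose(cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE),
#                      cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE), K)
```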

Evaluation
FtL evaluation: recorded video sequences totalling 337 minutes.
SLAM module: two-step evaluation
- In the laboratory, on a desktop computer, using the KITTI odometry evaluation dataset with ground truth (22 sequences of images recorded with a stereo pair of cameras embedded in a car)
- In San Sebastián, running predefined paths

Evaluation: Accuracy and Precision of the Odometry
- Maximum translation error: 0.29%
- Rotation error: 0.0122 deg/m
- Runtime: 9.0 ms
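
To clarify what these units mean, the following sketch computes segment-based drift metrics in the same form: translation error as a percentage of distance travelled and rotation error in degrees per metre. It is a simplified illustration, not the official KITTI evaluation script.

```python
# Simplified sketch of odometry drift metrics: average translation error
# (% of distance travelled) and rotation error (deg/m), computed from aligned
# estimated and ground-truth trajectories over fixed-length path segments.
import numpy as np

def drift_metrics(est_xy, gt_xy, est_yaw, gt_yaw, segment_m=100.0):
    """est_xy, gt_xy: (N, 2) positions; est_yaw, gt_yaw: (N,) headings in rad."""
    step = np.linalg.norm(np.diff(gt_xy, axis=0), axis=1)
    dists = np.concatenate([[0.0], np.cumsum(step)])   # distance along the path
    t_errs, r_errs = [], []
    for i in range(len(gt_xy)):
        # First index at least `segment_m` further along the ground-truth path.
        j = np.searchsorted(dists, dists[i] + segment_m)
        if j >= len(gt_xy):
            break
        length = dists[j] - dists[i]
        d_est = est_xy[j] - est_xy[i]
        d_gt = gt_xy[j] - gt_xy[i]
        t_errs.append(np.linalg.norm(d_est - d_gt) / length * 100.0)   # percent
        dyaw = (est_yaw[j] - est_yaw[i]) - (gt_yaw[j] - gt_yaw[i])
        dyaw = (dyaw + np.pi) % (2 * np.pi) - np.pi                    # wrap to [-pi, pi]
        r_errs.append(np.degrees(abs(dyaw)) / length)                  # deg/m
    return np.mean(t_errs), np.mean(r_errs)
```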

Conclusions
Computer vision is a key sensor for enabling autonomous driving:
- It enables autonomous or semi-autonomous driving even when the GNSS signal is unreliable or unavailable (e.g. indoors, in tunnels, under dense vegetation).
- The vehicle's position remains known even when no GNSS reception is available.
- Position precision and reliability improve considerably compared to GNSS-only solutions.
- Availability improves compared to GNSS-only solutions: SLAM works 24/7, while GNSS reception may be unreliable or unavailable for several minutes at a time.
- A map is created in real time, and all points of an image are geo-located in real time.

THANK YOU! Marc POLLINA pollina@m3systems.net