Uses for LWIR and LiDAR Sensor Fusion in Relation to sUAS Detection

Presentation transcript:

Uses for LWIR and LiDAR Sensor Fusion in Relation to sUAS Detection
Jonathan M. Buchholz, Iacopo Gentilini

Introduction
Rise of the small Unmanned Aerial System (sUAS)
NASA’s UAS Traffic Management (UTM)
Methods require sensing capabilities: detection, location, categorization
Long Wave InfraRed (LWIR)
Light Detection and Ranging (LiDAR)
Sensor fusion
(freeflysystems.com/alta-6/specs)

Methodology
Perform trade study of available sensors: performance, cost, weight
Choose processor: Raspberry Pi 3 (small, simple)
Arrange data capture method: GPIO trigger for LWIR, ROS packages for LiDAR, LiDAR and visible imagery for reference (see the capture sketch below)
Research sensor fusion methods: general fusion approaches, geometric transformations, pixel-level correlation
Develop sensor fusion algorithm
Design sensor mount: modeled in CATIA, 3D printed
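
The slide names a GPIO trigger on the Raspberry Pi 3 as the LWIR capture mechanism but does not show the wiring or code. The following is a minimal sketch of how such a trigger loop could look in Python with the RPi.GPIO library; the pin number, pulse width, and 30 FPS pacing are assumptions for illustration, not values from the presentation.

    import time
    import RPi.GPIO as GPIO

    TRIGGER_PIN = 18           # BCM pin wired to the camera trigger input (assumed)
    PULSE_WIDTH_S = 0.01       # trigger pulse width (assumed)
    FRAME_PERIOD_S = 1.0 / 30  # pace the trigger at the camera's 30 FPS rate

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TRIGGER_PIN, GPIO.OUT, initial=GPIO.LOW)

    try:
        while True:
            # Rising edge tells the LWIR camera to capture a frame
            GPIO.output(TRIGGER_PIN, GPIO.HIGH)
            time.sleep(PULSE_WIDTH_S)
            GPIO.output(TRIGGER_PIN, GPIO.LOW)
            time.sleep(FRAME_PERIOD_S - PULSE_WIDTH_S)
    finally:
        GPIO.cleanup()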

Sensors and Specifications
FLIR® Vue™ Pro LWIR camera (www.FlirMedia.com): 7.5 - 13.5 µm wavelength, 336x256 resolution, 35° HFOV, 27° VFOV, 30 FPS
Velodyne® Puck™ LITE LiDAR scanner (www.VelodyneLiDAR.com): 903 nm wavelength, 360° HFOV, 30° VFOV, 100 m effective range, ±3 cm range accuracy, ~300,000 points per second

Sensor Fusion Algorithm
Geometric transformations
Pixel-level color correlation
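
The presentation lists geometric transformations as the first stage of the fusion algorithm without showing the math. As one hedged sketch of what that stage could look like, the Python/numpy snippet below projects Velodyne points into the FLIR Vue Pro image plane with a pinhole model built from the published resolution and field of view; the identity extrinsics are placeholders that would in practice come from the mount geometry or a calibration step.

    import numpy as np

    # LWIR camera intrinsics derived from the published specs (336x256, 35 deg x 27 deg FOV)
    W, H = 336, 256
    fx = (W / 2) / np.tan(np.radians(35 / 2))
    fy = (H / 2) / np.tan(np.radians(27 / 2))
    K = np.array([[fx, 0.0, W / 2],
                  [0.0, fy, H / 2],
                  [0.0, 0.0, 1.0]])

    # Extrinsics (LiDAR frame -> camera frame): identity placeholders (assumed)
    R = np.eye(3)
    t = np.zeros(3)

    def project_lidar_to_lwir(points_xyz):
        """Project Nx3 LiDAR points into LWIR pixel coordinates with per-point depth."""
        cam = points_xyz @ R.T + t              # transform into the camera frame
        cam = cam[cam[:, 2] > 0]                # keep points in front of the camera
        uv = (K @ cam.T).T
        uv = uv[:, :2] / uv[:, 2:3]             # pinhole projection, normalize by depth
        in_view = (uv[:, 0] >= 0) & (uv[:, 0] < W) & (uv[:, 1] >= 0) & (uv[:, 1] < H)
        return uv[in_view], cam[in_view, 2]     # pixel coordinates and depths

The pixel-level correlation stage could then sample the LWIR intensity at the returned pixel coordinates and associate it with each point's depth.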

Conclusion
Sensor fusion advantages: LWIR contrast, LiDAR depth, independence from the visible spectrum
Machine Learning (ML) compatibility: fused data for training neural networks, recognition in addition to detection and localization
Use on sUAS: lightweight design
(Siewert, 2013)
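
To illustrate the ML-compatibility point, one simple way to feed fused data to a neural network is to stack the registered LWIR intensity and a LiDAR-derived depth map as channels of a single input tensor. This is a minimal sketch under that assumption; the presentation does not specify the fused data format.

    import numpy as np

    # Hypothetical fused frame at the LWIR resolution (336x256)
    lwir = np.zeros((256, 336), dtype=np.float32)   # normalized thermal intensity
    depth = np.zeros((256, 336), dtype=np.float32)  # LiDAR depth map, 0 where no return

    # Two-channel input a detection/recognition network could be trained on
    fused = np.stack([lwir, depth], axis=-1)        # shape (256, 336, 2)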

Challenges
Mostly software-based: Linux environment, ROS architecture, Bash scripting, Python
(www.ROS.org) (www.Python.org)
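
As a concrete example of the ROS-architecture work, the sketch below subscribes to the LiDAR and LWIR data streams from a Python (rospy) node. The topic names are assumptions: the standard Velodyne driver publishes /velodyne_points, while the LWIR image topic shown here is a placeholder for however the camera frames are brought into ROS.

    import rospy
    from sensor_msgs.msg import Image, PointCloud2

    def on_cloud(msg):
        # One Velodyne sweep; header.stamp is used later for frame pairing
        rospy.loginfo("LiDAR sweep: %d bytes at %s", len(msg.data), msg.header.stamp)

    def on_image(msg):
        rospy.loginfo("LWIR frame: %dx%d at %s", msg.width, msg.height, msg.header.stamp)

    rospy.init_node("lwir_lidar_capture")
    rospy.Subscriber("/velodyne_points", PointCloud2, on_cloud)  # standard Velodyne driver topic
    rospy.Subscriber("/lwir/image_raw", Image, on_image)         # placeholder LWIR topic (assumed)
    rospy.spin()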

Future Work
Prepare automated data capture: Python & Bash
Adapt fusion algorithm to RTOS: guaranteed data sync, C/C++ conversion
Refinement of algorithm
Capture data from sUAS platform: ALTA 6, preliminary flight data
Compile data for ML functionality
Explore Detect and Avoid (DAA) methods
Publish data sets
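
Until the fusion pipeline moves to an RTOS and C/C++, data sync between the two sensors has to be handled in software. A minimal Python sketch of one common approach, pairing each LWIR frame with the nearest LiDAR sweep by timestamp, is shown below; the 10 ms tolerance is an assumption, not a figure from the presentation.

    def pair_by_timestamp(lwir_stamps, lidar_stamps, tolerance=0.010):
        """Pair each LWIR frame time with the nearest LiDAR sweep time (seconds)."""
        pairs = []
        for t_img in lwir_stamps:
            t_pc = min(lidar_stamps, key=lambda t: abs(t - t_img))
            if abs(t_pc - t_img) <= tolerance:   # drop frames with no close sweep
                pairs.append((t_img, t_pc))
        return pairs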

The Larger Picture
Drone Net: ERAU Prescott & CU Boulder, graduate and undergraduate students
Detection, localization, and classification of sUAS in Class G airspace
Challenging the effectiveness of RADAR: reliability, cost
(Siewert, 2018)

Acknowledgements
Dr. Iacopo Gentilini, Dr. Sam Siewert, Dr. Anne Boettcher, Dr. Jonathan Gallimore, Jim Weber

Questions? Please contact: Jonathan M. Buchholz, buchholj@my.erau.edu

Thank You