UFOAnalyzerV2 (UA2): the key to accuracy (April 2010)

Presentation transcript:

UFOAnalyzerV2 (UA2): the key to accuracy

UA2 inputs video clip files and outputs meteor trajectories. UA2 performs the following steps for each clip, almost automatically:
1) Reference star recognition → magnitude scale, precise direction of the camera
2) Pre-event video averaging → background brightness of each pixel
3) Target recognition on each de-interlaced field
4) Trajectory pole computation and trajectory correction
5) Classification of the object

Other UA2 functions
- Profile making: FOV plate adjustment with the parameter optimizer
- Clip file management: view (movie, frame, field), delete, move/copy
- Plot: trail map, ground map
- Utilities: time adjustment, video file trimming, …

Note: Upward compatibility of the analysis has been kept since 2004, so anyone can get better results by re-analyzing old clips with the latest version.

My treasure! Six complete years of records from Tokyo1.

How UA2 processes video images
- De-interlace analysis
- Background subtraction
- FOV plate constants
- Center of an object, Light-sum
- CCD spectral sensitivity compensation
- Magnitude
- Trajectory plane computation

De-interlace analysis
UA2 performs de-interlace analysis. An NTSC or PAL video frame contains two fields exposed at different times, so each field should be analyzed individually.

Merits of de-interlaced analysis:
- Doubles the time resolution (number of samples) -- accuracy is increased, especially for short meteors (the majority of observations).
- The shape of the luminescence becomes simpler -- accuracy of the center position is increased.

Note: De-interlace analysis does not directly reduce the vertical resolution. The effect is complex and depends on the size and speed of the object.

(Figure: an interlaced frame and the two de-interlaced fields; deleted lines are interpolated, for display only.)
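The field splitting can be illustrated with a short sketch (illustrative only, not UA2's actual code); it assumes a grayscale frame stored as a NumPy array and fills the missing lines of each field by averaging the neighbouring lines, as the slide describes for display.

```python
import numpy as np

def deinterlace(frame):
    """Split an interlaced frame (H x W grayscale array) into two fields.

    The even and odd lines were exposed at different times (about 1/60 s
    apart for NTSC, 1/50 s for PAL), so each field is treated as a separate
    sample. Lines belonging to the other field are filled by averaging their
    neighbours; edge lines are left unchanged.
    """
    f = frame.astype(float)
    even, odd = f.copy(), f.copy()
    even[1:-1:2] = 0.5 * (f[0:-2:2] + f[2::2])   # interpolate odd lines in the even field
    odd[2:-1:2] = 0.5 * (f[1:-2:2] + f[3::2])    # interpolate even lines in the odd field
    return even, odd
```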

Background subtraction
UA2 subtracts the background brightness of each pixel, using the average brightness of the adjacent video frames. (UFOCaptureV2 records at least 10 frames of video before the event.) The incremental brightness is used for center determination and Light-sum computation.
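A minimal sketch of this step (again illustrative, not UA2's code), assuming the pre-event frames and event fields are available as NumPy arrays:

```python
import numpy as np

def incremental_brightness(event_fields, pre_event_fields):
    """Per-pixel background subtraction.

    pre_event_fields : fields recorded before the event (UFOCaptureV2 keeps
                       at least 10 pre-event frames) -- averaged into the
                       background image.
    event_fields     : fields containing the meteor.
    Returns the incremental brightness used for center determination and
    Light-sum computation; negative values are clipped to zero.
    """
    background = np.stack(pre_event_fields).astype(float).mean(axis=0)
    return [np.clip(f.astype(float) - background, 0.0, None) for f in event_fields]
```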

FOV plate constants (compensation of lens distortion)
UA2 uses a 4th-order polynomial for the compensation of lens distortion:

    R = k4*r^4 + k3*r^3 + k2*r^2 + (1.0 - k4 - k3 - k2)*r

where R is the corrected distance from the optical center on the aspect-ratio-corrected plane, and r is the catalog-based computed distance from the optical center. Both R and r are normalized to 1.0 at the horizontal edge.

The parameter optimizer determines k2, k3, k4, the FOV size, the optical center offset, the pixel aspect ratio, az, ev, and rot all at once.
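The polynomial is simple enough to write down directly; a small sketch, assuming the radii are already normalized as described above:

```python
def corrected_radius(r, k2, k3, k4):
    """4th-order radial correction used for the FOV plate constants.

    r : catalog-based distance from the optical center, normalized to 1.0
        at the horizontal edge.
    Returns R, the corrected distance on the aspect-ratio-corrected plane.
    The linear coefficient (1 - k4 - k3 - k2) guarantees R(1.0) == 1.0,
    so the normalization at the horizontal edge is preserved.
    """
    return k4 * r**4 + k3 * r**3 + k2 * r**2 + (1.0 - k4 - k3 - k2) * r
```

In practice k2, k3, k4 are not fitted in isolation; as the slide says, the optimizer adjusts them together with the FOV size, optical-center offset, pixel aspect ratio, az, ev, and rot.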

Typical position determination accuracy = 0.3 pixel, for most lenses.

Comparison with an exponential equation

Center of the object, Light-sum

Procedure for a moving object:
1. Set a minimum rectangle that covers the Hit (brightness-changed) pixels.
2. Compute the weighted center position of the incremental brightness, using the background image.
3. The sum of the incremental brightness becomes the Light-sum of the object.

Procedure for stars:
1. Compute the background level as the average brightness of the donut-shaped field between R and 1.5R (default R = 5 pixels).
2. Compute the weighted center of the incremental brightness above that level.
3. The sum of the incremental brightness becomes the Light-sum of the star.
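A sketch of the weighted-center and Light-sum computation for a moving object (illustrative; the boolean mask standing in for the "minimum rectangle of Hit pixels" is an assumption of this example):

```python
import numpy as np

def centroid_and_lightsum(inc, mask):
    """Weighted center and Light-sum of an object.

    inc  : 2-D array of incremental (background-subtracted) brightness.
    mask : boolean array selecting the pixels attributed to the object,
           e.g. the minimum rectangle covering the Hit pixels.
    """
    ys, xs = np.nonzero(mask)
    w = inc[ys, xs]
    light_sum = w.sum()
    cx = (xs * w).sum() / light_sum   # brightness-weighted x position
    cy = (ys * w).sum() / light_sum   # brightness-weighted y position
    return (cx, cy), light_sum
```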

CCD spectral sensitivity compensation (cataloged magnitude compensation for the CCD)
To avoid the influence of the color of the reference stars in the FOV, UA2 converts each star's cataloged visual magnitude (Mv) to a color-weighted magnitude (Mc) according to the CCD spectral sensitivity:

    Mc = Mv + Bvf * (B - V)

Typical error of star magnitude: 0.5 mag.
(Plot: Mc vs. Light-sum of star, for Bvf = -1.0 (2V-B), Bvf = 0.0 (V mag), Bvf = 0.28 (WAT-100N), Bvf = 1.0 (B mag).)
The best values: WAT-100N: 0.28, WAT-902H2U: 0.30.
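The conversion itself is a one-line formula; a hedged sketch using the Bvf values quoted on the slide:

```python
def color_weighted_magnitude(Mv, B_minus_V, Bvf=0.28):
    """Convert a cataloged visual magnitude Mv to the CCD's color system.

    Bvf = 0.28 is the value quoted for the WAT-100N (0.30 for the WAT-902H2U);
    Bvf = 0.0 would leave the V magnitude unchanged.
    """
    return Mv + Bvf * B_minus_V
```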

Magnitude
For every clip that contains more than 5 reference stars, p and q are determined by the least-squares method:

    Mc = p*log(L) + q     (L is the Light-sum of the star)

p comes out to -2.5 naturally (as is the theoretical value). The object's magnitude is then computed from this p, q, and its Light-sum.

(Note: The color of the object itself is not compensated. That can only be done with a spectral camera.)
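A sketch of the least-squares fit (assuming NumPy; the Mc values come from the previous step and L from the star Light-sums):

```python
import numpy as np

def fit_magnitude_scale(light_sums, mc):
    """Fit Mc = p*log10(L) + q over the reference stars of one clip.

    Returns (p, q); p should come out close to the theoretical -2.5.
    """
    logL = np.log10(np.asarray(light_sums, dtype=float))
    p, q = np.polyfit(logL, np.asarray(mc, dtype=float), 1)
    return p, q

def object_magnitude(light_sum, p, q):
    """Magnitude of the meteor from its Light-sum and the fitted p, q."""
    return p * np.log10(light_sum) + q
```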

Trajectory plane determination
UA2 determines the trajectory plane by least-squares pole determination.

Note: The begin and end vectors are then adjusted to lie precisely on this plane.

Pole determination by the least-squares method: the pole vector is obtained by solving a system of three simultaneous equations in three variables.
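One standard least-squares formulation of pole determination (a sketch under that assumption; UA2's exact equations may differ) minimizes the sum of squared dot products between a unit pole vector and the observed direction vectors, which reduces to a 3x3 eigenvalue problem:

```python
import numpy as np

def trajectory_pole(directions):
    """Least-squares pole (normal) of the trajectory plane.

    directions : (N, 3) array of unit vectors toward the meteor, one per field.
    Minimizes sum((n . u_i)^2) with |n| = 1; the solution is the eigenvector
    of M = sum(u_i u_i^T) belonging to the smallest eigenvalue.
    """
    u = np.asarray(directions, dtype=float)
    M = u.T @ u                            # 3x3 moment matrix
    eigvals, eigvecs = np.linalg.eigh(M)   # eigenvalues in ascending order
    return eigvecs[:, 0]                   # unit pole vector

def project_onto_plane(v, pole):
    """Adjust a begin/end vector to lie precisely on the fitted plane."""
    v = np.asarray(v, dtype=float)
    w = v - np.dot(v, pole) * pole
    return w / np.linalg.norm(w)
```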

Remaining problems
- Correct center determination for meteors with irregular shapes:
  - big fireballs that cause white-out of the FOV
  - breakup bodies / afterglow / asymmetrical shapes
  - luminescence partly covered by clouds or by the edge of the FOV
  - trajectories broken into dashes by clouds
- Magnitude measurement without reference stars
- Where is the body? How do we find it?

Summary / Comment
The current UFOAnalyzerV2 uses many new methods to achieve automated high-accuracy measurement, but it still has many problems. Future versions may well do better on past records.

The largest error source in current video observations is the determination of the center of the object; 0.3 pixel is the current limit. (For short meteors it causes up to a few degrees of pole-direction error even with a 60-degree FOV lens; for this reason an all-sky lens is never recommended.)

Velocity measurement is especially difficult: the differential operation is very sensitive to errors, and meteors often have unusual shapes such as a tail, afterglow, or explosions. These errors can be compensated by statistical post-processing. If the measurement of angular velocity becomes precise enough, radiant estimation from a single observation will become possible.