UFOAnalyzerV2 (UA2): the key to accuracy (April 2010)

Presentation transcript:

UFOAnalyzerV2 (UA2): the key to accuracy
UA2 takes video clip files as input and outputs meteor trajectories. For each clip it performs the following steps, almost automatically:
1) Reference star recognition -> magnitude scale and precise camera pointing
2) Pre-event video averaging -> background brightness of each pixel
3) Target recognition on each de-interlaced field
4) Trajectory pole computation and trajectory correction
5) Classification of the object

Other UA2 functions
- Profile making: FOV plate adjustment with the parameter optimizer
- Clip file management: view (movie, frame, field), delete, move/copy
- Plot: trail map, ground map
- Utilities: time adjustment, video file trimming, ...
Note: Analysis results remain upward compatible; anyone can get a better result by re-analyzing old clips with the latest version.
My treasure! Complete 6-year records of the Tokyo1 station.

How UA2 processes video images
- De-interlace analysis
- Background subtraction
- FOV plate constants
- Center of an object, light-sum
- CCD spectral sensitivity compensation
- Magnitude
- Trajectory plane computation

De-interlace analysis
UA2 analyzes video on de-interlaced fields. An NTSC or PAL frame contains two images (fields) taken at different times, so each field should be analyzed individually. Merits of de-interlaced analysis (a sketch of the field split follows below):
- Doubles the time resolution (twice as many samples), which improves accuracy especially for short meteors, the majority of observations.
- The shape of the luminescence becomes simpler, which improves the accuracy of the center position.
Note: De-interlace analysis does not directly reduce the vertical resolution; the effect is complex and depends on the size and speed of the object.
(Slide figure: an interlaced frame and the two de-interlaced fields; deleted lines are interpolated for display only.)
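This is not UA2's actual code, only a minimal NumPy sketch (assuming a grayscale frame as an H x W array) of splitting an interlaced frame into two fields and interpolating the deleted lines for display:

import numpy as np

def deinterlace(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split an interlaced frame (H x W) into two full-height fields.

    Field A keeps the even scan lines and field B the odd ones; the missing
    lines of each field are rebuilt by averaging the neighbouring kept lines
    (interpolation for display, as on the slide).
    """
    h = frame.shape[0]
    f = frame.astype(np.float32)
    field_a, field_b = f.copy(), f.copy()
    for row in range(h):
        neighbours = [r for r in (row - 1, row + 1) if 0 <= r < h]
        interp = f[neighbours].mean(axis=0)
        if row % 2 == 1:      # odd line: does not belong to field A
            field_a[row] = interp
        else:                 # even line: does not belong to field B
            field_b[row] = interp
    return field_a, field_b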

Background subtraction
UA2 subtracts the background brightness from each pixel, using the average brightness of the adjacent video frames. (UFOCaptureV2 records at least 10 frames of video before the event.) The incremental brightness is then used for center determination and light-sum computation; a sketch follows below.
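A minimal sketch of this step, assuming the pre-event frames are available as a NumPy stack; the function name is illustrative, not UA2's:

import numpy as np

def incremental_brightness(frame: np.ndarray, pre_event: np.ndarray) -> np.ndarray:
    """Per-pixel background subtraction (sketch).

    pre_event is a stack of frames recorded before the event
    (shape N x H x W, e.g. the >= 10 frames kept by UFOCaptureV2);
    its per-pixel mean is used as the background estimate.
    """
    background = pre_event.astype(np.float32).mean(axis=0)
    # Only the brightness increase above the background matters for the
    # centroid and the light-sum; clip negative residuals to zero.
    return np.clip(frame.astype(np.float32) - background, 0.0, None)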

FOV plate constants (compensation of lens distortion)
UA2 uses a 4th-order polynomial to compensate for lens distortion:

R = k4*r^4 + k3*r^3 + k2*r^2 + (1.0 - k4 - k3 - k2)*r

where
- R is the corrected distance from the optical center on the aspect-ratio-corrected plane,
- r is the catalog-based computed distance from the optical center,
- both R and r are normalized to 1.0 at the horizontal edge of the frame.
The parameter optimizer determines k2, k3, k4, the FOV size, the optical-center offset, the pixel aspect ratio, and the camera pointing (az, ev, rot) all at once. A numerical sketch of the radial model follows below.
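The radial model is simple enough to state directly in code; the coefficient values in the example call are made up, not taken from any real lens profile:

def corrected_radius(r: float, k2: float, k3: float, k4: float) -> float:
    """Radial distortion model from the slide: maps the catalog-based
    normalized radius r to the corrected radius R, with R(1.0) == 1.0."""
    return k4 * r**4 + k3 * r**3 + k2 * r**2 + (1.0 - k4 - k3 - k2) * r

# Example with illustrative coefficients (the real values come from UA2's
# parameter optimizer for a particular camera/lens profile):
print(corrected_radius(0.5, k2=0.02, k3=-0.01, k4=0.005))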

Typical position determination accuracy: 0.3 pixel for most lenses.

Comparison with an exponential equation.

Center of the object, light-sum
Procedure for a moving object (a sketch follows below):
1. Set a minimum rectangle that covers the hit (brightness-changed) pixels.
2. Compute the brightness-weighted center of the incremental brightness, using the background image.
3. The sum of the incremental brightness becomes the light-sum of the object.
Procedure for stars:
1. Compute the background level as the average brightness of the annulus between R and 1.5R (default R = 5 pixels).
2. Compute the weighted center of the incremental brightness above that level.
3. The sum of the incremental brightness becomes the light-sum of the star.
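A minimal sketch of the moving-object measurement, assuming the incremental (background-subtracted) brightness inside the hit rectangle is already available as a NumPy array with at least one non-zero pixel:

import numpy as np

def centroid_and_light_sum(inc: np.ndarray) -> tuple[float, float, float]:
    """Return (x_center, y_center, light_sum) of the incremental brightness
    inside the rectangle covering the hit pixels (sketch, not UA2's code)."""
    light_sum = float(inc.sum())
    ys, xs = np.indices(inc.shape)
    x_c = float((xs * inc).sum() / light_sum)   # brightness-weighted centroid
    y_c = float((ys * inc).sum() / light_sum)
    return x_c, y_c, light_sum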

CCD spectral sensitivity compensation (compensation of cataloged magnitudes for the CCD)
To avoid the influence of the color of the reference stars in the FOV, UA2 converts each star's cataloged visual magnitude (Mv) to a color-weighted magnitude (Mc) according to the CCD's spectral sensitivity:

Mc = Mv + Bvf * (B - V)

Typical error of star magnitude: 0.5 mag.
(Slide figure: Mc versus light-sum of stars for Bvf = -1.0 (2V-B), Bvf = 0.0 (V mag), Bvf = 1.0 (B mag), and the best-fit Bvf = 0.28 for the WAT-100N.)
Best values: WAT-100N: 0.28, WAT-902H2U: 0.30.
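The conversion is a one-liner; the only UA2-specific number below is the Bvf value quoted on the slide for the WAT-100N:

def color_weighted_magnitude(mv: float, b_minus_v: float, bvf: float = 0.28) -> float:
    """Mc = Mv + Bvf * (B - V): the slide's compensation of a star's cataloged
    visual magnitude Mv for the CCD's spectral sensitivity.
    Bvf = 0.28 is the value quoted for the WAT-100N."""
    return mv + bvf * b_minus_v

# A star with Mv = 8.0 and B - V = 1.5 gets Mc = 8.0 + 0.28 * 1.5 = 8.42.
print(color_weighted_magnitude(8.0, 1.5))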

Magnitude
For every clip that contains more than 5 reference stars, p and q are determined by the least-squares method from

Mc = p*log(L) + q, where L is the light-sum of a star.

p naturally comes out near -2.5, the theoretical value. The object's magnitude is then computed from this p, q, and its own light-sum; a sketch follows below. (Note: the color of the object itself is not compensated; that can be done only with a spectral camera.)
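A sketch of the fit using NumPy's polyfit; UA2's own least-squares implementation may differ:

import numpy as np

def fit_magnitude_scale(light_sums, mags_mc):
    """Least-squares fit of Mc = p*log10(L) + q to the reference stars of a
    clip. Returns (p, q); p should come out near the theoretical -2.5."""
    x = np.log10(np.asarray(light_sums, dtype=float))
    p, q = np.polyfit(x, np.asarray(mags_mc, dtype=float), deg=1)
    return p, q

def object_magnitude(light_sum, p, q):
    """Apply the fitted scale to the object's own light-sum."""
    return p * np.log10(light_sum) + q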

Trajectory plane determination
UA2 determines the trajectory plane by least-squares pole determination: the pole vector (the normal of the plane through the observed line-of-sight directions) is obtained by solving a set of three simultaneous equations in three variables. (Slide figure: the three simultaneous equations; a sketch follows below.)
Note: the begin and end vectors are then adjusted to lie precisely on this plane.
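One common least-squares formulation of the pole, not necessarily the exact equations on the slide: the pole is the unit vector that minimizes the sum of squared dot products with the observed line-of-sight directions, i.e. the smallest-eigenvalue eigenvector of the 3x3 normal matrix.

import numpy as np

def trajectory_pole(directions: np.ndarray) -> np.ndarray:
    """Least-squares pole of the trajectory plane (sketch).

    directions is an N x 3 array of unit line-of-sight vectors toward the
    meteor on successive fields. The pole n minimizes sum_i (n . v_i)^2
    with |n| = 1, i.e. it is the eigenvector of M = sum_i v_i v_i^T with
    the smallest eigenvalue.
    """
    v = np.asarray(directions, dtype=float)
    m = v.T @ v                      # 3 x 3 normal matrix
    eigvals, eigvecs = np.linalg.eigh(m)   # eigenvalues in ascending order
    return eigvecs[:, 0]             # eigenvector of the smallest eigenvalue

The begin and end vectors can then be projected onto this plane by subtracting their component along the pole and renormalizing, which corresponds to the slide's note about adjusting them to lie precisely on the plane.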

Remaining problems
- Correct center determination for meteors with irregular shapes: where is the body, and how do we find it?
  - Big fireballs that white out the FOV
  - Break-up bodies / afterglow / asymmetrical shapes
  - Luminescence partly covered by clouds or by the edge of the FOV
  - Trajectories broken into dashes by clouds
- Magnitude measurement without reference stars

Summary / Comment
The current UFOAnalyzerV2 uses many new methods to achieve automated high-accuracy measurement, but it still has many problems; future versions may well do better on past records. The largest error source in current video observations is the determination of the center of the object: 0.3 pixel is the current limit, which causes up to a few degrees of pole-direction error for short meteors even with a 60-degree FOV lens (for this reason, an all-sky lens is never recommended). Velocity measurement is especially difficult: the differential operation is very sensitive to errors, and meteors often have unusual shapes such as tails, afterglow, or explosions. These errors can be compensated by statistical post-processing. If the measurement of angular velocity becomes precise enough, radiant estimation from a single-station observation will become possible.