UAV CINEMATOGRAPHY CONSTRAINTS IMPOSED BY VISUAL TARGET TRACKING


UAV CINEMATOGRAPHY CONSTRAINTS IMPOSED BY VISUAL TARGET TRACKING
Iason Karakostas, Ioannis Mademlis, Nikos Nikolaidis, Ioannis Pitas
Artificial Intelligence and Information Analysis, Dpt. of Computer Science
Aristotle University of Thessaloniki, Greece

UAV in Cinematography

- UAVs have revolutionized aerial cinematography and can replace helicopters, cranes, etc.
- UAVs support autonomous functionalities based on machine learning and computer vision.
- Autonomous UAVs may visually track and actively follow a specific target of interest.
- We present common UAV target-tracking trajectories and shot types, and study the constraints on maximum focal length for successful target tracking.

UAV in Cinematography

- We standardized and geometrically modeled a number of common, target-following UAV motion types.
- We identified the compatible shot types.
- We analytically determined, for each UAV motion type, the maximum permissible camera focal length so that 2D visual tracking does not get lost.

UAV/Camera Shot Types

The desired shot type is defined by the percentage of the video frame that is covered by the target Region Of Interest (ROI):

Shot Type               | ROI coverage
Extreme Long Shot (ELS) | < 5%
Very Long Shot (VLS)    | 5% – 20%
Long Shot (LS)          | 20% – 40%
Medium Shot (MS)        | 40% – 60%
Medium Close-Up (MCU)   | 60% – 75%
Close-Up (CU)           | > 75%
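For illustration, a minimal sketch (function and label names are mine, not from the paper) mapping a measured ROI coverage fraction to the shot-type labels in the table above:

```python
def shot_type(roi_coverage: float) -> str:
    """Map target ROI coverage (ROI area / frame area, in [0, 1]) to a shot type."""
    if roi_coverage < 0.05:
        return "ELS"  # Extreme Long Shot
    if roi_coverage < 0.20:
        return "VLS"  # Very Long Shot
    if roi_coverage < 0.40:
        return "LS"   # Long Shot
    if roi_coverage < 0.60:
        return "MS"   # Medium Shot
    if roi_coverage < 0.75:
        return "MCU"  # Medium Close-Up
    return "CU"       # Close-Up
```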

UAV/Camera Motion Types

Five industry-standard UAV camera motion types are detailed and geometrically modeled:

- Lateral Tracking Shot (LTS)
- Vertical Tracking Shot (VTS)
- Fly-Over (FLYOVER)
- Fly-By (FLYBY)
- Chase/Follow Shot (CHASE)

UAV/Camera Motion Types

[Figure: schematic trajectories of the five motion types: LTS, VTS, FLYOVER, FLYBY, CHASE]

Constraints On Maximum Focal Length

- 2D visual tracking algorithms assume that the location of the target ROI center varies by no more than a threshold $R_{max}$ (in pixels) between successive video frames.
- The focal length $f$ affects the permissible shot framing types.
- For UAV cinematography, it is therefore important to determine the constraints on maximum focal length imposed by the needs of visual target tracking.
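As a concrete illustration of this constraint (a sketch with hypothetical names, not the paper's code), a tracker-side check that the ROI center displacement between consecutive frames stays within $R_{max}$:

```python
import math

def tracking_holds(center_prev, center_next, r_max):
    """True if the ROI center moved at most r_max pixels between frames."""
    dx = center_next[0] - center_prev[0]
    dy = center_next[1] - center_prev[1]
    return math.hypot(dx, dy) <= r_max
```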

Constraints On Maximum Focal Length

- In a fully known 3D environment, if the target moves exactly as expected, its next 3D location can be predicted.
- Thus, in this ideal scenario, central composition can always be retained by computing the appropriate LookAt vector at each time instance.
- If the target's motion deviates from the expected one, its ROI may be displaced by more than $R_{max}$ in the next video frame.

Constraints On Maximum Focal Length

- We examine an entire shooting session as a sequence of repeated transitions between the "first" ($t = 0$) and the "second" video frame ($t + 1 = 1$).
- The target ROI center is meant to stay fixed at the image center for all video frames, assuming accurate target velocity vector estimation at all times.
- To study $f_{max}$ we assume a maximum search radius $R_{max}$ (in pixels) within which the ROI at $t + 1$ must lie.

Constraints On Maximum Focal Length

The maximum focal length is calculated from the camera projection equations:

$$x_d^{t+1} = o_x - \frac{f}{s_x}\,\frac{\mathbf{r}_1^t\,(\mathbf{p}^{t+1} - \mathbf{x}^{t+1})}{\mathbf{r}_3^t\,(\mathbf{p}^{t+1} - \mathbf{x}^{t+1})}, \qquad y_d^{t+1} = o_y - \frac{f}{s_y}\,\frac{\mathbf{r}_2^t\,(\mathbf{p}^{t+1} - \mathbf{x}^{t+1})}{\mathbf{r}_3^t\,(\mathbf{p}^{t+1} - \mathbf{x}^{t+1})}$$

where:
- $\mathbf{p}^{t+1}$, $\mathbf{x}^{t+1}$: expected target/UAV positions
- $x_d$, $y_d$: target ROI center (pixel coordinates)
- $o_x$, $o_y$: image center (pixel coordinates)
- $s_x$, $s_y$: pixel size (mm)
- $\mathbf{r}_1$, $\mathbf{r}_2$, $\mathbf{r}_3$: rows of the rotation matrix
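A sketch of these projection equations in code (function and argument names are mine); it assumes the rotation matrix is supplied with $\mathbf{r}_1$, $\mathbf{r}_2$, $\mathbf{r}_3$ as its rows:

```python
import numpy as np

def project(p, x, R, f, s_x, s_y, o_x, o_y):
    """Project target position p onto the image plane of a camera at x.

    R: 3x3 rotation matrix with rows r1, r2, r3; f: focal length (mm);
    s_x, s_y: pixel size (mm); o_x, o_y: image center (pixels).
    """
    d = np.asarray(p, dtype=float) - np.asarray(x, dtype=float)
    r1, r2, r3 = R
    x_d = o_x - (f / s_x) * (r1 @ d) / (r3 @ d)
    y_d = o_y - (f / s_y) * (r2 @ d) / (r3 @ d)
    return x_d, y_d
```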

Constraints On Maximum Focal Length

Rotation matrix: the camera axis points directly at the target, so the unit vector of the k-axis of the Camera Coordinate System ($\mathbf{r}_3$) can be obtained from $\mathbf{x}^{t+1}$ as follows:

$$\mathbf{r}_3 = \left(-\frac{\mathbf{x}^{t+1}}{\|\mathbf{x}^{t+1}\|}\right)^T$$

and

$$\mathbf{r}_1' = \left(\mathbf{k} \times -\frac{\mathbf{x}^{t+1}}{\|\mathbf{x}^{t+1}\|}\right)^T, \qquad \mathbf{r}_2' = \left(-\frac{\mathbf{x}^{t+1}}{\|\mathbf{x}^{t+1}\|} \times \left(\mathbf{k} \times -\frac{\mathbf{x}^{t+1}}{\|\mathbf{x}^{t+1}\|}\right)\right)^T$$

$$\mathbf{r}_1 = \frac{\mathbf{r}_1'}{\|\mathbf{r}_1'\|}, \qquad \mathbf{r}_2 = \frac{\mathbf{r}_2'}{\|\mathbf{r}_2'\|}$$
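The same LookAt construction as a sketch, assuming (as the $\mathbf{r}_3$ formula implies) that the target's expected position sits at the world origin, and taking $\mathbf{k} = [0, 0, 1]^T$ as the world vertical (my assumption; the paper fixes its own axis convention):

```python
import numpy as np

def look_at_rotation(x_uav):
    """Rotation matrix (rows r1, r2, r3) pointing the optical axis at the origin."""
    x_uav = np.asarray(x_uav, dtype=float)
    k = np.array([0.0, 0.0, 1.0])
    r3 = -x_uav / np.linalg.norm(x_uav)   # optical axis: from UAV towards target
    r1 = np.cross(k, r3)
    r1 /= np.linalg.norm(r1)              # normalize r1'
    r2 = np.cross(r3, r1)
    r2 /= np.linalg.norm(r2)              # normalize r2'
    return np.stack([r1, r2, r3])
```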

Constraints On Maximum Focal Length

Using the limit constraint $R^{t+1} = R_{max}$:

$$R_{max} = \sqrt{\left(x_d^{t+1} - o_x\right)^2 + \left(y_d^{t+1} - o_y\right)^2}$$

Using the projection equations, $f_{max}$ is given by:

$$f_{max} = \frac{R_{max}\, d_t'\, s_x s_y \left(E_1 + F\|\mathbf{x}_t'\|^2\right)}{\sqrt{\left(s_x q_{t3} d_t'^2 - s_x x_{t'3} E_2\right)^2 + s_y^2 E_3^2 \|\mathbf{x}_t'\|^2}}$$

where

$$E_1 = -q_{t1} x_{t'1} - q_{t2} x_{t'2} - q_{t3} x_{t'3}, \quad E_2 = q_{t1} x_{t'1} + q_{t2} x_{t'2}, \quad E_3 = q_{t2} x_{t'1} - q_{t1} x_{t'2}.$$
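Because the closed form above is easy to mistranscribe, a numeric cross-check is useful (my own sketch, not the paper's method). It exploits the fact that, in the projection equations, the ROI center's pixel displacement from the image center scales linearly with $f$: measure the displacement at $f = 1$ and rescale. It reuses `project()` from the earlier sketch.

```python
import math

def f_max_numeric(p_deviated, x_uav, R, s_x, s_y, r_max):
    """f_max from the linear dependence of pixel displacement on focal length.

    The camera looks at the expected target position (the origin), so with
    o_x = o_y = 0 the projected coordinates equal the displacement itself.
    """
    x_d, y_d = project(p_deviated, x_uav, R, 1.0, s_x, s_y, 0.0, 0.0)
    r_per_unit_f = math.hypot(x_d, y_d)   # displacement per unit focal length
    return r_max / r_per_unit_f
```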

Constraints On Maximum Focal Length

Experimental cases: 8 cases for the deviation vector $\mathbf{q}_t$:

- Case 1: $\mathbf{q}_t = [5, 0, q_{t3}]$
- Case 2: $\mathbf{q}_t = [-5, 0, q_{t3}]$
- Case 3: $\mathbf{q}_t = [0, 5, q_{t3}]$
- Case 4: $\mathbf{q}_t = [0, -5, q_{t3}]$
- Case 5: $\mathbf{q}_t = [5, 5, q_{t3}]$
- Case 6: $\mathbf{q}_t = [-5, -5, q_{t3}]$
- Case 7: $\mathbf{q}_t = [-5, 5, q_{t3}]$
- Case 8: $\mathbf{q}_t = [5, -5, q_{t3}]$

Constraints On Maximum Focal Length: Lateral Tracking Shot

- The UAV position is given by $\mathbf{x}^{t+1} = [0, x_{t2}, 0]^T$.
- The target position is given by $\mathbf{p}^{t+1} = \left[\frac{q_{t1}}{F}, \frac{q_{t2}}{F}, \frac{q_{t3}}{F}\right]^T$, where $F$ is the camera frame rate.
- The maximum focal length for the LTS is then given by:

$$f_{max} = \frac{R_{max}\, s_x s_y\, |q_{t2} - F x_{t2}|}{\sqrt{s_y^2 q_{t1}^2 + s_x^2 q_{t3}^2}}$$
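Direct evaluation of the LTS bound in code (treat the formula itself as a best-effort reconstruction of the garbled slide; verify against the paper before relying on it):

```python
import math

def f_max_lts(r_max, s_x, s_y, q_t, x_t2, frame_rate):
    """Maximum focal length for the Lateral Tracking Shot (reconstructed bound)."""
    q1, q2, q3 = q_t                      # components of the deviation vector
    num = r_max * s_x * s_y * abs(q2 - frame_rate * x_t2)
    den = math.sqrt(s_y**2 * q1**2 + s_x**2 * q3**2)
    return num / den
```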

Constraints On Maximum Focal Length

- Variations in target altitude affect all study cases 1–8.
- When $q_{t3} = 0$, the projected ROI center does not change in pixel coordinates if the target approaches or moves away from the UAV.
- Due to the position of the UAV, target acceleration and deceleration have identical impact on $f_{max}$.

[Figure: variation of $f_{max}$ against $q_{t3}$ for LTS]

Constraints On Maximum Focal Length

- Cases 1–2: as the UAV approaches the target, the maximum focal length decreases, before increasing again as the UAV flies parallel to the $\mathbf{i}$-axis. When the drone is positioned far from the target, any change in target velocity corresponds to a small change in the distance between the UAV and the target.
- Cases 3–4: the target deviates from its expected position but remains on the $\mathbf{j}$-axis; $f_{max}$ increases with the distance between the UAV and the target, and slightly increases when the UAV is very close to the target.
- Cases 5–8: $f_{max}$ depends on the angle between the LookAt vector and the $\mathbf{i}$-axis: it has lower values when this angle is close to $\pi/2$ ($t = 10$).

[Figure: variation of $f_{max}$ against $t$ for FLYBY]

Constraints On Maximum Focal Length

[Figures: variation of $f_{max}$ for FLYOVER, CHASE, and VTS]

Conclusions

- Industry-standard target-tracking UAV/camera motion types have been formalized and geometrically modelled.
- Maximum focal length constraints for computer vision-assisted UAV physical target following have been extracted.
- The derived formulas can be readily employed as low-level rules in intelligent UAV shooting and cinematography planning systems.

Acknowledgement

The research leading to these results has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 731667 (MULTIDRONE).

Thank you! Q&A