Introduction to Tracking

Introduction to Tracking
Readings: Sherman & Craig, pp. 75-94; Welch, Greg, and Eric Foxlin (2002), "Motion Tracking: No Silver Bullet, but a Respectable Arsenal," IEEE Computer Graphics and Applications, special issue on Tracking, 22(6): 24-38, November/December 2002. (http://www.cs.unc.edu/~tracker/media/pdf/cga02_welch_tracking.pdf)

Motivation
We want to use the human body as an input device: it is more natural and leads to a higher level of immersion. The body controls navigation (head, hand) and interaction (body). We need two things for this: signaling (button presses, etc.) and location. Location is tracking!

Tracking
Position, location, orientation, pose. Examples: head position, hand position (pose), other body parts (e.g., self-avatars), and other objects that also have physical representations (spider).

Basic Idea
Trackers provide position and/or orientation information relative to some coordinate system. What info would we need? A position (x, y, z) and an orientation (rx, ry, rz) of the receiver's coordinate system, reported relative to the origin (0, 0, 0) of the tracker's coordinate system.
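To make the slide concrete, here is a minimal sketch (not from the slides; the field names are hypothetical) of the kind of record a 6-DOF tracker report might carry each frame:

```python
from dataclasses import dataclass

@dataclass
class PoseSample:
    """One tracker report: position and orientation of the receiver,
    expressed in the tracker (source) coordinate system."""
    x: float   # position in tracker coordinates (e.g., meters)
    y: float
    z: float
    rx: float  # orientation as Euler angles (e.g., degrees)
    ry: float
    rz: float
    timestamp: float  # seconds, for lag/update-rate bookkeeping

# Example report: receiver 1.2 m in front of the source, turned 30 degrees about Y.
sample = PoseSample(x=0.0, y=0.0, z=1.2, rx=0.0, ry=30.0, rz=0.0, timestamp=0.016)
```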

Degrees of Freedom
The amount of pose information returned by the tracker: position (3 degrees) and orientation (3 degrees). There are trackers that report only position, only orientation, or both position and orientation (6 DOF).

Question
Okay, given that I want to track your head, I attach a new tracker from NewTracker Corp.; it returns 6 degrees of freedom (6 floats). What questions should you have? In other words, what are some evaluation points for a tracking system? (5 minutes to discuss.)

Evaluation Criteria
- Data returned (3 DOF, 6 DOF, >6 DOF)
- Spatial distortion (accuracy) (sub-mm)
- Resolution (sub-mm)
- Jitter (precision) (sub-mm)
- Drift
- Lag (1 ms)
- Update rate (2000 Hz)
- Range (40' x 40'; GPS)
- Interference and noise
- Mass, inertia, and encumbrance
- Multiple tracked points (1-4, 128)
- Durability (self-contained?)
- Wireless (yes)
- Price ($1,800 for 3 DOF to $40,000+; $180k+ for mocap)
Which of these are most important?

Performance Measures
(Figure: data returned, spatial distortion (accuracy), resolution, jitter (precision), drift, lag, update rate, and range, illustrated against an object's actual position: reportable locations and orientations based on resolution, jitter, drift, and registration.)

Performance Measures
Registration (accuracy) – the difference between an object's actual 3D position and orientation and the position and orientation reported by the tracker.
Resolution – the fineness with which the tracking system can distinguish individual points or orientations in space.
Jitter – change in the reported position of a stationary object.
Drift – steady increase in tracker error with time.
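As an illustration, the following hedged sketch estimates jitter and drift from a log of reports for a sensor that is held physically stationary; the function and its inputs are assumptions for illustration, not part of any tracker's API:

```python
def jitter_and_drift(positions, timestamps):
    """Estimate jitter and drift from reports of a *stationary* sensor.

    positions  : list of (x, y, z) tuples reported by the tracker
    timestamps : matching list of report times in seconds

    Jitter is taken here as the RMS distance of each report from the mean
    report; drift as the net displacement between the first and last
    report divided by the elapsed time.
    """
    n = len(positions)
    mean = tuple(sum(p[i] for p in positions) / n for i in range(3))
    jitter_rms = (sum(sum((p[i] - mean[i]) ** 2 for i in range(3))
                      for p in positions) / n) ** 0.5
    net = sum((positions[-1][i] - positions[0][i]) ** 2 for i in range(3)) ** 0.5
    drift_rate = net / (timestamps[-1] - timestamps[0])
    return jitter_rms, drift_rate

# Example: about half a millimeter of scatter plus a slow creep in x over one second.
positions = [(0.001 * i + 0.0005 * (-1) ** i, 0.0, 0.0) for i in range(11)]
timestamps = [0.1 * i for i in range(11)]
print(jitter_and_drift(positions, timestamps))
```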

Performance Measures (cont.)
Lag (phase lag) – the difference between when a sensor first arrives at a point and when the tracking system first reports that the sensor is at that point; sometimes called latency.
Latency – the time delay with which the acquisition portion of the system can acquire new data.
Transmission lag – the time needed to send the bits of information that define the position to the computer or graphics engine.

Update Rate
The number of tracker position/orientation samples per second that are transmitted to the receiving computer. A fast update rate is not the same thing as accurate position information; poor use of update information may result in more inaccuracy. The upper bound is determined by the communications rate between tracker and computer and the number of bits it takes to encode position and orientation.

Range
Position range (working volume) – a sphere (or hemisphere) around the transmitter. Accuracy decreases with distance, so position range is inversely related to accuracy.
Orientation range – the set of sensor orientations that the tracking system can report with a given resolution.

Interference and Noise
Interference is the action of some external phenomenon on the tracking system that causes the system's performance to degrade in some way.
Noise – random variation in an otherwise constant reading (static position resolution); includes inaccuracies due to environmental objects.

Mass, Inertia, and Encumbrance
Do you really want to wear this? Even devices that put no weight on your head can still have inertia, and tethering adds encumbrance.

Multiple Tracked Points
The ability to track multiple sensors within the same working volume raises the problem of interference between the sensors, handled by multiplexing:
Time multiplexing – an update rate of S samples per second shared by N sensors results in S/N samples per sensor per second.
Frequency multiplexing – each sensor broadcasts on a different frequency; more expensive.
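The time-multiplexing arithmetic in a few lines of Python (the sample numbers are made up):

```python
def per_sensor_rate(samples_per_second, num_sensors):
    """Time multiplexing: S samples/second shared by N sensors gives S/N each."""
    return samples_per_second / num_sensors

print(per_sensor_rate(120, 4))   # a 120 Hz tracker with 4 sensors -> 30.0 Hz each
```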

Price
You get what you pay for. Rich people are a small market.

Body Tracking Technology
Position tracking: orthogonal electromagnetic fields, measurement of mechanical linkages, ultrasonic signals, inertial tracking, optical tracking (inside-looking-out / videometric, and outside-looking-in).
Angle measurement: optical sensors, strain sensors, exoskeletal structures.

Electromagnetic Trackers
Use the attenuation of oriented electromagnetic signals to determine the absolute position and orientation of a sensor relative to a source. Polhemus (AC), Ascension (DC).

Basic Principles of EM Trackers
The source contains 3 orthogonal coils that are pulsed in rotation, one after another. Each pulse transmits a radio-frequency electromagnetic signal that is detected by a sensor. The sensor also contains 3 orthogonal coils, which measure the strength of the signal from the currently pulsed source coil (9 measurements in total). Using the known pulse strength at the source and the known attenuation of that strength with distance, these nine values can be used to calculate the position and orientation of the sensor coils.
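The full pose solution is beyond the slide, but one step can be sketched. Under an ideal magnetic-dipole model, the overall magnitude of the 3x3 matrix of the nine coil-to-coil measurements depends only on the source-to-sensor range, so range falls out directly. The constant k, the synthetic data, and the dipole assumption below are illustrative; this is not Polhemus's or Ascension's actual algorithm.

```python
import numpy as np

def range_from_signal_matrix(S, k):
    """Estimate source-to-sensor range from the 3x3 signal matrix S.

    S[i, j] is the signal induced in sensor coil i when source coil j is
    pulsed.  Under an ideal dipole model, S = (k / r**3) * R @ (3*P - I),
    where R is the sensor rotation and P projects onto the source-to-sensor
    direction.  R is orthonormal and ||3P - I||_F = sqrt(6), so the
    Frobenius norm of S is k*sqrt(6)/r**3, which gives r directly.
    """
    frob = np.linalg.norm(S)                # Frobenius norm of the 9 measurements
    return (k * np.sqrt(6.0) / frob) ** (1.0 / 3.0)

# Synthetic check: build S for a known range and recover it.
k = 1.0
r = 0.8                                     # meters (made up)
u = np.array([0.0, 0.0, 1.0])               # direction from source to sensor
P = np.outer(u, u)
R = np.eye(3)                               # sensor aligned with source axes
S = (k / r**3) * R @ (3 * P - np.eye(3))
print(range_from_signal_matrix(S, k))       # ~0.8
```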

Basic EM Principles (cont.)
Source and sensor are connected to a control box containing a microcomputer and the electronics associated with the pulses; communication with the host is over a serial port. A source may be associated with anywhere from 1 to as many as 18 sensors. Problem: Earth's magnetism!

Characteristics of EM Trackers
Measure position and orientation in 3D space. Do not require a direct line of sight between the source and the sensor. Accuracy is affected by the environment: DC systems by ferrous metal and electromagnetic fields, AC systems by metal and electromagnetic fields. Operate on only one side of the source (the working hemisphere). Working distance of roughly 3-25 feet from the source (depends on source size and power).

Output of EM Trackers
Polhemus (AC) – position: 3 integers; orientation: Euler angles, directional cosines, or quaternions.
Ascension (DC) – orientation: Euler angles or 3x3 rotation matrices.

Technology: Electromagnetic Transducers
Examples: Ascension Flock of Birds, Polhemus Fastrak, etc.
Characteristics: limited range/resolution; tethered (cables to box); affected by metal in the environment; no identification problem; 6 DOF; real-time, 30-144 Hz; 13-18 sensors.

Example: Serial Data Packet
6 bytes for position (3 two-byte integers), 18 bytes for orientation (9 two-byte integers of a 3x3 orientation matrix), and a 3-byte header. With 8 data bits and 1 stop bit, no start or parity bits (9 bits/byte), the total per data packet is 27 x 9 = 243 bits. At 19,200 baud that gives about a 13-millisecond transmission time, or roughly 79 packets/second. (These trackers are now all USB.)
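The same arithmetic as a few lines of Python (using only the numbers on the slide):

```python
def serial_packet_timing(header_bytes=3, position_bytes=6, orientation_bytes=18,
                         bits_per_byte=9, baud=19200):
    """Transmission time and packet rate for one tracker report over a serial
    link (8 data bits + 1 stop bit = 9 bits on the wire per byte)."""
    total_bytes = header_bytes + position_bytes + orientation_bytes   # 27
    total_bits = total_bytes * bits_per_byte                          # 243
    seconds_per_packet = total_bits / baud                            # ~0.0127 s
    return seconds_per_packet, 1.0 / seconds_per_packet

t, rate = serial_packet_timing()
print(f"{t * 1000:.1f} ms per packet, {rate:.0f} packets/second")     # ~12.7 ms, ~79
```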

Lag Between Actual and Rendered Position
Time to acquire and compute position and orientation, plus transmission time (0.013 seconds in the example above, for one sensor), plus graphics frame time (10-60 frames/sec).

Mechanical Linkage
A jointed structure that is rigid except at the joints. One end (the base) is fixed; the other (free, distal) end may be moved to an arbitrary position and orientation. Sensors at the joints detect the joint angles. A concatenation of translations and rotations can then be used to determine the position and orientation of the distal end relative to the base.
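A minimal sketch of the "concatenation of translations and rotations" idea for a planar two-joint linkage; the link lengths and joint angles are made-up inputs, and a real system would work in 3D with 4x4 homogeneous transforms.

```python
import math

def rot_trans(theta, length):
    """Homogeneous 2D transform: rotate by theta, then translate along the
    new x-axis by the link length."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, c * length],
            [s,  c, s * length],
            [0,  0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def distal_pose(joint_angles, link_lengths):
    """Concatenate one rotate-and-translate per joint; the last column of the
    result is the distal-end position relative to the fixed base."""
    T = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    for theta, length in zip(joint_angles, link_lengths):
        T = matmul(T, rot_trans(theta, length))
    x, y = T[0][2], T[1][2]
    heading = math.atan2(T[1][0], T[0][0])
    return x, y, heading

# Two links of 0.3 m, joints bent 30 and 45 degrees.
print(distal_pose([math.radians(30), math.radians(45)], [0.3, 0.3]))
```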

Characteristics of Mechanical Linkages
Fast. Accurate (depends on the physical size of the linkage and on the quality of the rotation sensors at the joints). Encumbered movement. Expensive. Can incorporate force feedback (PHANToM). Used on the BOOM display system from Fake Space Labs.

SensAble Technologies PHANToM

Ultrasonic Tracking
Uses the time-of-flight of an ultrasonic sound pulse from an emitter to a receiver. Either the emitter or the receiver can be fixed, with the other free to move. Examples: Logitech, Mattel Power Glove, and the ultrasonic component of InterSense inertial + ultrasonic systems.

Basic Principles of Ultrasonic Tracking
Based on measurement of the time-of-flight of a sound signal; sound travels at roughly 1,100 feet per second (about 343 m/s) at room temperature. The source component contains transmitters that produce a short burst of sound at a fixed ultrasonic frequency. The sensor component contains microphones tuned to the frequency of the sources.

Ultrasonic Tracking Characteristics
Inexpensive (used in the ~$100 Mattel Power Glove). Inaccurate: affected by echoes and other ambient noise. Requires a clear line of sight between the emitter and the receiver. Sometimes used for head tracking with CRT displays.

Basic Ultrasonic Setup
(Diagram: a stationary origin holding the receivers and a tracked object carrying the transmitters; the position is determined from three measured distances, distance1, distance2, distance3.)

Ultrasonic Position and Orientation Information
1 transmitter, 3 receivers: 3D position relative to a fixed origin.
2 transmitters, 3 receivers: 3D position and orientation, up to a roll about the line through the two transmitters.
3 transmitters, 3 receivers: complete position and orientation information.
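A sketch of the simplest case above (1 transmitter, 3 receivers): each time-of-flight gives a range (range = speed of sound x time), and the transmitter position follows by trilateration against the known receiver positions. The receiver layout, the speed-of-sound constant, and the assumption that the transmitter lies on one known side of the receiver plane are all illustrative.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at roughly room temperature (assumed)

def trilaterate(receivers, tofs, speed=SPEED_OF_SOUND):
    """Locate one ultrasonic transmitter from times-of-flight to three fixed receivers.

    receivers : (3, 3) array of receiver positions (meters)
    tofs      : three times-of-flight (seconds)

    Each time-of-flight gives a range d_i = speed * t_i.  Subtracting the
    sphere equation for receiver 0 from the others gives two linear
    equations; the remaining ambiguity (mirror image through the receiver
    plane) is resolved by assuming the transmitter is on the +normal side.
    """
    p = np.asarray(receivers, dtype=float)
    d = speed * np.asarray(tofs, dtype=float)

    A = 2.0 * (p[1:] - p[0])                     # 2 x 3 linear system
    b = (d[0]**2 - d[1:]**2) + np.sum(p[1:]**2, axis=1) - np.sum(p[0]**2)
    x0, *_ = np.linalg.lstsq(A, b, rcond=None)   # one point on the solution line

    n = np.cross(p[1] - p[0], p[2] - p[0])
    n /= np.linalg.norm(n)                       # normal to the receiver plane
    # Intersect the line x0 + t*n with the sphere around receiver 0; take t >= 0.
    w = x0 - p[0]
    t = np.sqrt(max(d[0]**2 - w @ w + (w @ n)**2, 0.0)) - w @ n
    return x0 + t * n

# Synthetic check: receivers on a frame, emitter 0.6 m in front of them.
receivers = [[0.0, 0.0, 0.0], [0.5, 0.0, 0.0], [0.0, 0.5, 0.0]]
true_pos = np.array([0.2, 0.1, 0.6])
tofs = [np.linalg.norm(true_pos - r) / SPEED_OF_SOUND for r in receivers]
print(trilaterate(receivers, tofs))              # ~[0.2, 0.1, 0.6]
```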

Inertial Tracking
Uses electromechanical devices to detect the relative motion of sensors by measuring changes in acceleration, gyroscopic forces, and inclination.

Accelerometers
Mounted on a body part to detect the acceleration of that body part. Acceleration is integrated to find velocity, which is then integrated to find position. Unencumbered, large-area tracking is possible.

Accelerometer Tracking Errors
Suppose the acceleration is measured with a constant error $\epsilon_i$, so that the measured acceleration is $a_i(t) + \epsilon_i$. Then

$v_i(t) = \int \big(a_i(t) + \epsilon_i\big)\,dt = \int a_i(t)\,dt + \epsilon_i t$

$x_i(t) = \int v_i(t)\,dt = \int \Big(\int a_i(t)\,dt + \epsilon_i t\Big)\,dt = \iint a_i(t)\,dt\,dt + \tfrac{1}{2}\epsilon_i t^2$

Errors accumulate, since each position is measured relative to the last position.
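A tiny simulation of the $\tfrac{1}{2}\epsilon t^2$ growth derived above: the sensor is actually motionless, but a small constant bias in the acceleration reading is double-integrated. The bias value and sample rate are made up.

```python
def integrate_position(accel_samples, dt):
    """Double-integrate acceleration samples (simple Euler integration)
    to get velocity and then position."""
    v = x = 0.0
    for a in accel_samples:
        v += a * dt
        x += v * dt
    return x

bias = 0.01          # constant accelerometer error epsilon, m/s^2 (assumed)
dt = 0.01            # 100 Hz samples
for seconds in (1, 10, 60):
    n = int(seconds / dt)
    x = integrate_position([bias] * n, dt)       # the true acceleration is zero
    print(f"after {seconds:3d} s: drift ~ {x:.2f} m "
          f"(1/2*eps*t^2 = {0.5 * bias * seconds**2:.2f} m)")
```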

Inertial Tracking (cont.)
Inclinometer – measures inclination relative to some "level" position.
Gyroscopes – measure rotation rate, from which orientation change is obtained.

Optical Trackers
Outside-looking-in: cameras (typically fixed) in the environment track a marked point. Examples: the PPT tracker from WorldViz (www.worldviz.com), older optical trackers.
Inside-looking-out: cameras carried by the participant track markers (typically fixed) in the environment. Examples: InterSense optical tracker, 3rdTech HiBall tracker. (Image from: "High-Performance Wide-Area Optical Tracking: The HiBall Tracking System," Welch et al., 1999.)

Outside-Looking-In Optical Tracking
Precision Point Tracking (PPT) by WorldViz: IR-filtered cameras are calibrated. Each frame: get the latest images of the point, generate a ray (in world coordinates) through the point on each camera's image plane, and triangulate to get the position.
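A sketch of the triangulation step for two calibrated cameras: each camera contributes a ray from its center through the marker's image point (in world coordinates), and the position is taken as the midpoint of the closest approach between the rays. The camera positions and ray directions below are synthetic; a real system would typically use more cameras and a least-squares solve.

```python
import numpy as np

def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two rays o + t*d (d need not
    be unit length).  With perfect data the rays intersect and the midpoint
    is the marker position."""
    o1, d1, o2, d2 = map(np.asarray, (o1, d1, o2, d2))
    # Solve for t1, t2 minimizing |(o1 + t1*d1) - (o2 + t2*d2)|^2.
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

# Synthetic check: two cameras looking at a marker at (0, 1, 2).
marker = np.array([0.0, 1.0, 2.0])
cam1, cam2 = np.array([-1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
print(closest_point_between_rays(cam1, marker - cam1, cam2, marker - cam2))
```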

Outside-Looking-In Optical Tracking (cont.)
What factors play a role in outside-looking-in tracking? Camera resolution, frame rate, camera calibration, occlusion, CCD quality.
How does it do? Position: stable, very good. Orientation: unstable, poor. Latency: cameras are 60 Hz.

Orientation
Since orientation from outside-looking-in tracking is poor, you can add an orientation-only sensor (e.g., InterSense's InertiaCube). The combination is called a 'hybrid tracker' or 'multi-modal tracker': position from vision, orientation from inertial sensing.
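A sketch of the multi-modal combination: pair each (fast) inertial orientation sample with the most recent (slower) optical position sample. The sample rates, data layout, and merge rule are assumptions; commercial hybrid trackers do far more (filtering, prediction, drift correction).

```python
from dataclasses import dataclass

@dataclass
class HybridPose:
    t: float
    position: tuple      # (x, y, z) from the outside-looking-in cameras
    orientation: tuple   # (yaw, pitch, roll) from the inertial sensor

def fuse_streams(optical, inertial):
    """Merge two asynchronous sample streams, each a list of (t, value) pairs
    sorted by time, by pairing every inertial sample (fast) with the most
    recent optical sample (slower, e.g. 60 Hz)."""
    poses, i = [], 0
    for t, orientation in inertial:
        while i + 1 < len(optical) and optical[i + 1][0] <= t:
            i += 1
        poses.append(HybridPose(t, optical[i][1], orientation))
    return poses

optical = [(0.000, (0.10, 1.60, 0.40)), (0.016, (0.11, 1.60, 0.41))]
inertial = [(0.000, (15.0, -3.0, 0.5)), (0.005, (15.2, -3.0, 0.5)),
            (0.010, (15.4, -3.1, 0.5)), (0.020, (15.6, -3.1, 0.4))]
print(fuse_streams(optical, inertial))
```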

Inside-Looking-Out Optical Tracking
The tracking device carries the camera, which tracks markers in the environment. Examples: InterSense tracker, 3rdTech HiBall tracker. (Images from: "High-Performance Wide-Area Optical Tracking: The HiBall Tracking System," Welch et al., 1999.)

HiBall Tracker
Position: pretty good. Orientation: very good. Latency: the LEPDs can operate at 1500 Hz. There are six lateral-effect photodiodes (LEPDs) in the HiBall; think of them as 6 cameras.

Angle Measurement
Measurement of the bend of various joints in the user's body. Used for: reconstruction of the position of various body parts (hand, torso), measurement of the motion of the human body (medical), and gestural interfaces.

Angle Measurement Technology: Optical Sensors
Have an emitter on one end and a receiver on the other. As the sensor is bent, the amount of light that gets from the emitter to the receiver is attenuated in a way that is determined by the angle of the bend. Examples: flexible hollow tubes, optical fibers (VPL DataGlove).
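A sketch of how such an attenuation reading might be mapped to a joint angle: record receiver readings at a few known bend angles during calibration, then interpolate at run time. The calibration values below are invented.

```python
# Hypothetical calibration table measured at known bend angles:
# (receiver reading, joint angle in degrees).  Readings fall as the bend increases.
CALIBRATION = [(1.00, 0.0), (0.82, 30.0), (0.61, 60.0), (0.45, 90.0)]

def reading_to_angle(reading):
    """Convert a light-attenuation reading to a bend angle by piecewise-linear
    interpolation over the calibration table."""
    if reading >= CALIBRATION[0][0]:
        return CALIBRATION[0][1]        # at least as bright as a straight finger
    if reading <= CALIBRATION[-1][0]:
        return CALIBRATION[-1][1]       # dimmer than the most-bent calibration point
    for (r_hi, a_lo), (r_lo, a_hi) in zip(CALIBRATION, CALIBRATION[1:]):
        if r_lo <= reading <= r_hi:
            frac = (r_hi - reading) / (r_hi - r_lo)
            return a_lo + frac * (a_hi - a_lo)

print(reading_to_angle(0.70))   # roughly 47 degrees, between the 30 and 60 degree points
```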

Angle Measurement Technology (cont.): Strain Sensors
Measure the mechanical strain as the sensor is bent; may be mechanical or electrical in nature. Example: CyberGlove (Virtual Technologies).

Joints and CyberGlove Sensors
(Diagram of a hand labeling the sensed joints: proximal interphalangeal (PIP), interphalangeal (IP), and metacarpophalangeal (MCP) joints, plus the abduction sensors and the thumb rotation sensor.)

CyberGlove Accuracy (figure)

CyberGlove Accuracy, Adjusted (figure)

Angle Measurement Technology (cont.): Exoskeletal Structures
Sensors that attach a rigid jointed structure to the body segments on either side of a joint. As the joint bends, the angle between the body segments is measured via potentiometers or optical encoders in the joints of the exoskeleton. Example: Exos Dexterous Hand Master.

Other Techniques: Pinch Gloves
Have sensor contacts on the ends of each finger.

Technology: DataGlove and Monkey
DataGlove: low accuracy, focused resolution.
Monkey (mechanical motion-capture armature): high accuracy, high data rate; the motion is not realistic, but no paid actor is needed.

Technology: Exoskeleton + Angle Sensors
Analog; tethered; no identification problem; real-time (up to 500 Hz); no range limit; fit relies on a rigid-body approximation.