Integrating the Microsoft Kinect With Simulink: Real-Time Object Tracking Example
Joshua Fabian, Tyler Young, James C. Peyton Jones, Garrett M. Clayton (Villanova University, PA, USA)
- Introduction
- VU-Kinect Block
- Using The VU-Kinect Block
- Depth Camera Calibration
- Extracting Position From RGB Camera Image
- Depth/RGB Camera Registration
- Simulink Model
- Experimental Results
- Conclusion
Introduction
1. To develop a “VU-Kinect” block to be incorporated seamlessly within a higher-level, Simulink-based, image-processing and real-time control strategy.
2. To address implementation issues associated with the Kinect, such as sensor calibration.
3. To show the utility of both the VU-Kinect block and the Kinect itself through a simple 3-D object-tracking example.
Introduction
The Microsoft Kinect has considerable potential in autonomous system applications. To date, the majority of Kinect applications have been coded in C, but the use of higher-level control and image-processing design languages is now commonplace in both academia and industry. These tools allow even inexperienced users to simulate their designs, implement them on target hardware (using automatic code generation), and then tune system parameters while the code is actually running in real time.
In particular, MATLAB and Simulink provide a widely used environment for designing, simulating, and implementing control and image-processing algorithms. Simulink, developed by MathWorks, is a data-flow graphical programming tool for modeling, simulating, and analyzing multi-domain dynamic systems.
Figure: Simulink model of a wind turbine.
Libfreenect API
Libfreenect is mainly a driver which exposes the Kinect device's features:
- depth stream
- IR stream
- color (RGB) stream
- motor control
- LED control
- accelerometer
It does not provide any advanced processing features such as scene segmentation or skeleton tracking.
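The slides do not show any driver-level code, but a rough illustration of what streaming depth and RGB frames through libfreenect looks like in C is given below (a minimal sketch assuming the callback style of the libfreenect C API; error handling and buffer management are omitted):

    #include <stdio.h>
    #include <stdint.h>
    #include "libfreenect.h"

    /* Invoked by libfreenect whenever a new raw depth frame is ready. */
    static void depth_cb(freenect_device *dev, void *depth, uint32_t timestamp)
    {
        uint16_t *d = (uint16_t *)depth;              /* 640x480 raw depth values */
        printf("depth %u, center pixel = %u\n", timestamp, d[240 * 640 + 320]);
    }

    /* Invoked by libfreenect whenever a new RGB frame is ready. */
    static void rgb_cb(freenect_device *dev, void *rgb, uint32_t timestamp)
    {
        (void)rgb;                                    /* 640x480x3 bytes of RGB data */
        printf("rgb frame %u\n", timestamp);
    }

    int main(void)
    {
        freenect_context *ctx;
        freenect_device  *dev;

        if (freenect_init(&ctx, NULL) < 0) return 1;
        if (freenect_open_device(ctx, &dev, 0) < 0) return 1;

        freenect_set_depth_callback(dev, depth_cb);
        freenect_set_video_callback(dev, rgb_cb);
        freenect_start_depth(dev);
        freenect_start_video(dev);

        /* Pump USB events; libfreenect calls the callbacks from inside here. */
        while (freenect_process_events(ctx) >= 0)
            ;

        freenect_stop_depth(dev);
        freenect_stop_video(dev);
        freenect_close_device(dev);
        freenect_shutdown(ctx);
        return 0;
    }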
VU-Kinect Block
VU-Kinect Block (Villanova University Real-Time Kinect)
An application which streams parallel camera and depth images from the Kinect into the user's Simulink model. The VU-Kinect block provides a high-level interface to the Kinect hardware for Simulink users, as well as the low-level back-end code necessary to interface with the libfreenect API.
Figure: layered data path, Simulink Model ↔ VU-Kinect Block ↔ libfreenect API ↔ Kinect.
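The source of the block itself is not reproduced in these slides. As a sketch of how such a back-end is typically structured, the hypothetical Level-2 C-MEX S-function below exposes an RGB and a depth output port at 30 fps; the names vu_kinect_sfun and copy_latest_frames are placeholders, and in practice the frame buffers would be filled by the libfreenect callbacks running in a background thread:

    #define S_FUNCTION_NAME  vu_kinect_sfun   /* hypothetical block name */
    #define S_FUNCTION_LEVEL 2
    #include "simstruc.h"

    #define IMG_W 640
    #define IMG_H 480

    /* Hypothetical back-end helper that copies the most recent frames
       captured by the libfreenect callbacks into the output buffers. */
    extern void copy_latest_frames(real_T *rgb, real_T *depth);

    static void mdlInitializeSizes(SimStruct *S)
    {
        ssSetNumSFcnParams(S, 0);
        if (!ssSetNumInputPorts(S, 0)) return;
        if (!ssSetNumOutputPorts(S, 2)) return;
        ssSetOutputPortWidth(S, 0, IMG_W * IMG_H * 3);   /* flattened RGB image   */
        ssSetOutputPortWidth(S, 1, IMG_W * IMG_H);       /* flattened depth image */
        ssSetNumSampleTimes(S, 1);
    }

    static void mdlInitializeSampleTimes(SimStruct *S)
    {
        ssSetSampleTime(S, 0, 1.0 / 30.0);               /* Kinect streams at ~30 fps */
        ssSetOffsetTime(S, 0, 0.0);
    }

    static void mdlOutputs(SimStruct *S, int_T tid)
    {
        real_T *rgb   = ssGetOutputPortRealSignal(S, 0);
        real_T *depth = ssGetOutputPortRealSignal(S, 1);
        copy_latest_frames(rgb, depth);
    }

    static void mdlTerminate(SimStruct *S) {}

    #ifdef  MATLAB_MEX_FILE
    #include "simulink.c"    /* MEX-file interface mechanism */
    #else
    #include "cg_sfun.h"     /* registration for generated code */
    #endif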
Figure: tune parameters and view/log results in real time.
Using The VU-Kinect Block
Details of the calibration of the Kinect sensor, the experimental setup, and the experimental results are presented. Calibration involves:
1) Depth Camera Calibration
2) Extracting Position From RGB Camera Image
3) Depth/RGB Camera Registration
Figure: depth calibration curve relating Kinect depth output to distance in centimeters (top view).
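The calibration curve itself appears only as a figure. For orientation, one widely cited approximation from the OpenKinect community converts the Kinect's raw 11-bit depth reading to metric distance with a tangent fit; the sketch below uses those community coefficients, which are not necessarily the curve the authors fitted:

    #include <math.h>

    /* Convert a raw 11-bit Kinect depth reading to centimeters using the
       tangent approximation published by the OpenKinect community.
       (These coefficients are the community values, not the authors' fit.) */
    double kinect_raw_to_cm(unsigned int raw11)
    {
        if (raw11 >= 2047)                  /* 2047 indicates "no reading" */
            return -1.0;
        double meters = 0.1236 * tan(raw11 / 2842.5 + 1.1863);
        return meters * 100.0;
    }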
The pendulum will be tracked in Kinect coordinates, so the pixel coordinates in the RGB image need to be converted to this coordinate system.
Figure: projective coordinate system vs. real-world coordinate system.
H and W are the height and width of the image in pixels.
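The conversion equation itself survives only as an image on this slide. A standard pinhole-style mapping from a pixel (u, v) and its measured depth Z to Kinect-frame coordinates, written here in a commonly used form (the authors' exact expression may differ; theta_h is the horizontal field of view, roughly 57 degrees for the Kinect depth camera, and f is the focal length in pixels), is:

    X = \left(u - \frac{W}{2}\right)\frac{Z}{f}, \qquad
    Y = \left(v - \frac{H}{2}\right)\frac{Z}{f}, \qquad
    f = \frac{W}{2\tan(\theta_h/2)}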
The Kinect device uses separate cameras for the RGB and depth videos, causing misalignment between the two images.
Figure: (a) unregistered and (b) registered images.
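The registration transform is not spelled out in the slide text. A minimal sketch, assuming the common simplification of an independent scale and offset on each pixel axis (the four coefficients are hypothetical placeholders that would come from the registration calibration), is:

    /* Map a depth-image pixel (ud, vd) to the corresponding RGB-image pixel.
       ax, bx, ay, by are hypothetical calibration coefficients (scale, offset). */
    typedef struct { double ax, bx, ay, by; } reg_params;

    void depth_to_rgb_pixel(const reg_params *p, double ud, double vd,
                            double *u_rgb, double *v_rgb)
    {
        *u_rgb = p->ax * ud + p->bx;    /* horizontal scale and shift */
        *v_rgb = p->ay * vd + p->by;    /* vertical scale and shift   */
    }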
Simulink Model
The Simulink model is capable of the complicated task of tracking a pendulum in three dimensions, without the need to develop any C code.
Experimental Results
Case 1: Approximate 2-D motion in a plane parallel to the Kinect.
Case 2: Approximate 2-D motion in a plane perpendicular to the Kinect.
Case 3: 3-D elliptical motion.
Figure: experimental position plots measured using the Kinect.
Figure: approximate 2-D motion in a plane parallel to the Kinect.
Figure: approximate 2-D motion in a plane perpendicular to the Kinect.
Figure: 3-D elliptical motion.
Conclusion
The VU-Kinect block was used to track the 3-D motion of a pendulum with great success. The Kinect, in conjunction with Simulink's Real-Time Workshop toolbox, will enable easy programming of remote target platforms (e.g., mobile robots).