Joshua Fabian, Tyler Young, James C. Peyton Jones, Garrett M. Clayton — Integrating the Microsoft Kinect With Simulink: Real-Time Object Tracking Example (Villanova University, PA, USA)

Outline: Introduction, VU-Kinect Block, Using the VU-Kinect Block, Depth Camera Calibration, Extracting Position From RGB Camera Image, Depth/RGB Camera Registration, Simulink Model, Experimental Results, Conclusion

Introduction

Objectives:
1. To develop a "VU-Kinect" block that can be incorporated seamlessly within a higher-level, Simulink-based, image-processing and real-time control strategy.
2. To address implementation issues associated with the Kinect, such as sensor calibration.
3. To show the utility of both the VU-Kinect block and the Kinect itself through a simple 3-D object tracking example.

The Microsoft Kinect has considerable potential in autonomous system applications. To date, the majority of Kinect applications have been coded in C. But the use of higher-level control and image-processing design languages is now commonplace in both academia and industry. These tools allow even inexperienced users to simulate their designs, implement them on target hardware (using automatic code generation), and then tune system parameters while the code is actually running in real time.

In particular, MATLAB and Simulink provide a widely used environment for designing, simulating, and implementing control and image-processing algorithms. Simulink, developed by MathWorks, is a data-flow graphical programming tool for modeling, simulating, and analyzing multidomain dynamic systems.

(Figure: Simulink model of a wind turbine.)

Libfreenect API: libfreenect is primarily a driver that exposes the Kinect device's features: depth stream, IR stream, color (RGB) stream, motor control, LED control, and accelerometer. It does not provide any higher-level processing features such as scene segmentation or skeleton tracking.

VU-Kinect Block

VU-Kinect Block (Villanova University Real-Time Kinect): an application that streams parallel camera and depth images from the Kinect into the user's Simulink model. The VU-Kinect block provides a high-level interface to the Kinect hardware for Simulink users, as well as the low-level back-end code necessary to interface with the libfreenect API. (Diagram: Simulink Model → VU-Kinect Block → libfreenect API → Kinect.)
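As a rough sketch of how such a source block might be wired into a model from the MATLAB command line (the library path 'vu_kinect_lib/VU-Kinect' and its port layout are assumptions, not the authors' published interface; only the Simulink commands themselves are standard):

  % Minimal model that pulls RGB and depth frames from the Kinect.
  % 'vu_kinect_lib/VU-Kinect' is a hypothetical library path; the viewer
  % library path ('vipsnks') may differ between MATLAB releases.
  mdl = 'kinect_demo';
  new_system(mdl); open_system(mdl);
  add_block('vu_kinect_lib/VU-Kinect', [mdl '/VU-Kinect'], 'Position', [50 50 150 120]);
  add_block('vipsnks/Video Viewer', [mdl '/RGB Viewer'],   'Position', [250 40 330 100]);
  add_block('vipsnks/Video Viewer', [mdl '/Depth Viewer'], 'Position', [250 130 330 190]);
  add_line(mdl, 'VU-Kinect/1', 'RGB Viewer/1');    % port 1: RGB image (assumed)
  add_line(mdl, 'VU-Kinect/2', 'Depth Viewer/1');  % port 2: depth image (assumed)
  set_param(mdl, 'StopTime', 'inf');               % stream until stopped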

(Figure: tuning parameters and viewing/logging results in real time.)
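The on-line tuning and logging shown here is what Simulink's external mode provides; a typical command sequence (the model and block names below are placeholders) looks roughly like:

  mdl = 'kinect_demo';                              % placeholder model name
  set_param(mdl, 'SimulationMode', 'external');     % run against generated code
  set_param(mdl, 'SimulationCommand', 'connect');   % attach to the running target
  set_param(mdl, 'SimulationCommand', 'start');
  % Block parameters can now be changed while the code runs, e.g. a gain:
  set_param([mdl '/Threshold Gain'], 'Gain', '0.85');   % hypothetical block
  set_param(mdl, 'SimulationCommand', 'stop');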

Using the VU-Kinect Block

Details of the calibration of the Kinect sensor, the experimental setup, and the experimental results are presented. Calibration involves three steps: 1) depth camera calibration, 2) extracting position from the RGB camera image, and 3) depth/RGB camera registration.

(Figure: top view of the setup; depth calibration curve relating the raw Kinect depth output to distance in centimeters.)
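The transcript does not reproduce the fitted curve itself. A common way to build such a calibration for the original Kinect is to fit the reciprocal of the measured distance against the raw 11-bit depth value, which is nearly linear; the numbers below are illustrative placeholders, not the paper's data:

  % Measured calibration pairs: raw Kinect depth value vs. true distance (cm).
  % These sample values are illustrative placeholders, not the paper's data.
  raw_d   = [ 500  600  700  800  900 1000];   % raw 11-bit depth readings
  dist_cm = [ 150  180  220  280  380  560];   % tape-measured distances

  % 1/distance is approximately linear in the raw value for the Kinect v1,
  % so fit 1/distance = p(1)*raw + p(2).
  p = polyfit(raw_d, 1 ./ dist_cm, 1);

  % Calibration function: raw depth value -> distance in centimeters.
  raw2cm = @(d) 1 ./ (p(1) * d + p(2));

  plot(raw_d, dist_cm, 'o', 500:10:1000, raw2cm(500:10:1000), '-');
  xlabel('raw Kinect depth value'); ylabel('distance (cm)');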

The pendulum will be tracked in Kinect coordinates, so the pixel coordinates in the RGB image need to be converted to this coordinate system. (Figure: projective coordinate system vs. real-world coordinate system.)

(Equations converting projective (pixel) coordinates to real-world coordinates; H and W are the height and width of the image in pixels.)
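The equations themselves appear only as images in the slides. A standard pinhole-camera version of the conversion, saved as pixel2world.m, would look like the sketch below; the field-of-view values are nominal Kinect figures, not the authors' calibration:

  % Convert a pixel location (u, v) plus its depth Z (cm) into real-world
  % X, Y, Z coordinates centred on the camera axis.
  % W, H          : image width and height in pixels (640 x 480 assumed)
  % fov_h, fov_v  : nominal horizontal/vertical fields of view (assumed values)
  function [X, Y, Z] = pixel2world(u, v, Z, W, H, fov_h, fov_v)
      if nargin < 4, W = 640; H = 480; fov_h = 57*pi/180; fov_v = 43*pi/180; end
      fx = (W/2) / tan(fov_h/2);    % focal length in pixels (horizontal)
      fy = (H/2) / tan(fov_v/2);    % focal length in pixels (vertical)
      X  = (u - W/2) .* Z ./ fx;    % right of the optical axis
      Y  = (v - H/2) .* Z ./ fy;    % below the optical axis (v grows downward)
      % Z (distance from the camera plane) is passed through unchanged.
  end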

The Kinect device uses separate cameras for the RGB and depth videos, causing misalignment between the two images. (Figure: (a) unregistered and (b) registered images.)
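The slides do not state which mapping was fitted; one plausible MATLAB approach is to pick a few corresponding points in the two images and estimate an affine transform from depth-image coordinates to RGB-image coordinates (the point coordinates and image sizes below are placeholders):

  % depth_img and rgb_img stand in for frames captured from the Kinect.
  depth_img = zeros(480, 640, 'uint16');
  rgb_img   = zeros(480, 640, 3, 'uint8');

  % Corresponding control points located in the depth and RGB images
  % (placeholder coordinates; in practice use cpselect or known targets).
  pts_depth = [100 120; 500 130; 110 400; 520 410];
  pts_rgb   = [ 92 115; 508 122; 103 407; 528 418];

  % Fit an affine transform that maps depth pixels onto RGB pixels.
  tform = fitgeotrans(pts_depth, pts_rgb, 'affine');

  % Warp the depth image into the RGB image frame so the pixels line up.
  depth_reg = imwarp(depth_img, tform, 'OutputView', imref2d(size(rgb_img(:,:,1))));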


Simulink Model

The Simulink model is capable of the complicated task of tracking a pendulum in three dimensions, without the need to develop any C code.
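The individual blocks are not listed in the transcript, but the core image-processing step, isolating the colored pendulum bob and computing its pixel centroid, can be sketched in MATLAB as follows (the color thresholds are illustrative, not the model's actual segmentation):

  % Locate a red pendulum bob in one RGB frame and return its pixel centroid.
  function [u, v] = bob_centroid(rgb)
      r = double(rgb(:,:,1)); g = double(rgb(:,:,2)); b = double(rgb(:,:,3));
      mask = (r > 120) & (r - g > 40) & (r - b > 40);   % crude "red" test
      mask = bwareaopen(mask, 50);                      % drop small speckles
      [rows, cols] = find(mask);
      if isempty(rows)
          u = NaN; v = NaN;                             % bob not visible
      else
          u = mean(cols);                               % horizontal pixel coordinate
          v = mean(rows);                               % vertical pixel coordinate
      end
  end

The centroid (u, v), combined with the registered depth at that pixel, then feeds the pixel-to-world conversion sketched earlier.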


Experimental Results

Case 1: Approximate 2-D motion in a plane parallel to the Kinect. (Figure: experimental setup with the Kinect and its X, Y, Z axes.)

Case 2: Approximate 2-D motion in a plane perpendicular to the Kinect. (Figure: experimental setup with the Kinect and its X, Y, Z axes.)

Case 3: 3-D elliptical motion. (Figure: experimental setup with the Kinect and its X, Y, Z axes.)

(Figure: experimental position plots measured using the Kinect.)

(Figure: position plots for approximate 2-D motion in a plane parallel to the Kinect.)

(Figure: position plots for approximate 2-D motion in a plane perpendicular to the Kinect.)

(Figure: position plots for 3-D elliptical motion.)


Conclusion

The VU-Kinect block was used to track the 3-D motion of a pendulum with great success. The Kinect, in conjunction with Simulink's Real-Time Workshop (code generation) toolbox, will enable easy programming of remote target platforms (e.g., mobile robots).
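As a sketch of that last step, generating standalone code from a model uses the standard build commands (the model name is a placeholder, and the choice of system target file depends on the platform):

  mdl = 'kinect_demo';                              % placeholder model name
  load_system(mdl);
  set_param(mdl, 'SystemTargetFile', 'grt.tlc');    % or a platform-specific target
  slbuild(mdl);                                     % generate and compile the code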