EEC-693/793 Applied Computer Vision with Depth Cameras

EEC-693/793 Applied Computer Vision with Depth Cameras Lecture 17 Wenbing Zhao wenbing@ieee.org

Outline
- Skeleton logging
- Replaying logged skeleton data

Skeleton Joint Data Logging
You can add logging of skeleton joint data to any of your existing projects that have skeleton tracking enabled. Make sure you add System.IO to your using directives (using System.IO;). We will use the StreamWriter class to log to a comma-separated value (.csv) file. Add the following member variables to your MainWindow class:

static String filename = "kinect.csv";
StreamWriter sw = File.CreateText(filename);
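StreamWriter buffers its output, so the log must be flushed or closed before the .csv file is read back (the replay button later in this lecture closes it). As a minimal sketch, assuming you also want the file closed when the application exits, you could hook the window's Closing event; the handler name and wiring here are illustrative, not part of the original project:

// Sketch: close the log when the window closes so buffered rows reach kinect.csv.
// Assumes this handler is wired to the Window.Closing event in MainWindow.xaml.
private void Window_Closing(object sender, System.ComponentModel.CancelEventArgs e)
{
    if (sw != null)
    {
        sw.Flush();
        sw.Close();
    }
}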

Skeleton Joint Data Logging
Add captions for the columns by adding the following line in WindowLoaded(). We log per-frame information, including the timestamp, the frame number, and the floor clip plane. The MC columns hold the skeleton's center-of-mass position (skeleton.Position), and each -D column holds that point's distance from the floor clip plane.

sw.WriteLine("FrameNo,timestamp,MC-X,MC-Y,MC-Z,MC-D,HipCenter-X,HipCenter-Y,HipCenter-Z,HipCenter-D,Spine-X,Spine-Y,Spine-Z,Spine-D,ShoulderCenter-X,ShoulderCenter-Y,ShoulderCenter-Z,ShoulderCenter-D,Head-X,Head-Y,Head-Z,Head-D,ShoulderLeft-X,ShoulderLeft-Y,ShoulderLeft-Z,ShoulderLeft-D,ElbowLeft-X,ElbowLeft-Y,ElbowLeft-Z,ElbowLeft-D,WristLeft-X,WristLeft-Y,WristLeft-Z,WristLeft-D,HandLeft-X,HandLeft-Y,HandLeft-Z,HandLeft-D,ShoulderRight-X,ShoulderRight-Y,ShoulderRight-Z,ShoulderRight-D,ElbowRight-X,ElbowRight-Y,ElbowRight-Z,ElbowRight-D,WristRight-X,WristRight-Y,WristRight-Z,WristRight-D,HandRight-X,HandRight-Y,HandRight-Z,HandRight-D,HipLeft-X,HipLeft-Y,HipLeft-Z,HipLeft-D,KneeLeft-X,KneeLeft-Y,KneeLeft-Z,KneeLeft-D,AnkleLeft-X,AnkleLeft-Y,AnkleLeft-Z,AnkleLeft-D,FootLeft-X,FootLeft-Y,FootLeft-Z,FootLeft-D,HipRight-X,HipRight-Y,HipRight-Z,HipRight-D,KneeRight-X,KneeRight-Y,KneeRight-Z,KneeRight-D,AnkleRight-X,AnkleRight-Y,AnkleRight-Z,AnkleRight-D,FootRight-X,FootRight-Y,FootRight-Z,FootRight-D");
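The caption string is tedious and error-prone to type by hand. As a sketch (not from the original slides), it could instead be generated from the JointType enumeration, whose 20 values appear in the same order used by the per-joint logging loop later in this lecture; BuildCsvHeader is a hypothetical helper name:

// Sketch: build the CSV caption from the JointType enum (Kinect for Windows SDK 1.x).
using System;
using System.Text;
using Microsoft.Kinect;

static string BuildCsvHeader()
{
    StringBuilder header = new StringBuilder("FrameNo,timestamp,MC-X,MC-Y,MC-Z,MC-D");
    foreach (string joint in Enum.GetNames(typeof(JointType)))
    {
        // one X, Y, Z, D column group per joint, in JointType order
        header.Append("," + joint + "-X," + joint + "-Y," + joint + "-Z," + joint + "-D");
    }
    return header.ToString();
}

With this helper, the line in WindowLoaded() would simply be sw.WriteLine(BuildCsvHeader());.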

Per Frame Information

void skeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    canvas1.Children.Clear();
    long timestamp = -1;
    int frameno = -1;
    float A = 0; float B = 0; float C = 0; float D = 0;
    using (SkeletonFrame skeletonFrame = e.OpenSkeletonFrame())
    {
        // check for frame drop
        if (skeletonFrame == null)
            return;
        // (continued on the next slide)

Per Frame Info

        timestamp = skeletonFrame.Timestamp;
        frameno = skeletonFrame.FrameNumber;
        A = skeletonFrame.FloorClipPlane.Item1;
        B = skeletonFrame.FloorClipPlane.Item2;
        C = skeletonFrame.FloorClipPlane.Item3;
        D = skeletonFrame.FloorClipPlane.Item4;
        // copy the frame data into the collection
        skeletonFrame.CopySkeletonDataTo(totalSkeleton);
        // get the first tracked skeleton
        skeleton = (from trackskeleton in totalSkeleton
                    where trackskeleton.TrackingState == SkeletonTrackingState.Tracked
                    select trackskeleton).FirstOrDefault();
        // if no skeleton is tracked, there is nothing to log
        if (skeleton == null)
            return;

Log Per Frame Info

        if (skeleton != null && this.currentSkeletonID != skeleton.TrackingId)
        {
            this.currentSkeletonID = skeleton.TrackingId;
            int totalTrackedJoints = skeleton.Joints.Where(
                item => item.TrackingState == JointTrackingState.Tracked).Count();
            string TrackedTime = DateTime.Now.ToString("hh:mm:ss");
            string status = "Skeleton Id: " + this.currentSkeletonID +
                ", total tracked joints: " + totalTrackedJoints +
                ", TrackTime: " + TrackedTime + "\n";
            this.textBlock1.Text += status;
        }
        DrawSkeleton(skeleton);
        sw.Write(frameno + ",");
        sw.Write(timestamp + ",");

Log Joint Info

        // the skeleton's center-of-mass position (the MC columns)
        float xc = skeleton.Position.X;
        float yc = skeleton.Position.Y;
        float zc = skeleton.Position.Z;
        // distance of the point from the floor clip plane Ax + By + Cz + D = 0
        float cmd = A * xc + B * yc + C * zc + D;
        sw.Write(xc + "," + yc + "," + zc + "," + cmd + ",");
        // log position and floor distance for each of the 20 joints, in JointType order
        for (int i = 0; i < 20; i++)
        {
            float x = skeleton.Joints[(JointType)i].Position.X;
            float y = skeleton.Joints[(JointType)i].Position.Y;
            float z = skeleton.Joints[(JointType)i].Position.Z;
            float dist = A * x + B * y + C * z + D;
            sw.Write(x + "," + y + "," + z + "," + dist + ",");
        }
        sw.WriteLine("");
    } // end of the using block opened in the Per Frame Information slide
}

A Sample Log
If you have Excel on your computer, you can double-click the .csv file to open it.

Replay Logged Data
We will add a button to stop real-time tracking and start replaying the logged skeleton data. You will be asked to improve this app as a required exercise.
On clicking the button, we close the write stream, open the log file, and load all the data into a jagged array.
We play a trick on replay, which should be changed for a pure replay app: we replay one logged frame on every color frame received.
Add three new member variables in MainWindow.xaml.cs:

List<float[][]> exdata = new List<float[][]>();
Boolean inReplay = false;
int guideCounter = 0;

In the MainWindow.xaml.cs
Replay button click handler:

private void button1_Click(object sender, RoutedEventArgs e)
{
    inReplay = true;
    sw.Close();
    float[][] guideData = getGuideData(filename);
    exdata.Add(guideData);
}

Change skeletonFrameReady() slightly: stop handling new frames once replay has started.

void skeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    // Add this at the beginning of the method!
    if (inReplay) return;

In the MainWindow.xaml.cs
Modify colorFrameReady() for replay:

void colorFrameReady(object sender, ColorImageFrameReadyEventArgs e)
{
    // other stuff
    if (inReplay)
        DisplayNextFrame();
}
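Because the replay is driven from colorFrameReady(), the playback rate is tied to the color stream. For a pure replay app, one alternative (a sketch only, not part of the original slides; the timer name and interval are assumptions) is to drive DisplayNextFrame() from a WPF DispatcherTimer at roughly the skeleton frame rate:

// Sketch: drive replay with a DispatcherTimer instead of the color stream.
// Assumes DisplayNextFrame() is defined as shown later in this lecture.
using System;
using System.Windows.Threading;

DispatcherTimer replayTimer;

private void StartReplayTimer()
{
    replayTimer = new DispatcherTimer();
    replayTimer.Interval = TimeSpan.FromMilliseconds(33); // ~30 fps
    replayTimer.Tick += (s, args) => DisplayNextFrame();
    replayTimer.Start();
}

private void StopReplayTimer()
{
    if (replayTimer != null)
        replayTimer.Stop();
}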

In the MainWindow.xaml.cs

private float[][] getGuideData(String filename)
{
    String[] dataStrings = null;
    if (File.Exists(filename))
    {
        dataStrings = File.ReadAllLines(filename);
    }
    else
    {
        return null;
    }
    // the first row is the caption, skip it, hence the -1
    int size = dataStrings.Length - 1;
    float[][] allSamples = new float[size][];

[Diagram: allSamples is a jagged array indexed by frame; each row holds the frame time followed by the (x, y, z) coordinates of the 20 joints.]

In the MainWindow.xaml.cs

    // the first row is the caption, skip it
    for (int i = 0; i < allSamples.Length; i++)
    {
        String[] row = dataStrings[i + 1].Split(',');
        int rowlen = 1 + 20 * 3; // 1 column for the timestamp, 20 joints x (x, y, z)
        allSamples[i] = new float[rowlen];
        int k = 0;
        for (int j = 0; j < row.Length; j++)
        {
            // column 0 is the frame number
            if (j == 0)
                continue;
            // columns 2, 3, 4, 5 are the center of mass
            if (j >= 2 && j <= 5)
                continue;
            // for each joint, the 4th column is the vertical distance; skip it too
            if (j > 6 && ((j - 5) % 4) == 0)
                continue;
            allSamples[i][k++] = (float)Convert.ToDouble(row[j]);
            if (k == 61)
                break;
        }
    }
    return allSamples;
}
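A quick sanity check for the loader (illustrative, not in the original slides) is to confirm that each parsed frame carries the 61 values expected (1 timestamp plus 20 joints x 3 coordinates):

// Illustrative check after loading the log file.
float[][] guideData = getGuideData("kinect.csv");
if (guideData != null && guideData.Length > 0)
{
    System.Diagnostics.Debug.WriteLine("Loaded " + guideData.Length + " frames, " +
        guideData[0].Length + " values per frame"); // expect 61 values per frame
}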

In the MainWindow.xaml.cs

void DisplayNextFrame()
{
    canvas1.Children.Clear();
    float[][] guideData = exdata[0];
    float[] frame;
    SkeletonPoint hipcenter = new SkeletonPoint();
    SkeletonPoint spine = new SkeletonPoint();
    SkeletonPoint head = new SkeletonPoint();
    SkeletonPoint shouldercenter = new SkeletonPoint();
    SkeletonPoint shoulderleft = new SkeletonPoint();
    SkeletonPoint elbowleft = new SkeletonPoint();
    SkeletonPoint wristleft = new SkeletonPoint();
    SkeletonPoint handleft = new SkeletonPoint();
    SkeletonPoint shoulderright = new SkeletonPoint();
    SkeletonPoint elbowright = new SkeletonPoint();
    SkeletonPoint wristright = new SkeletonPoint();
    SkeletonPoint handright = new SkeletonPoint();
    SkeletonPoint hipleft = new SkeletonPoint();
    SkeletonPoint kneeleft = new SkeletonPoint();
    SkeletonPoint ankleleft = new SkeletonPoint();
    SkeletonPoint footleft = new SkeletonPoint();
    SkeletonPoint hipright = new SkeletonPoint();
    SkeletonPoint kneeright = new SkeletonPoint();
    SkeletonPoint ankleright = new SkeletonPoint();
    SkeletonPoint footright = new SkeletonPoint();

In the MainWindow.xaml.cs

    if (guideCounter < guideData.Length)
    {
        frame = guideData[guideCounter];
        if (null == frame)
        {
            return;
        }
        // unpack the joints in the same order they were logged (JointType order)
        int i = 0;
        hipcenter.X = frame[1 + 3 * i]; hipcenter.Y = frame[2 + 3 * i]; hipcenter.Z = frame[3 + 3 * i++];
        spine.X = frame[1 + 3 * i]; spine.Y = frame[2 + 3 * i]; spine.Z = frame[3 + 3 * i++];
        shouldercenter.X = frame[1 + 3 * i]; shouldercenter.Y = frame[2 + 3 * i]; shouldercenter.Z = frame[3 + 3 * i++];
        head.X = frame[1 + 3 * i]; head.Y = frame[2 + 3 * i]; head.Z = frame[3 + 3 * i++];
        shoulderleft.X = frame[1 + 3 * i]; shoulderleft.Y = frame[2 + 3 * i]; shoulderleft.Z = frame[3 + 3 * i++];
        elbowleft.X = frame[1 + 3 * i]; elbowleft.Y = frame[2 + 3 * i]; elbowleft.Z = frame[3 + 3 * i++];
        wristleft.X = frame[1 + 3 * i]; wristleft.Y = frame[2 + 3 * i]; wristleft.Z = frame[3 + 3 * i++];
        handleft.X = frame[1 + 3 * i]; handleft.Y = frame[2 + 3 * i]; handleft.Z = frame[3 + 3 * i++];
        shoulderright.X = frame[1 + 3 * i]; shoulderright.Y = frame[2 + 3 * i]; shoulderright.Z = frame[3 + 3 * i++];
        elbowright.X = frame[1 + 3 * i]; elbowright.Y = frame[2 + 3 * i]; elbowright.Z = frame[3 + 3 * i++];
        wristright.X = frame[1 + 3 * i]; wristright.Y = frame[2 + 3 * i]; wristright.Z = frame[3 + 3 * i++];
        handright.X = frame[1 + 3 * i]; handright.Y = frame[2 + 3 * i]; handright.Z = frame[3 + 3 * i++];
        hipleft.X = frame[1 + 3 * i]; hipleft.Y = frame[2 + 3 * i]; hipleft.Z = frame[3 + 3 * i++];
        kneeleft.X = frame[1 + 3 * i]; kneeleft.Y = frame[2 + 3 * i]; kneeleft.Z = frame[3 + 3 * i++];
        ankleleft.X = frame[1 + 3 * i]; ankleleft.Y = frame[2 + 3 * i]; ankleleft.Z = frame[3 + 3 * i++];
        footleft.X = frame[1 + 3 * i]; footleft.Y = frame[2 + 3 * i]; footleft.Z = frame[3 + 3 * i++];
        hipright.X = frame[1 + 3 * i]; hipright.Y = frame[2 + 3 * i]; hipright.Z = frame[3 + 3 * i++];
        kneeright.X = frame[1 + 3 * i]; kneeright.Y = frame[2 + 3 * i]; kneeright.Z = frame[3 + 3 * i++];
        ankleright.X = frame[1 + 3 * i]; ankleright.Y = frame[2 + 3 * i]; ankleright.Z = frame[3 + 3 * i++];
        footright.X = frame[1 + 3 * i]; footright.Y = frame[2 + 3 * i]; footright.Z = frame[3 + 3 * i++];

In the MainWindow.xaml.cs

        guideCounter++;
        // draw the bones connecting the replayed joints
        drawBone(head, shouldercenter);
        drawBone(shouldercenter, spine);
        drawBone(shouldercenter, shoulderleft);
        drawBone(shoulderleft, elbowleft);
        drawBone(elbowleft, wristleft);
        drawBone(wristleft, handleft);
        drawBone(shouldercenter, shoulderright);
        drawBone(shoulderright, elbowright);
        drawBone(elbowright, wristright);
        drawBone(wristright, handright);
        drawBone(spine, hipcenter);
        drawBone(hipcenter, hipleft);
        drawBone(hipleft, kneeleft);
        drawBone(kneeleft, ankleleft);
        drawBone(ankleleft, footleft);
        drawBone(hipcenter, hipright);
        drawBone(hipright, kneeright);
        drawBone(kneeright, ankleright);
        drawBone(ankleright, footright);
    }
    // loop back to the first frame when the end of the log is reached
    if (guideCounter >= guideData.Length)
        guideCounter = 0;
}

In the MainWindow.xaml.cs

void drawBone(SkeletonPoint trackedJoint1, SkeletonPoint trackedJoint2)
{
    Line bone = new Line();
    bone.Stroke = Brushes.Red;
    bone.StrokeThickness = 3;
    Point joint1 = this.ScalePosition(trackedJoint1);
    bone.X1 = joint1.X;
    bone.Y1 = joint1.Y;
    Point joint2 = this.ScalePosition(trackedJoint2);
    bone.X2 = joint2.X;
    bone.Y2 = joint2.Y;
    canvas1.Children.Add(bone);
}
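ScalePosition() comes from the skeleton-drawing code of the earlier lectures and is not repeated on these slides. A minimal sketch of what it typically does, assuming an SDK 1.x KinectSensor member named sensor and a 640x480 color image (both assumptions, not shown here):

// Sketch of ScalePosition: map a 3D skeleton point to 2D color-image coordinates.
// Assumes a KinectSensor field named sensor initialized elsewhere in MainWindow.
using System.Windows;
using Microsoft.Kinect;

private Point ScalePosition(SkeletonPoint skeletonPoint)
{
    ColorImagePoint colorPoint = this.sensor.CoordinateMapper.MapSkeletonPointToColorPoint(
        skeletonPoint, ColorImageFormat.RgbResolution640x480Fps30);
    return new Point(colorPoint.X, colorPoint.Y);
}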

Required Exercise
In the current app, once you start replaying, you can no longer track the skeleton. Add a button to switch back to tracking mode.
For logging, add a start button and a stop button for starting and stopping the logging.
For replaying, also add a start button and a stop button.
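One possible starting point for the exercise (a sketch only, not a complete solution; the flag and handler names are assumptions) is to gate the logging calls on a boolean toggled by the new buttons:

// Sketch: gate logging with a flag toggled by hypothetical Start/Stop Logging buttons.
bool isLogging = false;

private void startLoggingButton_Click(object sender, RoutedEventArgs e)
{
    isLogging = true;
}

private void stopLoggingButton_Click(object sender, RoutedEventArgs e)
{
    isLogging = false;
    sw.Flush(); // make sure buffered rows reach the .csv file
}

// In skeletonFrameReady(), wrap each sw.Write()/sw.WriteLine() call:
// if (isLogging) { sw.Write(...); }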