EEC-693/793 Applied Computer Vision with Depth Cameras


EEC-693/793 Applied Computer Vision with Depth Cameras Lecture 9 Wenbing Zhao wenbing@ieee.org

Outline Human skeleton tracking (part II)

Skeleton Tracking in Near Mode
Must enable near mode for the DepthStream and near-range tracking for the SkeletonStream
By default, EnableTrackingInNearRange is set to false
this.sensor.DepthStream.Range = DepthRange.Near;
this.sensor.SkeletonStream.EnableTrackingInNearRange = true;
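Illustrative sketch (not from the original slide): near range is only supported by Kinect for Windows hardware, so a defensive version of the setup can wrap the range assignment in a try/catch; the sensor field is the one declared later in this lecture.
private void EnableNearModeTracking()
{
    try
    {
        // Near range only works on Kinect for Windows sensors
        this.sensor.DepthStream.Range = DepthRange.Near;
        this.sensor.SkeletonStream.EnableTrackingInNearRange = true;
    }
    catch (InvalidOperationException)
    {
        // Fall back to the default range on sensors that do not support near mode
        this.sensor.DepthStream.Range = DepthRange.Default;
        this.sensor.SkeletonStream.EnableTrackingInNearRange = false;
    }
}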

The Skeleton Class
The Skeleton class has a property named ClippedEdges, of type FrameEdges, which describes which parts of the skeleton are outside the Kinect's view
It indicates which portion of the body is being cut off at the edge of the sensor's view area

Choosing Which Skeleton to Track
Each skeleton is identified by a unique integer identifier: TrackingId
The TrackingId of a tracked skeleton is positive (> 0)
The TrackingId of the most recently tracked skeleton is always greater than the previously tracked skeleton's Id
We can use TrackingId to specify which skeletons are to be tracked or used by the application
This ensures that the skeleton tracking engine tracks only the skeleton you have identified and ignores the others
sensor.SkeletonStream.AppChoosesSkeletons = true;
sensor.SkeletonStream.ChooseSkeletons(skeleton.TrackingId);
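Illustrative sketch (not from the original slide): one common policy is to track only the skeleton closest to the sensor. This assumes the totalSkeleton array introduced later in the TrackingSkeleton app.
private void TrackClosestSkeleton()
{
    this.sensor.SkeletonStream.AppChoosesSkeletons = true;
    Skeleton closest = null;
    foreach (Skeleton s in totalSkeleton)
    {
        if (s == null || s.TrackingState != SkeletonTrackingState.Tracked) continue;
        if (closest == null || s.Position.Z < closest.Position.Z)
            closest = s; // smaller Z means nearer to the sensor
    }
    if (closest != null)
        this.sensor.SkeletonStream.ChooseSkeletons(closest.TrackingId);
}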

Monitoring Changes in the Skeleton
To track the development of a particular user, it is important to know whether that user has gone out of view and another user has come into view
The TrackingId can be used to monitor changes in the skeleton being tracked (assuming only one user is in view)
Use a member variable to store the TrackingId of the current skeleton, and update it when the skeleton ID changes
public int CurrentSkeletonID = 0;
if (skeleton != null && this.CurrentSkeletonID != skeleton.TrackingId)
{
    this.CurrentSkeletonID = skeleton.TrackingId;
}

Monitoring Changes in the Skeleton: A Use Case
Intrusion detector app: in the skeleton frame event handler
When a new skeleton is detected, record its TrackingId, save an image, and track only that skeleton
Continue tracking this skeleton without saving more images
When the current skeleton is no longer tracked, reset to the default tracking mode (the SDK tracks all possible skeletons in view)
public int CurrentSkeletonID = 0; // member variable in MainApp class

Monitoring Changes in the Skeleton: A Use Case
Skeleton skeleton; // local variable in the event handler
if (CurrentSkeletonID == 0)
{
    skeleton = (from trackSkeleton in totalSkeleton
                where trackSkeleton.TrackingState == SkeletonTrackingState.Tracked
                select trackSkeleton).FirstOrDefault();
    if (skeleton == null) { return; }
    CurrentSkeletonID = skeleton.TrackingId;
    this.sensor.SkeletonStream.AppChoosesSkeletons = true;
    this.sensor.SkeletonStream.ChooseSkeletons(CurrentSkeletonID);
    if (skeleton.Joints[JointType.Head].TrackingState == JointTrackingState.Tracked)
    {
        this.SaveImage();
    }
}

Monitoring Changes in the Skeleton: A Use Case
if (CurrentSkeletonID != 0) // we are currently tracking a skeleton
{
    skeleton = (from trackSkeleton in totalSkeleton
                where trackSkeleton.TrackingState == SkeletonTrackingState.Tracked
                      && trackSkeleton.TrackingId == CurrentSkeletonID
                select trackSkeleton).FirstOrDefault();
    if (skeleton == null) // reset to default tracking mode
    {
        CurrentSkeletonID = 0;
        this.sensor.SkeletonStream.AppChoosesSkeletons = false;
    }
}

Joints and JointCollection
Each Skeleton object has a property named Joints, which is of type JointCollection and contains all the trackable joints
JointCollection contains a set of Joints that can be accessed by specifying an index value
When you pass a JointType as the index, it returns the corresponding Joint object
Skeleton skeleton = (from trackskeleton in totalSkeleton
                     where trackskeleton.TrackingState == SkeletonTrackingState.Tracked
                     select trackskeleton).FirstOrDefault();
Joint headJoint = skeleton.Joints[JointType.Head];

Joint
The Joint structure has three properties
JointType: the type of joint (one of 20 joint types)
Position: of SkeletonPoint type (X, Y, Z)
TrackingState: NotTracked, Tracked, or Inferred
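Illustrative sketch (not from the original slide): walking the joint collection to count tracked versus inferred joints and read a joint position, assuming a tracked Skeleton object as obtained on the previous slide.
private void ReportJointStates(Skeleton skeleton)
{
    int tracked = 0, inferred = 0;
    foreach (Joint joint in skeleton.Joints)
    {
        if (joint.TrackingState == JointTrackingState.Tracked) tracked++;
        else if (joint.TrackingState == JointTrackingState.Inferred) inferred++;
    }
    // Position is a SkeletonPoint with X, Y, Z in meters (skeleton space)
    SkeletonPoint head = skeleton.Joints[JointType.Head].Position;
    Console.WriteLine("Tracked: {0}, Inferred: {1}, Head at ({2:F2}, {3:F2}, {4:F2})",
        tracked, inferred, head.X, head.Y, head.Z);
}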

Bones: Connecting Joints
Bones are the visual representation between joints
The complete hierarchy of a skeleton is composed of a series of bones
Each bone in a skeleton hierarchy has a parent joint and a child joint
Every joint can be both a parent and a child joint unless it is a leaf joint
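Illustrative sketch (not the approach the DrawSkeleton slides below take): the parent/child structure can be captured as an array of joint pairs and drawn in a loop, reusing the drawBone helper shown later. Only a few upper-body pairs are listed here; the full default skeleton has 19 bones.
private static readonly JointType[,] bones = new JointType[,]
{
    { JointType.Head, JointType.ShoulderCenter },
    { JointType.ShoulderCenter, JointType.ShoulderLeft },
    { JointType.ShoulderCenter, JointType.ShoulderRight },
    { JointType.ShoulderCenter, JointType.Spine },
    { JointType.Spine, JointType.HipCenter }
};

private void DrawBones(Skeleton skeleton)
{
    // Each row is one bone: column 0 is the parent joint, column 1 is the child joint
    for (int i = 0; i < bones.GetLength(0); i++)
        drawBone(skeleton.Joints[bones[i, 0]], skeleton.Joints[bones[i, 1]]);
}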

Bone Sequence for a Default Skeleton
Root joint: the highest joint in the hierarchy; each skeleton has one root joint
For a default skeleton, Hip Center is the root joint
We can move and orient the entire skeleton in skeleton space by translating and rotating the root joint

Bone Sequence for a Seated Skeleton
For a seated skeleton, Shoulder Center is the root joint
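Seated tracking is selected through the SkeletonStream's TrackingMode property. The TrackingSkeleton app below keeps the default mode, but switching looks like this:
// Track only the 10 upper-body joints (seated scenarios)
this.sensor.SkeletonStream.TrackingMode = SkeletonTrackingMode.Seated;

// Back to the default full-body (standing) tracking mode
this.sensor.SkeletonStream.TrackingMode = SkeletonTrackingMode.Default;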

Drawing Bones Between Joints
Bones are the visual representation between joints and can be represented by a line or any other object
void drawBone(Joint trackedJoint1, Joint trackedJoint2)
{
    Line skeletonBone = new Line();
    skeletonBone.Stroke = Brushes.Black;
    skeletonBone.StrokeThickness = 3;
    Point joint1 = this.ScalePosition(trackedJoint1.Position);
    skeletonBone.X1 = joint1.X;
    skeletonBone.Y1 = joint1.Y;
    Point joint2 = this.ScalePosition(trackedJoint2.Position);
    skeletonBone.X2 = joint2.X;
    skeletonBone.Y2 = joint2.Y;
    myCanvas.Children.Add(skeletonBone);
}

ClippedEdges
ClippedEdges (of type FrameEdges) describes which parts of the skeleton are out of the Kinect's view: None, Right, Left, Top, Bottom
It can be used to give the user feedback on their position, as in the sketch below
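Illustrative sketch (not from the original slide): FrameEdges is a flags enum, so several edges can be clipped at once; the textBlock1 control here is the one added to the TrackingSkeleton GUI below.
private void CheckClippedEdges(Skeleton skeleton)
{
    if (skeleton.ClippedEdges == FrameEdges.None) return;
    if (skeleton.ClippedEdges.HasFlag(FrameEdges.Bottom))
        textBlock1.Text = "Lower body is cut off - step back from the sensor";
    else if (skeleton.ClippedEdges.HasFlag(FrameEdges.Top))
        textBlock1.Text = "Head is cut off - step back from the sensor";
    else if (skeleton.ClippedEdges.HasFlag(FrameEdges.Left))
        textBlock1.Text = "Body is cut off at the left edge of the view";
    else if (skeleton.ClippedEdges.HasFlag(FrameEdges.Right))
        textBlock1.Text = "Body is cut off at the right edge of the view";
}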

Build TrackingSkeleton App
Create a new C# WPF project with the name TrackingSkeleton
Add the Microsoft.Kinect reference
Design the GUI
Register the WindowLoaded() method in the XAML file
Add the code

GUI Design
Image control
Canvas
TextBlock

Adding Code
Add member variables
In WindowLoaded(): enable both the ColorImageStream and the SkeletonStream
Register event handlers for both the ColorFrameReady and SkeletonFrameReady events
Make sure to use the default skeleton tracking mode (not seated mode!)
KinectSensor sensor;
Skeleton[] totalSkeleton = new Skeleton[6];
WriteableBitmap colorBitmap;
byte[] colorPixels;
Skeleton skeleton;
int currentSkeletonID = 0;
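A sketch of what WindowLoaded() might look like under these requirements (the Image control name image1 and the color handler name colorFrameReady are assumptions; the skeleton handler is the one on the next slide):
private void WindowLoaded(object sender, RoutedEventArgs e)
{
    // Pick the first connected sensor, if any
    this.sensor = KinectSensor.KinectSensors.FirstOrDefault(s => s.Status == KinectStatus.Connected);
    if (this.sensor == null) return;

    // Color stream for the background image
    this.sensor.ColorStream.Enable(ColorImageFormat.RgbResolution640x480Fps30);
    this.colorPixels = new byte[this.sensor.ColorStream.FramePixelDataLength];
    this.colorBitmap = new WriteableBitmap(this.sensor.ColorStream.FrameWidth,
        this.sensor.ColorStream.FrameHeight, 96.0, 96.0, PixelFormats.Bgr32, null);
    image1.Source = this.colorBitmap;

    // Skeleton stream in the default (standing) tracking mode
    this.sensor.SkeletonStream.Enable();

    this.sensor.ColorFrameReady += this.colorFrameReady;
    this.sensor.SkeletonFrameReady += this.skeletonFrameReady;
    this.sensor.Start();
}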

Add Event Handler for Skeleton Frames
void skeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    canvas1.Children.Clear();
    using (SkeletonFrame skeletonFrame = e.OpenSkeletonFrame())
    {
        if (skeletonFrame == null) { return; }
        skeletonFrame.CopySkeletonDataTo(totalSkeleton);
        skeleton = (from trackskeleton in totalSkeleton
                    where trackskeleton.TrackingState == SkeletonTrackingState.Tracked
                    select trackskeleton).FirstOrDefault();
        if (skeleton == null) return;
        if (this.currentSkeletonID != skeleton.TrackingId)
        {
            this.currentSkeletonID = skeleton.TrackingId;
            int totalTrackedJoints = skeleton.Joints.Where(item => item.TrackingState == JointTrackingState.Tracked).Count();
            string trackedTime = DateTime.Now.ToString("hh:mm:ss");
            string status = "Skeleton Id: " + this.currentSkeletonID + ", total tracked joints: " + totalTrackedJoints + ", TrackTime: " + trackedTime + "\n";
            this.textBlock1.Text += status;
        }
        DrawSkeleton(skeleton);
    }
}

Adding Code
private void DrawSkeleton(Skeleton skeleton)
{
    drawBone(skeleton.Joints[JointType.Head], skeleton.Joints[JointType.ShoulderCenter]);
    drawBone(skeleton.Joints[JointType.ShoulderCenter], skeleton.Joints[JointType.Spine]);
    drawBone(skeleton.Joints[JointType.ShoulderCenter], skeleton.Joints[JointType.ShoulderLeft]);
    drawBone(skeleton.Joints[JointType.ShoulderLeft], skeleton.Joints[JointType.ElbowLeft]);
    drawBone(skeleton.Joints[JointType.ElbowLeft], skeleton.Joints[JointType.WristLeft]);
    drawBone(skeleton.Joints[JointType.WristLeft], skeleton.Joints[JointType.HandLeft]);
    drawBone(skeleton.Joints[JointType.ShoulderCenter], skeleton.Joints[JointType.ShoulderRight]);
    drawBone(skeleton.Joints[JointType.ShoulderRight], skeleton.Joints[JointType.ElbowRight]);
    drawBone(skeleton.Joints[JointType.ElbowRight], skeleton.Joints[JointType.WristRight]);
    drawBone(skeleton.Joints[JointType.WristRight], skeleton.Joints[JointType.HandRight]);
    // more on next slide
}

Adding Code
private void DrawSkeleton(Skeleton skeleton)
{
    // continued from the previous slide
    drawBone(skeleton.Joints[JointType.Spine], skeleton.Joints[JointType.HipCenter]);
    drawBone(skeleton.Joints[JointType.HipCenter], skeleton.Joints[JointType.HipLeft]);
    drawBone(skeleton.Joints[JointType.HipLeft], skeleton.Joints[JointType.KneeLeft]);
    drawBone(skeleton.Joints[JointType.KneeLeft], skeleton.Joints[JointType.AnkleLeft]);
    drawBone(skeleton.Joints[JointType.AnkleLeft], skeleton.Joints[JointType.FootLeft]);
    drawBone(skeleton.Joints[JointType.HipCenter], skeleton.Joints[JointType.HipRight]);
    drawBone(skeleton.Joints[JointType.HipRight], skeleton.Joints[JointType.KneeRight]);
    drawBone(skeleton.Joints[JointType.KneeRight], skeleton.Joints[JointType.AnkleRight]);
    drawBone(skeleton.Joints[JointType.AnkleRight], skeleton.Joints[JointType.FootRight]);
}
private Point ScalePosition(SkeletonPoint skeletonPoint)
{
    DepthImagePoint depthPoint = this.sensor.CoordinateMapper.MapSkeletonPointToDepthPoint(skeletonPoint, DepthImageFormat.Resolution320x240Fps30);
    return new Point(depthPoint.X, depthPoint.Y);
}

Adding Code
void drawBone(Joint trackedJoint1, Joint trackedJoint2)
{
    Line bone = new Line();
    bone.Stroke = Brushes.Red;
    bone.StrokeThickness = 3;
    Point joint1 = this.ScalePosition(trackedJoint1.Position);
    bone.X1 = joint1.X;
    bone.Y1 = joint1.Y;
    Point joint2 = this.ScalePosition(trackedJoint2.Position);
    bone.X2 = joint2.X;
    bone.Y2 = joint2.Y;
    canvas1.Children.Add(bone);
}
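ScalePosition maps joints into 320x240 depth space. If the bones should line up with the 640x480 color image shown in the Image control, an alternative sketch (not what the slides above do) is to map into color space instead:
private Point ScaleToColor(SkeletonPoint skeletonPoint)
{
    // Map the skeleton-space point into color image coordinates
    ColorImagePoint colorPoint = this.sensor.CoordinateMapper.MapSkeletonPointToColorPoint(
        skeletonPoint, ColorImageFormat.RgbResolution640x480Fps30);
    return new Point(colorPoint.X, colorPoint.Y);
}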

Challenge Task
For advanced students: measure the angle formed between the left/right arm and the torso
Draw an arc between the arm and the torso
Display the angle value in a TextBox on top of the arc
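A sketch of the angle computation for the challenge (the joint choices for the "arm" and "torso" vectors are one reasonable interpretation, not a prescribed solution): take the shoulder-to-elbow vector and the shoulder-to-hip vector and apply the dot product.
private double AngleBetweenArmAndTorso(Skeleton skeleton, JointType shoulder, JointType elbow, JointType hip)
{
    SkeletonPoint s = skeleton.Joints[shoulder].Position;
    SkeletonPoint e = skeleton.Joints[elbow].Position;
    SkeletonPoint h = skeleton.Joints[hip].Position;

    // Arm vector: shoulder -> elbow; torso vector: shoulder -> hip
    double ax = e.X - s.X, ay = e.Y - s.Y, az = e.Z - s.Z;
    double tx = h.X - s.X, ty = h.Y - s.Y, tz = h.Z - s.Z;

    double dot = ax * tx + ay * ty + az * tz;
    double mag = Math.Sqrt(ax * ax + ay * ay + az * az) * Math.Sqrt(tx * tx + ty * ty + tz * tz);
    if (mag < 1e-6) return 0.0;

    return Math.Acos(dot / mag) * 180.0 / Math.PI; // degrees
}
For example: double leftAngle = AngleBetweenArmAndTorso(skeleton, JointType.ShoulderLeft, JointType.ElbowLeft, JointType.HipLeft);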

Helper Code for Advanced Task (https://dynamicdatadisplay.codeplex.com)
private void AddCircularArcGraph(Point startPoint, Point endPoint, Size size)
{
    PathFigure pf = new PathFigure();
    pf.StartPoint = new Point(startPoint.X, startPoint.Y);
    ArcSegment arcSegment = new ArcSegment();
    arcSegment.Point = new Point(endPoint.X, endPoint.Y);
    arcSegment.Size = size;
    arcSegment.SweepDirection = SweepDirection.Counterclockwise;
    PathSegmentCollection psc = new PathSegmentCollection();
    psc.Add(arcSegment);
    pf.Segments = psc;
    PathFigureCollection pfc = new PathFigureCollection();
    pfc.Add(pf);
    PathGeometry pg = new PathGeometry();
    pg.Figures = pfc;
    var path = new System.Windows.Shapes.Path();
    path.Stroke = Brushes.Black;
    path.StrokeThickness = 1;
    path.Data = pg;
    canvas1.Children.Add(path);
}
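A possible call site (the joint choices and arc radius here are hypothetical): draw the arc between the scaled elbow and hip points using the ScalePosition helper from the earlier slides.
Point elbow = this.ScalePosition(skeleton.Joints[JointType.ElbowLeft].Position);
Point hip = this.ScalePosition(skeleton.Joints[JointType.HipLeft].Position);
this.AddCircularArcGraph(elbow, hip, new Size(40, 40));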