Lecture 6: Ogre and Kinect
EIE360 Integrated Project, Department of Electronic and Information Engineering, by Dr Daniel Lun

References:
1. J. M. Hart, Windows System Programming, 4th Ed., Addison-Wesley, 2010, Ch. 12
2. Microsoft Kinect SDK for Developers, http://kinectforwindows.org/
3. Kinect, Wikipedia
Architecture of the Interactive Virtual Aquarium System
[Diagram: the Kinect sensor device is connected to Computer A through a USB port; your program on Computer A communicates over the network with your program on Computer B, which drives the 3D graphics system.]
Kinect for Xbox 360 Sensor
Kinect, originally known by the code name Project Natal, is a motion sensing input device made by Microsoft for the Xbox 360 video game console.
It enables users to control and interact with the Xbox 360 without touching a game controller, through a natural user interface (NUI) based on gestures and spoken commands.
It was launched in North America on November 4, 2010.
After selling a total of 8 million units in its first 60 days, Kinect holds the Guinness World Record as the "fastest selling consumer electronics device"!
Kinect for Xbox 360 Sensor
Main components of the sensor:
- IR illuminator and depth image sensor: measure the distance of objects from the sensor, and hence allow motion capture
- RGB camera: allows facial recognition
- Multi-array microphone: allows sound source tracking and facilitates voice recognition
- Motorized tilt
Kinect for Xbox 360 Sensor
Some technical data of the Kinect sensor:
- RGB video: 8-bit VGA resolution (640x480) at 30 FPS
- Depth sensing video stream: resolutions 640x480, 320x240 and 80x60, with an 11-bit dynamic range, hence 2048 levels of sensitivity when measuring depth
- Distance limit when used with the Xbox software: 4 - 11 ft
- Angular field of view: 57° horizontally and 43° vertically
- Since it is an optical sensor, it suffers from line-of-sight problems (e.g. occluded body parts cannot be measured)
Kinect for Windows SDK
A non-commercial Kinect software development kit (SDK) for Windows 7 was released in June 2011.
The SDK includes Windows 7 compatible PC drivers for the Kinect device.
It provides Kinect capabilities to developers so that they can build applications with C++, C#, or Visual Basic using Microsoft Visual Studio 2010.
It includes the following features: raw sensor streams, skeletal tracking, advanced audio capabilities, and sample code and documentation.
Natural User Interface (NUI) API
The NUI API is the core of the Kinect for Windows API. It supports fundamental image and device management features, including the following:
- Access to the Kinect sensors connected to the computer
- Access to the image and depth data streams from the Kinect image sensors
- Delivery of a processed version of the image and depth data to support skeletal tracking
The sensor provides three data streams: an image stream, a depth stream and an audio stream.
NUI Skeleton Tracking
The NUI Skeleton API provides information about the location of up to two players standing in front of the Kinect sensor array, with detailed position and orientation information.
The data is provided to application code as a set of points, called skeleton positions, that compose a skeleton.
Twenty skeleton positions have been identified, indicating the major joints of the human body.
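As an illustration of how an application obtains these skeleton positions, the sketch below polls one skeleton frame with the C++ NUI calls of the Kinect for Windows SDK 1.x (NuiInitialize, NuiSkeletonTrackingEnable, NuiSkeletonGetNextFrame). This is a minimal sketch, assuming the SDK headers and libraries are installed; the Skeletal Viewer sample discussed later is built on the same API, though its code is organised differently.

    // Minimal sketch: poll one skeleton frame with the Kinect for Windows SDK 1.x C++ API.
    // Assumes the SDK is installed so that <NuiApi.h> and Kinect10.lib are available.
    #include <windows.h>
    #include <NuiApi.h>
    #include <cstdio>

    int main()
    {
        // Initialise the NUI runtime for skeleton tracking only
        if (FAILED(NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON)))
            return -1;

        if (FAILED(NuiSkeletonTrackingEnable(NULL, 0)))      // no event handle, default flags
        {
            NuiShutdown();
            return -1;
        }

        NUI_SKELETON_FRAME frame = {0};
        if (SUCCEEDED(NuiSkeletonGetNextFrame(100, &frame))) // wait up to 100 ms for a frame
        {
            for (int i = 0; i < NUI_SKELETON_COUNT; i++)
            {
                if (frame.SkeletonData[i].eTrackingState == NUI_SKELETON_TRACKED)
                {
                    // Each tracked skeleton carries 20 joint positions as Vector4 (x, y, z, w)
                    Vector4 head = frame.SkeletonData[i].SkeletonPositions[NUI_SKELETON_POSITION_HEAD];
                    printf("Skeleton %d head at (%.2f, %.2f, %.2f) m\n", i, head.x, head.y, head.z);
                }
            }
        }

        NuiShutdown();
        return 0;
    }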
Skeleton Positions
1. NUI_SKELETON_POSITION_HIP_CENTER
2. NUI_SKELETON_POSITION_SPINE
3. NUI_SKELETON_POSITION_SHOULDER_CENTER
4. NUI_SKELETON_POSITION_HEAD
5. NUI_SKELETON_POSITION_SHOULDER_LEFT
6. NUI_SKELETON_POSITION_ELBOW_LEFT
7. NUI_SKELETON_POSITION_WRIST_LEFT
8. NUI_SKELETON_POSITION_HAND_LEFT
9. NUI_SKELETON_POSITION_SHOULDER_RIGHT
10. NUI_SKELETON_POSITION_ELBOW_RIGHT
11. NUI_SKELETON_POSITION_WRIST_RIGHT
12. NUI_SKELETON_POSITION_HAND_RIGHT
13. NUI_SKELETON_POSITION_HIP_LEFT
14. NUI_SKELETON_POSITION_KNEE_LEFT
15. NUI_SKELETON_POSITION_ANKLE_LEFT
16. NUI_SKELETON_POSITION_FOOT_LEFT
17. NUI_SKELETON_POSITION_HIP_RIGHT
18. NUI_SKELETON_POSITION_KNEE_RIGHT
19. NUI_SKELETON_POSITION_ANKLE_RIGHT
20. NUI_SKELETON_POSITION_FOOT_RIGHT
Skeletal Viewer
Skeletal Viewer is one of the sample applications in the Kinect for Windows SDK.
It provides both video outputs of the Kinect sensor, RGB and depth, plus the skeleton of the player constructed from the detected skeleton positions.
Motion Tracking Server and Skeleton Viewer
The Motion Tracking Server of our project is built based on Skeletal Viewer.
The application runs on the server, with the skeleton positions extracted and sent to the client program via Winsock.
[Diagram: the SkeletalViewer code and the server program (Winsock) together form the Motion Tracking Server, which is connected to the network backbone.]
Motion Tracking Server
Software architecture of the Motion Tracking Server:
- CSkeletalViewerApp: the SkeletalViewer part of the application. It exposes the Kinect data through the members
  bool mSkeletonExist[2];
  char mMessage[2][160];
  char mMessage3D[2][320];
- ServerSocket: the Winsock server program. It refers to the CSkeletalViewerApp object through the pointer mSkeletalViewerApp, and provides the methods ListenOnPort(), AcceptConnection(), ProcessClient() and CloseConnection().
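The relationship between the two parts can be pictured with the following C++ outline. It is only a sketch of how the pieces fit together under the naming used on this slide (ServerSocket, mSkeletalViewerApp and the four methods); the return types and the exact declarations in the actual project code may differ.

    // Hedged sketch of the Motion Tracking Server structure described above.
    // Names follow the slide; the real project files may declare them differently.
    #include <winsock2.h>

    class CSkeletalViewerApp            // the SkeletalViewer side
    {
    public:
        bool mSkeletonExist[2];         // true if the skeleton of player 0/1 is detected
        char mMessage[2][160];          // 20 POINTs (2D screen coordinates) per player
        char mMessage3D[2][320];        // 20 Vector4s (3D coordinates) per player
    };

    class ServerSocket                  // the Winsock server side
    {
    public:
        bool ListenOnPort(int port);    // create a listening socket on the given port
        bool AcceptConnection();        // wait for a client and obtain mClientSocket
        void ProcessClient();           // pack the Kinect data and send() it to the client
        void CloseConnection();         // close the sockets

    private:
        CSkeletalViewerApp* mSkeletalViewerApp;   // link to the Kinect data
        SOCKET mClientSocket;                     // socket returned by accept()
    };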
Useful Parameters in CSkeletalViewerApp
bool mSkeletonExist[2];
- The Kinect sensor can detect at most 2 players at the same time.
- If the skeleton of a player is detected, the corresponding element of mSkeletonExist is set to true.
char mMessage[2][160];
- If the skeleton of a player is detected, this array keeps the information of the 20 skeleton positions.
- Each element is a POINT, which is a structure composed of 2 LONG integers (4 bytes each):
  struct POINT { LONG x; LONG y; };
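The 160-byte size of each row of mMessage follows directly from this layout: 20 skeleton positions times sizeof(POINT) = 8 bytes. A small compile-time check, purely as an illustration:

    #include <windows.h>   // defines POINT as { LONG x; LONG y; }

    // 20 skeleton positions, each stored as a POINT (two 4-byte LONGs),
    // exactly fill one 160-byte row of mMessage.
    static_assert(sizeof(POINT) == 8, "POINT is expected to be 8 bytes");
    static_assert(20 * sizeof(POINT) == 160, "20 POINTs fill 160 bytes");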
Useful Parameters in CSkeletalViewerApp (cont)
Each POINT in fact gives the (x, y) coordinates of a skeleton position in the display window of the Skeletal Viewer.
The window coordinates run from (0,0) at the top-left corner to (639,479) at the bottom-right corner of the 640x480 display.
Useful Parameters in CSkeletalViewerApp (cont)
char mMessage[2][160];
Layout of each 160-byte row (mMessage[0] for the 1st player, mMessage[1] for the 2nd player):
- Bytes 0-3 and 4-7: x and y of skeleton position 1 (NUI_SKELETON_POSITION_HIP_CENTER)
- Bytes 8-11 and 12-15: x and y of skeleton position 2
- ...
- Bytes 152-155 and 156-159: x and y of skeleton position 20 (NUI_SKELETON_POSITION_FOOT_RIGHT)
Useful Parameters in CSkeletalViewerApp (cont)
char mMessage3D[2][320];
- If the skeleton of a player is detected, this array keeps the 3D coordinates of the 20 skeleton positions of the player in the 3D space, with reference to the sensor.
- Each element of the array is a Vector4 composed of 4 float numbers (4 bytes each):
  struct Vector4 { float x; float y; float z; float w; };
Using a 4-D vector rather than a 3-D vector facilitates efficient matrix operations, e.g. translating a vector by a single matrix multiplication.
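To see why the extra w component helps, here is the standard homogeneous-coordinate form of a translation (a worked example added for clarity, not taken from the slides): with w = 1, a single 4x4 matrix multiplication shifts a point by (t_x, t_y, t_z), something no 3x3 matrix can do to a 3-D vector.

    \begin{pmatrix} 1 & 0 & 0 & t_x \\ 0 & 1 & 0 & t_y \\ 0 & 0 & 1 & t_z \\ 0 & 0 & 0 & 1 \end{pmatrix}
    \begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}
    =
    \begin{pmatrix} x + t_x \\ y + t_y \\ z + t_z \\ 1 \end{pmatrix}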
Useful Parameters in CSkeletalViewerApp (cont)
Among the 4 elements, w can be ignored in this project.
[Diagram: the coordinate system has its origin (0,0,0,1) at the Kinect sensor, with x, y and z axes defined relative to the sensor direction.]
Useful Parameters in CSkeletalViewerApp (cont)
char mMessage3D[2][320];
Layout of each 320-byte row (mMessage3D[0] for the 1st player, mMessage3D[1] for the 2nd player):
- Bytes 0-3, 4-7, 8-11 and 12-15: x, y, z and w of skeleton position 1 (NUI_SKELETON_POSITION_HIP_CENTER)
- ...
- Bytes 304-307, 308-311, 312-315 and 316-319: x, y, z and w of skeleton position 20 (NUI_SKELETON_POSITION_FOOT_RIGHT)
Sending the Kinect Data to the Client
Assume that a connection has been made with a client. We can send the Kinect data to the client using send().
Before that, the array data need to be packed into a single character buffer for send(). A simple way to copy between types is to use memcpy().
The buffer layout (962 bytes in total):
- Byte 0: mSkeletonExist[0]
- Bytes 1-160: mMessage[0]
- Bytes 161-480: mMessage3D[0]
- Byte 481: mSkeletonExist[1]
- Bytes 482-641: mMessage[1]
- Bytes 642-961: mMessage3D[1]
Sending the Kinect Data to the Client (cont)

    char bufferSkel[1024];
    char *bufferPtr = bufferSkel;
    for (int i = 0; i < 2; i++)
    {
        // Low-level byte-by-byte memory copy without considering the type
        memcpy(bufferPtr, &(mSkeletalViewerApp->mSkeletonExist[i]), sizeof(bool));
        bufferPtr += sizeof(bool);
        memcpy(bufferPtr, &(mSkeletalViewerApp->mMessage[i]), 160);
        bufferPtr += 160;
        memcpy(bufferPtr, &(mSkeletalViewerApp->mMessage3D[i]), 320);
        bufferPtr += 320;
    }
    // mClientSocket is the socket created by accept()
    rVal = send(mClientSocket, bufferSkel, 962, 0);
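send() may fail or, in principle, accept fewer bytes than requested, so the return value rVal should be checked. Below is a hedged sketch of a safer send loop; the helper name sendAll and its use here are illustrative additions, not part of the project code.

    // Illustrative helper: keep calling send() until all 'len' bytes have gone out
    // or an error occurs. Returns false on failure.
    #include <winsock2.h>

    bool sendAll(SOCKET s, const char *data, int len)
    {
        int total = 0;
        while (total < len)
        {
            int sent = send(s, data + total, len - total, 0);
            if (sent == SOCKET_ERROR)
                return false;          // caller can inspect WSAGetLastError()
            total += sent;
        }
        return true;
    }

    // Usage in place of the single send() call above:
    //   if (!sendAll(mClientSocket, bufferSkel, 962)) { /* handle the error */ }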
Architecture of the Interactive Virtual Aquarium System (recap of the diagram above). We now turn from the Kinect and server side on Computer A to the client side on Computer B.
Setting up a Winsock Client
In Lecture 5, the procedure for setting up a Winsock server was discussed. To communicate with the server, a Winsock client must also be set up. The procedure for setting up a Winsock client is much simpler:

    WSAStartup(...);            // 1. Initialize
    socket(...);                // 2. Create a client socket
    connect(...);               // 3. Connect to the server
    send(...); / recv(...);     // 4. Send or receive data
        :
    closesocket(...);           // 5. Close the socket after using it
    WSACleanup();               // 6. Free resources allocated
connect() and Its Parameters

    mSocket = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
    // If mSocket == INVALID_SOCKET, error

    struct sockaddr_in clientService;
    clientService.sin_family = AF_INET;
    clientService.sin_addr.s_addr = inet_addr("128.0.0.1");   // IP address of the server
    clientService.sin_port = htons(8888);                      // port number of the server to be connected to

    iResult = connect(mSocket, (SOCKADDR*) &clientService, sizeof(clientService));
    // If iResult == SOCKET_ERROR, error
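Putting the steps together, below is a minimal sketch of a complete client that connects to the Motion Tracking Server and receives one 962-byte skeleton packet. The IP address and port number are example values, and the loop around recv() reflects the fact that recv() on a TCP stream may return fewer bytes than requested; the project code may structure this differently.

    // Minimal Winsock client sketch (example IP/port; link with Ws2_32.lib).
    #include <winsock2.h>
    #include <cstdio>

    int main()
    {
        WSADATA wsaData;
        if (WSAStartup(MAKEWORD(2, 2), &wsaData) != 0)              // 1. Initialize
            return -1;

        SOCKET mSocket = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP); // 2. Create a client socket
        if (mSocket == INVALID_SOCKET) { WSACleanup(); return -1; }

        struct sockaddr_in clientService;
        clientService.sin_family = AF_INET;
        clientService.sin_addr.s_addr = inet_addr("128.0.0.1");     // server IP (example)
        clientService.sin_port = htons(8888);                       // server port (example)

        if (connect(mSocket, (SOCKADDR*)&clientService, sizeof(clientService)) == SOCKET_ERROR)
        {                                                           // 3. Connect to the server
            closesocket(mSocket); WSACleanup(); return -1;
        }

        char buffer[962];
        int total = 0;
        while (total < 962)                                         // 4. Receive one full packet
        {
            int got = recv(mSocket, buffer + total, 962 - total, 0);
            if (got <= 0) break;                                    // error or connection closed
            total += got;
        }
        printf("Received %d bytes of Kinect data\n", total);

        closesocket(mSocket);                                       // 5. Close the socket
        WSACleanup();                                               // 6. Free resources
        return 0;
    }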
Architecture of the Interactive Virtual Aquarium System (recap). With the network link in place, the remaining part is using the received Kinect data to drive the 3D graphics on Computer B.
Using the Received Kinect Data
Assume that the Kinect data are successfully received from the network. We should convert them back into arrays to ease further analysis.
The 962-byte packet is unpacked, following the same layout used by the server, into three arrays on the client: mSkeletonExist[] (1 byte per player), mSkeletonPoints[] (160 bytes of POINT data per player) and mSkeletonPositions[] (320 bytes of Vector4 data per player).
Using the Received Kinect Data (cont)

    char *bufferPtr = buffer;
    for (int i = 0; i < 2; i++)
    {
        memcpy(&mSkeletonExist[i], bufferPtr, sizeof(bool));
        bufferPtr += sizeof(bool);
        memcpy(&mSkeletonPoints[i], bufferPtr, 160);
        bufferPtr += 160;
        memcpy(mSkeletonPositions[i], bufferPtr, 320);
        bufferPtr += 320;
    }
Using the Received Kinect Data (cont)
We can compare the magnitude of different data to understand the motion of the player.
E.g. the following code allows us to detect whether the left hand of the 1st player is raised above his head. Index 8 refers to NUI_SKELETON_POSITION_HAND_LEFT, index 4 refers to NUI_SKELETON_POSITION_HEAD, and the y coordinate stands for height.

    if (mSkeletonPoints[0][8].y > mSkeletonPoints[0][4].y) { ... }
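In the same way we can work with the 3D coordinates kept in mSkeletonPositions. As a purely illustrative sketch (using the same indexing convention as the example above, with 8 for the left hand and 12 for the right hand, and assuming mSkeletonPositions is a 2 x 20 array of the Vector4 structure from slide 16), the following computes the 3D distance between the two hands, which could be used to detect a "hands together" gesture:

    #include <cmath>

    // Illustrative only: 3D distance between two skeleton positions (Vector4 as above).
    float jointDistance(const Vector4 &a, const Vector4 &b)
    {
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);      // in metres
    }

    // Usage (indices follow the slide's numbering: 8 = HAND_LEFT, 12 = HAND_RIGHT):
    //   float d = jointDistance(mSkeletonPositions[0][8], mSkeletonPositions[0][12]);
    //   if (d < 0.15f) { /* hands (almost) together; 0.15 m is an assumed threshold */ }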
Using the Received Kinect Data (cont)
We may also estimate the velocity of the motion, e.g. the velocity of the left hand in the y direction.
[Flowchart: initialization, update screen, finish updating screen; the calculation below is carried out during each screen update.]

    processCalculation()
    {
        Ogre::Real eTime = evt.timeSinceLastFrame;
        float currentY = mSkeletonPoints[0][8].y;
        Ogre::Real velocity = (currentY - mPreviousY) / eTime;
        mPreviousY = currentY;
    }

This may not be accurate, as it is only the instantaneous result between 2 frames; we may need to average the results across a few frames.
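One simple way to perform that averaging, shown purely as an illustration (the class name, the window size of 5 frames and the member name mSmoother are assumptions, not part of the project code), is a moving average over the last few instantaneous velocities:

    #include <cstddef>

    // Illustrative moving-average smoother: keeps the last N instantaneous
    // velocities and returns their mean. The window size of 5 is an assumption.
    class VelocitySmoother
    {
    public:
        VelocitySmoother() : mNext(0), mCount(0)
        {
            for (std::size_t i = 0; i < N; i++) mSamples[i] = 0.0f;
        }

        float update(float instantVelocity)
        {
            mSamples[mNext] = instantVelocity;              // store the newest sample
            mNext = (mNext + 1) % N;
            if (mCount < N) mCount++;

            float sum = 0.0f;                               // average over what we have so far
            for (std::size_t i = 0; i < mCount; i++) sum += mSamples[i];
            return sum / static_cast<float>(mCount);
        }

    private:
        static const std::size_t N = 5;
        float mSamples[N];
        std::size_t mNext;
        std::size_t mCount;
    };

    // Usage inside processCalculation():
    //   Ogre::Real smoothedVelocity = mSmoother.update(velocity);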