
1 Using the Kinect for fun and profit

2 About /me Tam HANNA – Director, Tamoggemon Holding k.s. – Runs web sites about mobile computing – Writes scientific books

3 Agenda Kinect – what is that? Streams Skeletons Facial tracking libfreenect OpenNI

4 Slide download http://www.tamoggemon.com/test/Codemotion-Kinect.ppt – URL is case sensitive

5 Kinect – what is that?

6 History - I Depth: PrimeSense technology –Not from Redmond First public mention: 2007 –Bill Gates, D3 conference –„Camera for game control“

7 Contrast detection Where does the shirt end?

8 Dot matrix

9 Shadows / dead areas

10 Shadows / dead areas - II

11 History - II 2008: Wii ships – best-selling console of its generation. 2009: E3 conference – announcement of „Project Natal“. 2010: dedicated processor dropped from the sensor – processing takes about 10% of the Xbox 360 CPU

12 History - III 4 November 2010 – first shipment – “We will sue anyone who reverse engineers“. June 2011 – official SDK

13 System overview

14 Kinect provides Video stream Depth stream – (IR stream) Accelerometer data. Everything else (skeletons, face data) is computed in software

15 Family tree Kinect for XBOX – normal USB. Kinect bundle – proprietary (non-standard) USB connector – needs a separate PSU. Kinect for Windows – costs more – licensed for commercial deployment

16 Cheap from China

17 Streams

18 Kinect provides „streams“ – repeatedly updated bitmaps. Push or pull access is possible (a sketch follows below) – attention: processing time!
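
A minimal sketch of both patterns, assuming the Kinect for Windows SDK 1.x (handler bodies and the 100 ms timeout are illustrative):

// Push: the sensor raises an event whenever a new frame is ready - keep the handler short.
mySensor.ColorFrameReady += (sender, e) =>
{
    using (ColorImageFrame frame = e.OpenColorImageFrame())
    {
        if (frame != null) { /* copy the pixel data out, return quickly */ }
    }
};

// Pull: the application asks for the next frame itself, waiting up to 100 ms.
using (ColorImageFrame frame = mySensor.ColorStream.OpenNextFrame(100))
{
    if (frame != null) { /* process the frame */ }
}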

19 Color stream Two modes –VGA@30fps –1280x960@12fps Simple data format –8 bits / component –R / G / B / A components
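
For showing the color stream in WPF, a minimal sketch in the style of the SDK samples, which wrap the raw buffer in a BitmapSource and treat the 32-bit pixels as Bgr32 (helper name is illustrative; needs System.Windows.Media.Imaging):

// Turn the raw 640x480 color buffer into a WPF bitmap.
BitmapSource ColorToBitmap(byte[] pixels)
{
    return BitmapSource.Create(640, 480, 96, 96,
        PixelFormats.Bgr32, null, pixels, 640 * 4);   // stride = width * 4 bytes per pixel
}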

20 Depth stream Two modes –Unlimited range –Reduced range, with player indexing

21 Depth stream - II 16-bit words. Special encoding in the player-indexing mode: depth in the upper 13 bits, player index in the lower 3 bits
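
A minimal sketch of unpacking one 16-bit depth word in the player-indexing mode, assuming SDK 1.x, which exposes the shift and mask as constants on DepthImageFrame (the index i is illustrative):

short raw = myArray[i];                                            // one 16-bit depth word
int depthInMm   = raw >> DepthImageFrame.PlayerIndexBitmaskWidth;  // upper 13 bits: distance in mm
int playerIndex = raw & DepthImageFrame.PlayerIndexBitmask;        // lower 3 bits: 0 = no player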

22 Depth stream - III

23 IR stream Delivered instead of the color data. 640x480@30fps, 16-bit words, IR data in the 10 most significant bits
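
A minimal sketch of reading one IR sample, assuming the 10 significant bits sit at the top of each 16-bit word (the variable irWord is illustrative):

ushort raw = irWord;                  // one 16-bit word from the IR stream
int intensity10  = raw >> 6;          // the full 10-bit intensity value
byte displayable = (byte)(raw >> 8);  // top 8 bits, convenient for a grayscale preview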

24 Finding the Kinect The SDK supports multiple sensors per PC – find one via the KinectSensorChooser from Microsoft.Kinect.Toolkit

25 XAML part
<Window x:Class="KinectWPFD2.MainWindow"
        xmlns:toolkit="clr-namespace:Microsoft.Kinect.Toolkit;assembly=Microsoft.Kinect.Toolkit"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="MainWindow" Height="759" Width="704">

26 Code - I
public partial class MainWindow : Window
{
    KinectSensor mySensor;
    KinectSensorChooser myChooser;

    public MainWindow()
    {
        InitializeComponent();
        myChooser = new KinectSensorChooser();
        myChooser.KinectChanged += new EventHandler<KinectChangedEventArgs>(myChooser_KinectChanged);
        this.SensorChooserUI.KinectSensorChooser = myChooser;
        myChooser.Start();
    }

27 Code - II
void myChooser_KinectChanged(object sender, KinectChangedEventArgs e)
{
    if (null != e.OldSensor)
    {
        if (mySensor != null)
        {
            mySensor.Dispose();
        }
    }
    if (null != e.NewSensor)
    {
        mySensor = e.NewSensor;

28 Initialize stream
mySensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);
mySensor.ColorStream.Enable(ColorImageFormat.RgbResolution640x480Fps30);
myArray = new short[this.mySensor.DepthStream.FramePixelDataLength];
myColorArray = new byte[this.mySensor.ColorStream.FramePixelDataLength];
mySensor.AllFramesReady += new EventHandler<AllFramesReadyEventArgs>(mySensor_AllFramesReady);
try
{
    this.mySensor.Start();
    SensorChooserUI.Visibility = Visibility.Hidden;
}
catch (System.IO.IOException)
{
    // the sensor could not be started (e.g. it is already in use elsewhere)
}

29 Process stream
void mySensor_AllFramesReady(object sender, AllFramesReadyEventArgs e)
{
    ColorImageFrame c = e.OpenColorImageFrame();
    DepthImageFrame d = e.OpenDepthImageFrame();
    if (c == null || d == null) return;
    c.CopyPixelDataTo(myColorArray);
    d.CopyPixelDataTo(myArray);
    // note: the frames are IDisposable and should be disposed once the data has been copied

30 Problem: calibration The depth and color sensors are not physically aligned – a position in the depth array does not match the same position in the color array

31 Solution CoordinateMapper class Maps between various frame types –Depth and Color –Skeleton and Color
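
Jumping ahead to the skeleton data covered below, a minimal sketch of the mapper in use, based on the MapSkeletonPointToColorPoint call shown on slide 37 (SDK 1.x); analogous calls exist for depth-to-color mapping:

// Project the head joint into the 640x480 color image so it can be drawn over the video.
SkeletonPoint headPos = aSkeleton.Joints[JointType.Head].Position;
ColorImagePoint headInColor = mySensor.CoordinateMapper.MapSkeletonPointToColorPoint(
    headPos, ColorImageFormat.RgbResolution640x480Fps30);
// headInColor.X / headInColor.Y are pixel coordinates in the color frame.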

32 On push mode The Kinect can push data to the application – the preferred mode of operation. But it is sensitive to processing time: if the handler takes too long, the application stops receiving frames

33 Skeletons

34 What is tracked? Joint positions in real-life coordinates (meters), mappable into the color image
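
A minimal sketch of what the per-joint data looks like, assuming SDK 1.x (aSkeleton is one element of the skeleton array introduced on the next slides):

if (aSkeleton.TrackingState == SkeletonTrackingState.Tracked)
{
    // Real-life coordinates: meters, relative to the sensor.
    SkeletonPoint head = aSkeleton.Joints[JointType.Head].Position;
    float x = head.X;   // meters, left/right
    float y = head.Y;   // meters, up/down
    float z = head.Z;   // meters, distance from the sensor
}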

35 Initialize stream
if (null != e.NewSensor)
{
    mySensor = e.NewSensor;
    mySensor.SkeletonStream.Enable();

36 Get joints
void mySensor_AllFramesReady(object sender, AllFramesReadyEventArgs e)
{
    ColorImageFrame c = e.OpenColorImageFrame();
    SkeletonFrame s = e.OpenSkeletonFrame();
    if (c == null || s == null) return;
    c.CopyPixelDataTo(myColorArray);
    s.CopySkeletonDataTo(mySkeletonArray);
    foreach (Skeleton aSkeleton in mySkeletonArray)
    {
        DrawBone(aSkeleton.Joints[JointType.HandLeft],
                 aSkeleton.Joints[JointType.WristLeft],
                 armPen, drawingContext);

37 Use joints
private void DrawBone(Joint jointFrom, Joint jointTo, Pen aPen, DrawingContext aContext)
{
    if (jointFrom.TrackingState == JointTrackingState.NotTracked ||
        jointTo.TrackingState == JointTrackingState.NotTracked)
    {
        return;   // nothing to draw
    }
    if (jointFrom.TrackingState == JointTrackingState.Inferred ||
        jointTo.TrackingState == JointTrackingState.Inferred)
    {
        ColorImagePoint p1 = mySensor.CoordinateMapper.MapSkeletonPointToColorPoint(
            jointFrom.Position, ColorImageFormat.RgbResolution640x480Fps30);
    }
    if (jointFrom.TrackingState == JointTrackingState.Tracked ||
        jointTo.TrackingState == JointTrackingState.Tracked)

38 Facial tracking

39 What is tracked - I

40 What is tracked - II

41 What is tracked - III

42 AUs? Based on research by Paul EKMAN – they quantify facial motion

43 Structure A C++ library contains the algorithms. A basic .NET wrapper is provided – incomplete – might change!

44 Initialize face tracker myFaceTracker = new FaceTracker(mySensor);

45 Feed face tracker
FaceTrackFrame myFrame = null;
foreach (Skeleton aSkeleton in mySkeletonArray)
{
    if (aSkeleton.TrackingState == SkeletonTrackingState.Tracked)
    {
        myFrame = myFaceTracker.Track(ColorImageFormat.RgbResolution640x480Fps30,
            myColorArray, DepthImageFormat.Resolution640x480Fps30, myArray, aSkeleton);
        if (myFrame.TrackSuccessful == true)
        {
            break;
        }
    }
}
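
Once a frame has been tracked successfully, the animation-unit coefficients can be read; a minimal sketch assuming the Microsoft.Kinect.Toolkit.FaceTracking wrapper (method and enum names as shipped with the Developer Toolkit; treat them as an assumption if your version differs):

if (myFrame != null && myFrame.TrackSuccessful)
{
    var aus = myFrame.GetAnimationUnitCoefficients();
    float jawLower   = aus[AnimationUnit.JawLower];     // roughly -1 .. +1
    float browRaiser = aus[AnimationUnit.BrowRaiser];
    // use the coefficients to drive an avatar or to detect expressions
}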

46 Calibration OUCH! – not all faces are alike – the maximum values vary from person to person

47 libfreenect

48 What is it Result of Kinect hacking competition Bundled with most Linux distributions „Basic Kinect data parser“

49 Set-up
/etc/udev/rules.d/66-kinect.rules
# Rules for Kinect ##################################################
SYSFS{idVendor}=="045e", SYSFS{idProduct}=="02ae", MODE="0660", GROUP="video"
SYSFS{idVendor}=="045e", SYSFS{idProduct}=="02ad", MODE="0660", GROUP="video"
SYSFS{idVendor}=="045e", SYSFS{idProduct}=="02b0", MODE="0660", GROUP="video"
# END ###############################################################

50 Set-up II
sudo adduser $USER plugdev
sudo usermod -a -G video tamhan

tamhan@tamhan-X360:~$ freenect-glview
Kinect camera test
Number of devices found: 1
Could not claim interface on camera: -6
Could not open device

51 Set-up III

52 Problems gspca-kinect – a kernel module that uses the Kinect as a webcam – blocks other libraries – remove it with: sudo modprobe -r gspca_kinect. An outdated libfreenect version is widely deployed – its API is not compatible

53 Update library (needs sudo)
sudo add-apt-repository ppa:floe/libtisch
sudo apt-get update
sudo apt-get install libfreenect libfreenect-dev libfreenect-demos

54 libfreenect - II color stream

55 Implementing it libfreenect: C++ library Question: which framework Answer: Qt ( what else ;) )

56 The .pro file
QT += core gui
TARGET = QtDepthFrame
CONFIG += i386
DEFINES += USE_FREENECT
LIBS += -lfreenect

57 The freenect thread The library needs processing time – it does not multithread itself. Processing should happen outside of the main (GUI) thread

58
class QFreenectThread : public QThread
{
    Q_OBJECT
public:
    explicit QFreenectThread(QObject *parent = 0);
    void run();

signals:

public slots:

public:
    bool myActive;
    freenect_context *myContext;
};

59
QFreenectThread::QFreenectThread(QObject *parent) :
    QThread(parent)
{
}

void QFreenectThread::run()
{
    while(myActive)
    {
        if(freenect_process_events(myContext) < 0)
        {
            qDebug("Cannot process events!");
            QApplication::exit(1);
        }
    }
}

60 QFreenect Main engine module –Contact point between Kinect and app Fires off signals on frame availability

61
class QFreenect : public QObject
{
    Q_OBJECT
public:
    explicit QFreenect(QObject *parent = 0);
    ~QFreenect();
    void processVideo(void *myVideo, uint32_t myTimestamp=0);
    void processDepth(void *myDepth, uint32_t myTimestamp=0);

signals:
    void videoDataReady(uint8_t* myRGBBuffer);
    void depthDataReady(uint16_t* myDepthBuffer);

public slots:

62
private:
    freenect_context *myContext;
    freenect_device *myDevice;
    QFreenectThread *myWorker;
    uint8_t* myRGBBuffer;
    uint16_t* myDepthBuffer;
    QMutex* myMutex;

public:
    bool myWantDataFlag;
    bool myFlagFrameTaken;
    bool myFlagDFrameTaken;
    static QFreenect* mySelf;
};

63 Some C++
QFreenect* QFreenect::mySelf;

static inline void videoCallback(freenect_device *myDevice, void *myVideo, uint32_t myTimestamp=0)
{
    QFreenect::mySelf->processVideo(myVideo, myTimestamp);
}

static inline void depthCallback(freenect_device *myDevice, void *myVideo, uint32_t myTimestamp=0)
{
    QFreenect::mySelf->processDepth(myVideo, myTimestamp);
}

64 Bring-up
QFreenect::QFreenect(QObject *parent) :
    QObject(parent)
{
    myMutex=NULL;
    myRGBBuffer=NULL;
    myMutex=new QMutex();
    myWantDataFlag=false;
    myFlagFrameTaken=true;
    mySelf=this;
    if (freenect_init(&myContext, NULL) < 0)
    {
        qDebug("init failed");
        QApplication::exit(1);
    }

65 Bring-up – II
    freenect_set_log_level(myContext, FREENECT_LOG_FATAL);
    int nr_devices = freenect_num_devices(myContext);
    if (nr_devices < 1)
    {
        freenect_shutdown(myContext);
        qDebug("No Kinect found!");
        QApplication::exit(1);
    }
    if (freenect_open_device(myContext, &myDevice, 0) < 0)
    {
        qDebug("Open Device Failed!");
        freenect_shutdown(myContext);
        QApplication::exit(1);
    }

66
    myRGBBuffer = (uint8_t*)malloc(640*480*3);
    freenect_set_video_callback(myDevice, videoCallback);
    freenect_set_video_buffer(myDevice, myRGBBuffer);
    freenect_frame_mode vFrame = freenect_find_video_mode(FREENECT_RESOLUTION_MEDIUM, FREENECT_VIDEO_RGB);
    freenect_set_video_mode(myDevice, vFrame);
    freenect_start_video(myDevice);

67
    myWorker=new QFreenectThread(this);
    myWorker->myActive=true;
    myWorker->myContext=myContext;
    myWorker->start();
}

68 Shut-Down
QFreenect::~QFreenect()
{
    freenect_close_device(myDevice);
    freenect_shutdown(myContext);
    if(myRGBBuffer!=NULL) free(myRGBBuffer);
    if(myMutex!=NULL) delete myMutex;
}

69 Data passing
void QFreenect::processVideo(void *myVideo, uint32_t myTimestamp)
{
    QMutexLocker locker(myMutex);
    if(myWantDataFlag && myFlagFrameTaken)
    {
        uint8_t* mySecondBuffer=(uint8_t*)malloc(640*480*3);
        memcpy(mySecondBuffer, myVideo, 640*480*3);
        myFlagFrameTaken=false;
        emit videoDataReady(mySecondBuffer);
    }
}

70 Format of data word Array of bytes Three bytes = one pixel

71 Format of data word - II
for(int x=2; x<640; x++)
{
    for(int y=0; y<480; y++)
    {
        r=(myRGBBuffer[3*(x+y*640)+0]);
        g=(myRGBBuffer[3*(x+y*640)+1]);
        b=(myRGBBuffer[3*(x+y*640)+2]);
        myVideoImage->setPixel(x, y, qRgb(r, g, b));
    }
}

72 libfreenect - III depth stream

73 Extra bring-up
    myDepthBuffer = (uint16_t*)malloc(640*480*2);
    freenect_set_depth_callback(myDevice, depthCallback);
    freenect_set_depth_buffer(myDevice, myDepthBuffer);
    freenect_frame_mode aFrame = freenect_find_depth_mode(FREENECT_RESOLUTION_MEDIUM, FREENECT_DEPTH_REGISTERED);
    freenect_set_depth_mode(myDevice, aFrame);
    freenect_start_depth(myDevice);

74 Extra processing
void QFreenect::processDepth(void *myDepth, uint32_t myTimestamp)
{
    QMutexLocker locker(myMutex);
    if(myWantDataFlag && myFlagDFrameTaken)
    {
        uint16_t* mySecondBuffer=(uint16_t*)malloc(640*480*2);
        memcpy(mySecondBuffer, myDepth, 640*480*2);
        myFlagDFrameTaken=false;
        emit depthDataReady(mySecondBuffer);
    }
}

75 Data extraction
void MainWindow::depthDataReady(uint16_t* myDepthBuffer)
{
    if(myDepthImage!=NULL) delete myDepthImage;
    myDepthImage = new QImage(640, 480, QImage::Format_RGB32);
    unsigned char r, g, b;
    for(int x=2; x<640; x++)
    {
        for(int y=0; y<480; y++)
        {
            int calcval=(myDepthBuffer[(x+y*640)]);

76 Data is in millimeters
            if(calcval==FREENECT_DEPTH_MM_NO_VALUE)
            {
                r=255; g=0; b=0;
            }
            else if(calcval>1000 && calcval<2000)   // between 1 m and 2 m away
            {
                QRgb aVal=myVideoImage->pixel(x, y);
                r=qRed(aVal); g=qGreen(aVal); b=qBlue(aVal);
            }
            else
            {
                r=0; g=0; b=0;
            }
            myDepthImage->setPixel(x, y, qRgb(r, g, b));

77 Example

78 OpenNI

79 What is OpenNI? Open standard for Natural Interfaces – very Asus-centric. Provides a generic NI framework. VERY complex API

80 Version 1.5 vs Version 2.0

81 Supported platforms Linux Windows –32bit only

82 Want more? Book –German language –30 Euros Launch –When it‘s done!

83 ?!? tamhan@tamoggemon.com @tamhanna Images: pedroserafin, mattbuck

