
1 Perception for Robot Detection 2011/12/08

2 Robot Detection
- Better localization and tracking
- No collisions with other robots

3 Goal: robust robot detection
- Long range
- Short range

4 Long Range. Current method: heuristic, color-based. Non-line white segments are clustered, and an extracted cluster is classified as a Nao robot if the following three criteria are satisfied:
- the cluster contains more than 3 segments;
- the width-to-height ratio is larger than 0.2;
- the highest point of the cluster lies within 10 pixels of the field border line, since an observed robot should intersect the field border in the camera view if both the observing and the observed robot are standing on the field.
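For illustration, the three criteria could be checked like this; the RobotCluster struct is a hypothetical stand-in for the clustered white segments, not the actual B-Human data structure:

#include <cstdlib>

// Hypothetical cluster of non-line white segments (illustrative only).
struct RobotCluster {
  int numSegments;    // number of white segments in the cluster
  int width, height;  // bounding box in pixels
  int highestPointY;  // y coordinate of the topmost pixel
};

// Returns true if the cluster passes the three heuristic criteria.
bool isNaoRobot(const RobotCluster& c, int fieldBorderY) {
  if (c.numSegments <= 3) return false;                   // criterion 1
  if (c.height == 0 ||
      static_cast<float>(c.width) / c.height <= 0.2f)     // criterion 2
    return false;
  return std::abs(c.highestPointY - fieldBorderY) <= 10;  // criterion 3
}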

5 Long Range. Improvement: feature-based.
Considerations: scale invariance, affine invariance, computational complexity.
Possible solutions:
1. SIFT (Scale-Invariant Feature Transform)
2. SURF (Speeded-Up Robust Features)
3. MSER (Maximally Stable Extremal Regions)

6 Long Range. Improvement: feature-based.
- Offline: build predefined object models.
- Online: feature detection and object recognition using the predefined models.
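As a sketch of the online step, the following matches SURF features between a predefined model image and a camera frame, assuming OpenCV 2.4 with the nonfree module; the file names are placeholders and error checking is omitted:

#include <vector>
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/nonfree/features2d.hpp>  // SURF lives here in OpenCV 2.4

int main() {
  // Placeholder images: an offline Nao model and a camera frame, grayscale.
  cv::Mat model = cv::imread("nao_model.png", 0);
  cv::Mat frame = cv::imread("camera_frame.png", 0);

  // Detect SURF keypoints in both images.
  cv::SurfFeatureDetector detector(400.0);  // Hessian threshold
  std::vector<cv::KeyPoint> kpModel, kpFrame;
  detector.detect(model, kpModel);
  detector.detect(frame, kpFrame);

  // Compute descriptors and brute-force match them.
  cv::SurfDescriptorExtractor extractor;
  cv::Mat descModel, descFrame;
  extractor.compute(model, kpModel, descModel);
  extractor.compute(frame, kpFrame, descFrame);

  cv::BFMatcher matcher(cv::NORM_L2);
  std::vector<cv::DMatch> matches;
  matcher.match(descModel, descFrame, matches);
  // A robot would be reported when enough good matches survive a
  // distance filter (filtering omitted in this sketch).
  return 0;
}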

7 Short Range: sonar and vision, two-stage:
1. Sonar
2. Active vision: feet detection (a sufficiently large white spot)
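A hedged sketch of this two-stage check; the distance threshold and the white-spot test are assumptions, not the actual detector:

#include <opencv2/core/core.hpp>

// Hypothetical stand-in for the feet detector: counts bright pixels
// in a grayscale image of the lower field of view (assumed thresholds).
bool hasLargeWhiteSpot(const cv::Mat& lowerImage) {
  return cv::countNonZero(lowerImage > 200) > 500;
}

// Stage 1: sonar gates the expensive vision check.
// Stage 2: active vision confirms feet as a large white spot.
bool robotNearby(float sonarLeft, float sonarRight, const cv::Mat& lowerImage) {
  const float kSonarThreshold = 0.5f;  // metres; assumed value
  if (sonarLeft > kSonarThreshold && sonarRight > kSonarThreshold)
    return false;  // nothing close on either sonar channel
  return hasLargeWhiteSpot(lowerImage);
}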

8 New NAO: possible improvements
- Use the two cameras simultaneously: one for the ball and the other for localization, or one for feet detection and the other for localization, ...
- No downsampling: 320 x 240 -> 640 x 480
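With the ALVideoDevice proxy used later in these slides, dropping the downsampling is just a different resolution constant, and the active camera can be selected per subscription; a sketch, assuming the 1.12 API and its kCameraSelectID parameter:

#include <string>
#include <alproxies/alvideodeviceproxy.h>
#include <alvision/alvisiondefinitions.h>

void subscribeFullRes(const std::string& robotIp) {
  AL::ALVideoDeviceProxy camProxy(robotIp, 9559);
  // Full 640x480 VGA instead of the downsampled 320x240 QVGA.
  const std::string client =
      camProxy.subscribe("fullres", AL::kVGA, AL::kBGRColorSpace, 30);
  // Choose which physical camera feeds this subscription
  // (0 = top, 1 = bottom); assumes kCameraSelectID from alvisiondefinitions.h.
  camProxy.setParam(AL::kCameraSelectID, 1);
}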

9 References
- B-Human 2011 code release (bhuman11_coderelease)
- SIFT: http://www.cs.ubc.ca/~lowe/keypoints/
- SURF paper: "Speeded-Up Robust Features (SURF)"
- MSER tracking paper: "Efficient Maximally Stable Extremal Region (MSER) Tracking"

10 Perception for Robot Detection 2011/12/22

11 COLOR-BASED SUCCESSFUL CASES

12 Front View. SURF: 209.318 ms, 73 -> 20. Affine SIFT (ASIFT): 17 s, 11655 -> 14237.

13 Back View. SURF: 125.309 ms, 74 -> 186. ASIFT: 6 s, 11377 -> 13930.

14 Side View. SURF / ASIFT: 7 s, 7763 -> 13060 keypoints, 18 matches.

15 COLOR-BASED FAILED CASES

16 False Alarm (SURF / ASIFT): misclassifications of the field lines.

17 < 100 cm, front (SURF / ASIFT): 92 matches

18 < 100 cm, back (SURF / ASIFT)

19 < 100 cm, side (SURF / ASIFT): 40 matches

20 300 cm, front (SURF / ASIFT): 21 matches

21 300 cm, side (ASIFT)

22 350 cm, front (ASIFT)

23 Conclusion
- Performance is not significantly better.
- Processing time is an issue.

24 Perception for Robot Detection 2012/1/5

25 ROBOT DETECTION USING ADABOOST WITH SIFT

26 Multi-Class Training Stage: using AdaBoost.
Classes = (different viewpoints of the Nao robot) x (different scales of the Nao robot) x (different illuminations).
Input: for each class, training images (I_1, l_1), ..., (I_n, l_n), where l_i = 0, 1 for negative and positive examples, respectively.
Output: a strong classifier (a set of weak classifiers) for each class.
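For reference, a minimal sketch of the discrete AdaBoost weighting loop that such a per-class strong classifier is built from. This is an illustrative reconstruction, not the presenters' code: the weak-classifier outputs are assumed precomputed into preds, and the per-round search for the best weak classifier is omitted.

#include <cmath>
#include <vector>

// labels[i] in {0, 1}; preds[t][i] is the prediction of weak classifier t
// on sample i. Returns the per-round weights alpha_t of the strong
// classifier H(x) = sign(sum_t alpha_t * h_t(x) - threshold).
std::vector<double> adaboost(const std::vector<int>& labels,
                             const std::vector<std::vector<int> >& preds) {
  const int n = static_cast<int>(labels.size());
  std::vector<double> w(n, 1.0 / n), alpha;
  for (size_t t = 0; t < preds.size(); ++t) {
    double err = 1e-10;  // weighted error; epsilon avoids log(0)
    for (int i = 0; i < n; ++i)
      if (preds[t][i] != labels[i]) err += w[i];
    const double a = 0.5 * std::log((1.0 - err) / err);
    alpha.push_back(a);
    // Re-weight: misclassified samples gain weight, correct ones lose it.
    double sum = 0.0;
    for (int i = 0; i < n; ++i) {
      w[i] *= std::exp(preds[t][i] == labels[i] ? -a : a);
      sum += w[i];
    }
    for (int i = 0; i < n; ++i) w[i] /= sum;  // normalize
  }
  return alpha;
}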

27 Issues in the Training Stage. Number of classes: it depends on the invariance limits of SIFT features (angle of view, range of scale, and degree of illumination over which they stay invariant).

28 Detection Stage
Input: an image from the Nao camera.
Output:
1. the number of robots in the image;
2. the class each robot belongs to => a rough distance and facing direction for the detected robot.

29 Issues in the Detection Stage. Speed: use sharing and non-sharing features to speed up detection.
Flow chart: Input Image -> SIFT Feature Extraction -> Extracted Features -> detection using sharing features from the training stage -> if yes, detection using non-sharing features from the training stage -> Class 1 / Class 2 / Class 3; if no, the image is rejected.
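A rough C++ rendering of this two-pass flow; all types and functions here are hypothetical stand-ins for the trained detectors:

// Placeholder standing in for the SIFT descriptors of the input image.
struct Features { /* extracted SIFT descriptors */ };

// Stubs standing in for the trained shared and class-specific detectors.
bool matchesSharedFeatures(const Features&) { return true; }
int classifyWithNonSharedFeatures(const Features&) { return 1; }

// Returns the detected class (viewpoint x scale x illumination),
// or -1 when the cheap shared-feature test already rejects the image.
int detectRobot(const Features& f) {
  // Pass 1: features shared across all classes act as a fast gate.
  if (!matchesSharedFeatures(f))
    return -1;  // the "NO" branch of the flow chart
  // Pass 2: class-specific (non-sharing) features decide the class.
  return classifyWithNonSharedFeatures(f);  // Class 1, 2, 3, ...
}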

30 References
- "Hand Posture Recognition Using Adaboost with SIFT for Human Robot Interaction"
- "Sharing features: efficient boosting procedures for multiclass object detection"

31 Aldebaran SDK 2012/3/16

32 NAOqi Framework. NAOqi is a process that acts like a module look-up server.

33 Aldebaran Modules
Local modules:
1. Compiled as a library (xxxx.so); can only be used on the robot.
2. More efficient than a remote module.
3. Launched in the same process: they speak to each other using only ONE broker, and can share variables and call each other's methods without serialization or networking.
Remote modules:
1. Compiled as an executable file (xxxx); can be run outside the robot.
2. Lower performance in terms of speed and memory usage.
3. Modules communicate with each other over the network.
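For reference, a minimal module skeleton in the style of the 1.12 C++ SDK documentation; the _createModule entry point and CMake glue are omitted, and the exact header and macro names should be checked against the installed SDK:

#include <string>
#include <boost/shared_ptr.hpp>
#include <alcommon/almodule.h>
#include <alcommon/albroker.h>

// A module registers itself with the broker; other modules can then
// look up its bound methods by name through NAOqi.
class MyModule : public AL::ALModule {
public:
  MyModule(boost::shared_ptr<AL::ALBroker> broker, const std::string& name)
      : AL::ALModule(broker, name) {
    setModuleDescription("Example module registered with the broker.");
    functionName("sayHello", getName(), "Example bound method.");
    BIND_METHOD(MyModule::sayHello);
  }
  void sayHello() { /* callable from any other module via the broker */ }
};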

34 BHuman. Lib-bhuman is an Aldebaran module that manages the Nao's hardware-related memory (joints, sensor data).

35 C++ SDK 1.12 Installation
Installation guide: http://www.aldebaran-robotics.com/documentation/dev/cpp/install_guide.html
Related files on the lab server: /usr/home/markcsie/AldebaranSDK
Requirements:
1. Linux (Ubuntu 10.04)
2. gcc > 4.4
3. CMake 2.8 (used by qibuild)
4. qibuild-1.12
5. naoqi-sdk-1.12-linux32.tar.gz
6. nao-geode-cross-toolchain-1.12.0.tar.gz (for NAO 3.3)
7. nao-atom-cross-toolchain-1.12.0.tar.gz (for NAO 4.0)
8. nao-flasher-1.12.1.3-linux32.tar.gz (flasher)
9. opennao-geode-system-image-1.12.gz (OS for NAO 3.3)
10. opennao-atom-system-image-1.12.opn (OS for NAO 4.0)
11. IDE: QtCreator (optional)

36 Installation
1. Edit ~/.bashrc:
   export LD_LIBRARY_PATH=[path to sdk]/lib
   export PATH=${PATH}:~/.local/bin:~/bin
2. $ [path to qibuild]/install-qibuild.sh
3. $ cd [Programming Workspace]
   $ qibuild init --interactive (choose UNIX Makefiles)
4. $ qitoolchain create [toolchain name] [path to sdk]/toolchain.xml --default

37 Create and Build a Project
1. $ qibuild create [project name]
2. $ qibuild configure [project name] -c [toolchain name] (--release)
3. $ qibuild make [project name] -c [toolchain name] (--release)
4. $ qibuild open [project name]
Note: step 3 is equivalent to running the generated Makefile.

38 Cross Compile (Local Module)
$ qitoolchain create opennao-geode [path to cross toolchain]/toolchain.xml --default
$ qibuild configure [project name] -c opennao-geode
$ qibuild make [project name] -c opennao-geode

39 Get Image Example

#include <string>
#include <alproxies/alvideodeviceproxy.h>
#include <alvision/alvisiondefinitions.h>
#include <alvalue/alvalue.h>
#include <opencv/cv.h>

using namespace AL;

/** Create a proxy to ALVideoDevice on the robot. */
ALVideoDeviceProxy camProxy(robotIp, 9559);

/** Subscribe a client image requiring 320*240 and BGR colorspace. */
const std::string clientName = camProxy.subscribe("test", kQVGA, kBGRColorSpace, 30);

/** Create an IplImage header to wrap into an OpenCV image. */
IplImage* imgHeader = cvCreateImageHeader(cvSize(320, 240), 8, 3);

/** Retrieve an image from the camera.
 * The image is returned in the form of a container object, with the
 * following fields:
 * 0 = width
 * 1 = height
 * 2 = number of layers
 * 3 = color space index (see alvisiondefinitions.h)
 * 4 = time stamp (seconds)
 * 5 = time stamp (microseconds)
 * 6 = image buffer (size of width * height * number of layers)
 */
ALValue img = camProxy.getImageRemote(clientName);

/** Access the image buffer (field 6) and assign it to the OpenCV
 * image container. */
imgHeader->imageData = (char*) img[6].GetBinary();

There will be a compilation error due to OpenCV; see
http://users.aldebaran-robotics.com/index.php?option=com_kunena&Itemid=14&func=view&catid=68&id=8133
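Once the frame has been processed, the wrapped header and the subscription should be released; a short continuation of the example above (cvShowImage additionally requires opencv/highgui.h):

/** Display the frame, then clean up. */
cvShowImage("nao camera", imgHeader);
cvWaitKey(1);

/** Release only the header: the pixel buffer is owned by the ALValue. */
cvReleaseImageHeader(&imgHeader);

/** Tell ALVideoDevice this client no longer needs images. */
camProxy.unsubscribe(clientName);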

40 NAO OS
NAO 3.x: http://www.aldebaran-robotics.com/documentation/software/naoflasher/rescue_nao_v3.html?highlight=flasher
NAO 4.0: http://www.aldebaran-robotics.com/documentation/software/naoflasher/rescue_nao_v4.html

41 Connect to NAO
Wired connection (Windows only): plug in the Ethernet cable, then press the chest button; NAO will speak its IP address. Connect to NAO using a web browser.
Wireless connection: http://www.aldebaran-robotics.com/documentation/nao/nao-connecting.html

42 Software, Documentation and Forum
http://users.aldebaran-robotics.com/
Account: nturobotpal
Password: xxxxxxxxxx


