Halden Project VR Workshop, 2-3 March 2005: Development of a Tracking Method for Augmented Reality Applied to Nuclear Plant Maintenance Work


Slide 1: Development of a Tracking Method for Augmented Reality Applied to Nuclear Plant Maintenance Work
Presentation by Hirotake Ishii, Research Associate, Kyoto University, Japan; Guest Scientist, Institute for Energy Technology, Norway

Slide 2: Background
Serious situation of NPPs: both improvement of safety and reduction of cost are required.
- Introduction of a free electricity market.
- Difficulties of maintenance for aged NPPs.
- Decrease in expert maintenance workers.
Further development of hardware and software for NPP operation is needed. Augmented Reality (AR) is one of the promising technologies that can improve efficiency and safety. For support of maintenance work in the plant field, there is room to improve efficiency and safety by introducing state-of-the-art information technologies.

Slide 3: What is AR?
Augmented Reality (AR) expands the surrounding real world by superimposing computer-generated information on the user's view. By using AR, workers can understand various kinds of maintenance-work information intuitively. (Figure: a worker's view, with the destination shown as superimposed information.)

Slide 4: Key Technology of AR: Tracking
Tracking measures the position and rotation of the user's view in real time, so that virtual objects/information can be superimposed at the correct position. Many kinds of tracking methods have been developed:
- GPS (Differential GPS, Real-Time Kinematic GPS); cannot be used inside an NPP
- Ultrasonic / magnetic / infrared sensors
- Inertial sensors
- Artificial markers / marker-less methods
- Hybrids (combinations) of the above

Slide 5: Requirements for a Tracking Method Applied to NPPs
Limitations of the NPP field from the viewpoint of tracking:
- Indoor environment
- Equipment of various sizes over a wide area
- Many metal objects, obstacles, and magnetic sources
Requirements:
- Wide area coverage
- Accuracy and stability
- Easy or no installation
- Low cost
The artificial marker technique has the potential to be used in a plant.

Slide 6: Artificial Marker Technique
Calculates the position and rotation of the camera by image processing, from the positions of markers pasted in the environment. It is applied in many AR applications.
Strengths:
- Accurate and stable
- Inexpensive
- Highly scalable
Weakness:
- Available only when the distance between the marker and the camera is short (which means many markers must be pasted in the environment) or the markers are large.

Slide 7: Available Distance of Artificial Markers
Maximum distance with ARToolKit (VGA, f = 4 mm) by required marker size:
- 1 m: 8 cm marker
- 3 m: 25 cm marker
- 5 m: 40 cm marker
Problem: a plant contains many small objects such as pipes, and their surfaces are not flat, so it is difficult to paste large markers. It is necessary to make the markers smaller, or to make them easier to paste in a complicated environment.

Slide 8: Development of a New Tracking System
- Tracking system using barcode markers: long barcode markers that are easy to paste on pipes.
- Tracking system using circular markers: circle-shaped markers with a code inside, for stable recognition at long distance.

Slide 9: Tracking System using Barcode Markers
By Hiroshi Shimoda, Hirotake Ishii, Masayuki Maeshima, Toshinori Nakai, Zhiqiang Bian and Hidekazu Yoshikawa (Kyoto University)

Slide 10: Design of the Barcode Marker
The marker carries 11 bits of data: 7 bits for the marker ID (128 kinds of markers) and 4 bits for error correction (Hamming code). Each bit is a bar 40 mm wide: a 40 mm long bar encodes "0", an 80 mm long bar encodes "1", with 20 mm gaps between bars. The total length therefore ranges from 640 mm (all zeros) to 1080 mm (all ones).
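The 7 + 4-bit scheme above can be sketched as a textbook Hamming(11,7) code. The authors' actual bit layout is not given in the slides, so the parity positions 1, 2, 4, 8 below are an assumption, as are the function names:

```python
def hamming11_encode(id_bits):
    """Encode 7 ID bits into an 11-bit codeword.
    Parity bits sit at positions 1, 2, 4, 8 (1-indexed); each parity bit
    makes the set of positions containing its bit in their index even."""
    code = [0] * 12                      # index 0 unused
    for pos, bit in zip((3, 5, 6, 7, 9, 10, 11), id_bits):
        code[pos] = bit
    for p in (1, 2, 4, 8):
        code[p] = sum(code[i] for i in range(1, 12) if i & p) % 2
    return code[1:]

def hamming11_correct(bits):
    """Correct at most one flipped bit; return (codeword, error position)."""
    code = [0] + list(bits)
    syndrome = sum(p for p in (1, 2, 4, 8)
                   if sum(code[i] for i in range(1, 12) if i & p) % 2)
    if 0 < syndrome < 12:
        code[syndrome] ^= 1              # flip the bad bit back
    return code[1:], syndrome
```

A single misread bar is thus recoverable; two bar errors exceed the capacity of a single-error-correcting code.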

Slide 11: Image of the Tracking Method with Barcode Markers
Barcode markers are pasted on pipes, and three positions of each marker (both edges and the middle point) are registered in advance. The field worker wears a small camera and an HMD; the worker's position and rotation are calculated from two barcode markers (solving the P6P problem). Instruction information such as "Check the crack of the upper pipe." is then shown by AR in the worker's view.

Slide 12: Marker Recognition Algorithm (1): Captured image.

Slide 13: Marker Recognition Algorithm (2): Binarization. Binarize the captured image at a preset threshold level.
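This step can be sketched in a few lines of NumPy; the threshold value here is illustrative, since the slides only say a preset level is used:

```python
import numpy as np

def binarize(gray, threshold=128):
    """Fixed-threshold binarization: dark marker bars become 1
    (foreground), the brighter background becomes 0."""
    return (gray < threshold).astype(np.uint8)

# Example: a tiny 2x2 grayscale patch
patch = np.array([[200, 50],
                  [130, 10]], dtype=np.uint8)
mask = binarize(patch)        # [[0, 1], [0, 1]]
```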

Slide 14: Marker Recognition Algorithm (3): Labelling. Collect connected pixels and mark each connected part with a unique label.
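Connected-component labelling can be sketched as a flood fill; 4-connectivity is assumed here, since the slides do not state which connectivity the authors used:

```python
from collections import deque
import numpy as np

def label_components(binary):
    """4-connected component labelling by flood fill.
    Returns an int array where 0 is background and each connected
    foreground region receives a unique positive label."""
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=int)
    next_label = 0
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not labels[y, x]:
                next_label += 1
                labels[y, x] = next_label
                queue = deque([(y, x)])
                while queue:
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny, nx] and not labels[ny, nx]):
                            labels[ny, nx] = next_label
                            queue.append((ny, nx))
    return labels
```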

Slide 15: Marker Recognition Algorithm (4): Narrowing the search area. Exclude the parts that cannot possibly belong to a marker, judged by their area and shape.
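This narrowing step can be sketched by filtering labelled regions on pixel area and bounding-box aspect ratio; all thresholds below are illustrative, not the authors' values:

```python
import numpy as np

def plausible_bar_regions(labels, min_area=20, max_area=5000, max_aspect=12.0):
    """Keep only labelled regions whose area and bounding-box aspect
    ratio could plausibly belong to a barcode-marker bar."""
    keep = []
    for lab in np.unique(labels):
        if lab == 0:                      # background
            continue
        ys, xs = np.nonzero(labels == lab)
        area = len(xs)
        if not (min_area <= area <= max_area):
            continue
        h = ys.max() - ys.min() + 1
        w = xs.max() - xs.min() + 1
        if max(h, w) / min(h, w) > max_aspect:
            continue
        keep.append(lab)
    return keep
```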

Slide 16: Marker Recognition Algorithm (5): Extracting marker parts. Pick up 11 parts arranged in a line as a candidate barcode marker.

Slide 17: Marker Recognition Algorithm (6): Deciding the code. Decide the code of the barcode marker from the area of each part.

Slide 18: Marker Recognition Algorithm (7): Comparison with pre-registered markers. Correct the code of the marker using the Hamming-code part and compare it with the pre-registered barcode markers.

Slide 19: Marker Recognition Algorithm (8): Calculating the position and rotation of the camera. Extract three points from each of two barcode markers (six in total) and solve the P6P problem.
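The slides' P6P solver itself is not reproduced here. A simplified linear stand-in is the Direct Linear Transform, which recovers the full 3x4 projection matrix from six or more 3D-2D correspondences (a real system would use known camera intrinsics and a calibrated PnP solver; all names below are illustrative):

```python
import numpy as np

def dlt_projection_matrix(obj_pts, img_pts):
    """Direct Linear Transform: estimate the 3x4 projection matrix
    from six or more 3D points and their 2D image projections."""
    A = []
    for (X, Y, Z), (u, v) in zip(obj_pts, img_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # The projection matrix is the null-space vector of A (up to scale),
    # i.e. the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 4)

def project(P, pt3d):
    """Project a 3D point with projection matrix P (homogeneous)."""
    x = P @ np.append(np.asarray(pt3d, dtype=float), 1.0)
    return x[:2] / x[2]
```

Six correspondences suffice in principle; adding one or two more improves numerical robustness.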

Slide 20: Basic Evaluation of Marker Recognition
Purpose: evaluate the basic ability of the proposed tracking method.
Experimental method (illumination 120 lux):
- Pipe: 60 mm diameter x 1100 mm
- Pipe arrangement: horizontal / vertical
- Rotation: 0 / 20 / 40 / 60 / 80 degrees
- Distance: 1 / 2 / 3 / 4 / 5 / 6 meters

Slide 21: Examples of Captured Images
Vertical, 0 degrees, 6 meters; vertical, 80 degrees, 4 meters. Camera resolution: 320 x 240.

Slide 22: Recognition Result
Camera resolution: 320 x 240; viewing angle: 53 x 40 degrees; processing rate: 10-30 fps (Pentium Mobile 1.4 GHz).

Slide 23: Trial Use in the Fugen NPP Water Purification Facility
10 barcode markers were pasted in the area; all marker IDs and positions were registered in advance.

Slide 24: Example of Recognized Markers

Slide 25: Result
Walked around the area with the prototype system; frame images in which at least one marker was in the view were picked up.
- Recognition rate: 52.8% (47.2% failed)
- Erroneous recognition rate: 1.8%
Causes of erroneous recognition: a marker image was captured against the light, a marker was too far from the camera, or the angle of a marker against the camera direction was too large.

Slide 26: Example of Erroneous Recognition (Backlight)

Slide 27: Conclusion (Barcode Marker)
- Proposed a marker-based tracking method for AR support of NPP maintenance work: long barcode markers and simple image recognition.
- Evaluated the basic ability of the proposed method in a laboratory: long distance, fast enough, and feasible.
- Trial use in the Fugen NPP: recognition rate 52.8%, erroneous recognition rate 1.8%.

Slide 28: Tracking System using Circular Markers
By Hirotake Ishii, Hidenori Fujino (Kyoto University) and Asgeir Droivoldsmo (Institute for Energy Technology)

Slide 29: Weakness of Square Markers
How square markers (such as the ARToolKit "HIRO" marker) are recognized: 1. detect edges; 2. detect 4 lines; 3. calculate the intersections; 4. calculate position and rotation. After binarization the estimated lines are easily affected by jaggy edges, and the distance between features is short (at most the size of the marker, up to about half a meter), so the accuracy depends on the distance between the features.

Slide 30: In the Case of a Circular Marker
The center can be recognized accurately in any situation: well focused, badly focused, or at long distance. The distance between features can therefore be long (up to several meters), and triangulation with plural markers can be used.

Slide 31: Design of the Circular Marker
Made as simple as possible: an outer black circle, a middle code-area circle consisting of black or white fans, and a center white circle. The outer black circle and center white circle are used for calculating the threshold to analyze the code area. The diameter of the marker and the number of divisions of the middle circle can be changed according to the situation; the number of divisions determines the number of unique markers.
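The slide's table relating the number of divisions to the number of unique markers has not survived in this transcript. If an ID must be readable without knowing which fan is read first (i.e. codes equivalent under rotation count as the same marker, which is an assumption about the encoding), the number of distinct codes is the number of binary necklaces, computable by Burnside's lemma:

```python
from math import gcd

def necklace_count(n):
    """Number of length-n binary strings distinct under rotation
    (Burnside's lemma): an upper bound on unique circular-marker IDs
    when the code ring has no fixed starting fan."""
    return sum(2 ** gcd(k, n) for k in range(n)) // n
```

For example, 8 divisions give at most 36 rotation-distinct codes, before removing degenerate patterns such as all-black or all-white.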

Slide 32: Marker Recognition Algorithm (1): Captured image.

Slide 33: Marker Recognition Algorithm (2): Calculation of logarithm.
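The logarithm step can be sketched as a normalized log intensity transform, which compresses highlights so that subsequent gradient responses depend less on local illumination; the normalization constant here is an assumption:

```python
import numpy as np

def log_transform(gray):
    """Logarithmic intensity transform, rescaled back to 0..255.
    Compresses bright regions and expands dark ones."""
    g = gray.astype(np.float64)
    out = np.log1p(g) / np.log1p(255.0) * 255.0
    return out.astype(np.uint8)
```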

Slide 34: Marker Recognition Algorithm (3): 3x3 Sobel filter.
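The 3x3 Sobel step, written out directly for clarity (a real implementation would use a vectorized convolution, but the kernels are the standard ones):

```python
import numpy as np

def sobel_magnitude(gray):
    """3x3 Sobel gradient magnitude, computed over the valid region
    (output is 2 pixels smaller in each dimension)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T                      # vertical-gradient kernel
    g = gray.astype(float)
    h, w = g.shape
    out = np.zeros((h - 2, w - 2))
    for y in range(h - 2):
        for x in range(w - 2):
            patch = g[y:y+3, x:x+3]
            gx = (patch * kx).sum()
            gy = (patch * ky).sum()
            out[y, x] = np.hypot(gx, gy)
    return out
```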

Slide 35: Marker Recognition Algorithm (4): Labeling.

Slide 36: Marker Recognition Algorithm (5): Eliminate small areas.

Slide 37: Marker Recognition Algorithm (6): Recognize ellipses and eliminate non-ellipse areas.
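One way to sketch an ellipse test is via second-order moments: for a filled ellipse with semi-axes a and b, the covariance eigenvalues of its pixel coordinates are a^2/4 and b^2/4, so the pixel count should match the predicted area pi*a*b. This particular criterion and its tolerance are assumptions; the paper's actual test is not given in the slides:

```python
import numpy as np

def looks_like_ellipse(mask, tol=0.15):
    """Moment-based ellipse test: compare the region's pixel count
    with the elliptical area predicted by its covariance eigenvalues."""
    ys, xs = np.nonzero(mask)
    area = len(xs)
    if area < 10:                 # too small to judge reliably
        return False
    cov = np.cov(np.vstack([xs - xs.mean(), ys - ys.mean()]))
    l1, l2 = np.linalg.eigvalsh(cov)
    expected = 4.0 * np.pi * np.sqrt(max(l1, 0.0) * max(l2, 0.0))
    return abs(expected - area) / area < tol
```

Note that compact convex blobs such as squares (whose moment-predicted area is only about 5% off) also pass this coarse test, so it only narrows the candidates before the code-reading step.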

Slide 38: Marker Recognition Algorithm (7): Recognize marker IDs and eliminate non-marker areas (details are in the paper).

Slide 39: Marker Recognition Algorithm (8): Result.

Slide 40: Calculating the Position and Rotation of the Camera
The pose is calculated by combining the solutions of a PnP solver with rough information about each circular marker's rotation (details are in the paper). The marker positions on the image give accurate but plural solutions via the PnP solver; the marker shapes on the image give a single but inaccurate solution. Comparing the two yields a single, accurate solution.

Slide 41: Developed Software
- Marker Maker: marker design, marker printing, marker location files (XML), and marker visualization.
- Tracking Server: core library (C++), camera control library (C++), and a TCP/IP library for the server; camera drivers include the CMU camera driver, PGR IEEE1394 (IIDC) for DragonFly/FireFly, and DirectShow for USB/DV cameras (partly under development).
- Display Software: receives tracking results via TCP/IP (a client DLL) and renders with OpenGL, DirectX, or Java3D (via JNI).

Slide 42: Evaluation of Accuracy and Stability (Single Marker)
Experimental method:
- One circular marker (diameter 40 mm) was pasted on a small box.
- The number of divisions of the middle circle was 8, 9, or 10.
- Distance was changed from 300 cm to 580 cm in 20 cm steps.
- Angle was changed from 0 to 75 degrees in 15 degree steps.
- For each condition, 100 images were captured, the position and rotation were calculated, and the average and variance were computed.

Slide 43: Example Image (Distance: 560 cm, Angle: 0 degrees)
Camera resolution: 1024 x 768; focal length: 8 mm.

Slide 44: Results (Maximum Distance)
Table: maximum distance (marker diameter 4 cm), given as "succeeded in all frames (cm) / failed in some frames (cm)" per viewing angle (X: not recognized):
  8 divisions:  .../...  .../560  440/520  380/440  X/...
  9 divisions:  .../...  .../560  440/520  380/440  X/X
 10 divisions:  520/...  .../560  400/500  380/420  X/X
The maximum distance of the circular marker is about 2 times that of a square marker such as ARToolKit. Camera resolution: 1024 x 768; focal length: 8 mm.

Slide 45: Results (Single Marker, Depth)
The accuracy of the position is not good.

Slide 46: Results (Single Marker, Rotation)

Slide 47: Evaluation of Accuracy and Stability (Plural Markers on a Helmet)
Experimental method:
- 22 circular markers were pasted on a helmet.
- The number of divisions of the middle circle was 8.
- Distance was changed from 300 cm to 560 cm in 20 cm steps.
- Angle was changed from 0 to 180 degrees in 15 degree steps.
- For each condition, 100 images were captured, the position and rotation were calculated, and the average and variance were computed.

Slide 48: Example Image (Distance: 560 cm, Angle: 120 degrees)
Camera resolution: 1024 x 768; focal length: 8 mm.

Slide 49: Results (Position)
The accuracy is greatly improved. (Plot legend: black = plural markers, red = single marker.)

Slide 50: Results (Rotation)
Tracking works over a wide angle; with a single marker, the maximum angle is about 60 degrees.

Slide 51: Application Example: Tracking in the Office

Slide 52: Conclusion (Circular Marker)
- A new circular marker has been designed.
- A tracking system has been developed that recognizes plural circular markers at one time and calculates the position and rotation of the camera by triangulation.
- Two experiments were conducted to evaluate the accuracy and stability of the proposed method. They confirmed that the accuracy can be greatly improved by using plural markers at one time, and that the marker-camera distance can be long compared with the conventional method.

Slide 53: Future Works
- Barcode marker: improvement of the proposed method (multiple cameras, camera resolution, recognition algorithm, etc.).
- Circular marker: application to real tasks (visualization of a radiation map in an NPP, an old church).
- Development of hybrid tracking: the developed tracking methods can be used together with other tracking methods such as ARToolKit at the same time, combining a marker-less method (short distance), square markers (middle distance), and barcode and circular markers (long distance).