Application of AR to Water System Isolation Task in NPP
Md. Khalaquzzaman
7th April 2008
Outline
- Introduction
- Description of the water isolation system (WIS) task
- Application of AR to the WIS
- Maintenance support system using AR
- Tracking methods for AR applications
- Design of the tracking method
- Evaluation of the tracking method
- Conclusion and future work
Characteristics of the water system isolation task
The environment where the water system isolation task is conducted has the following characteristics:
- there is a huge number of valves, and they are distributed all over the plant
- there are a lot of metal obstacles in the working environment
- it is difficult to install many new apparatuses because of the lack of space
- there are many types of equipment in the plant, and the working environment is very complicated
The limitations concerning the workers who conduct the water system isolation task are as follows:
- a worker must be equipped with a helmet
- a worker cannot carry equipment that restricts his movement
Water system isolation task in NPP
In the periodic maintenance of an NPP, some water systems need to be isolated by valve operation in order to disassemble and check plant equipment.
- there are more than 30,000 valves in an NPP
- although some of them are controlled remotely, most of them are operated manually
- the task requires a pair of workers
The manual water system isolation task is performed as follows:
Step 1: a paper-based instruction sheet is supplied at the central control room, describing information about the valve operations such as the procedure of work and the IDs of the valves
Step 2: walk to the valve location according to the instruction sheet
Step 3: identify the specified valve and confirm its status
Step 4: operate the valve and mark the work list
Step 5: walk to the next location (repeat steps 2 to 4)
Step 6: input the performed valve operations at the central control room
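The manual procedure above can be sketched as a simple checklist loop. The valve IDs, statuses, and field names are hypothetical illustrations, not taken from an actual instruction sheet:

```python
# Hypothetical sketch of the manual isolation procedure as a checklist loop.
# Valve IDs and status values are illustrative only.

def perform_isolation(instruction_sheet):
    """Walk the instruction sheet: visit, verify, operate, and mark each valve."""
    work_log = []
    for step in instruction_sheet:
        # Steps 2-3: walk to the location and confirm the valve's current status.
        assert step["expected_status"] in ("open", "closed")
        # Step 4: operate the valve and mark the work list.
        work_log.append({"valve_id": step["valve_id"],
                         "new_status": step["target_status"]})
    # Step 6: the completed log is entered at the central control room.
    return work_log

log = perform_isolation([
    {"valve_id": "V-1021", "expected_status": "open", "target_status": "closed"},
    {"valve_id": "V-1034", "expected_status": "open", "target_status": "closed"},
])
```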
Application of Augmented Reality
An AR system consists of the following major components:
- Display: head mounted or handheld
- Viewpoint tracking: magnetic, ultrasonic, computer vision, ...
- Image generation
Comparison of Tracking Methods
Development of a tracking method for the maintenance support system
The hybrid tracking method consists of a vision sensor and inertial sensors.
Vision sensor methods:
- Artificial marker method: software calculates the relative position and orientation of a camera against artificial markers.
- Natural feature method: the algorithm recognizes objects' corners and lines.
Artificial Markers
Visual code markers have evolved as a bridge connecting the physical world to the cyber-world, as a tool that contributes to stronger human-computer interaction and ubiquitous computing.
A visual code marker comprises three salient parts: the fixed guide bars, the fixed corner elements, and the data area, as illustrated in Figure 1.
Algorithm for marker-based tracking
- The camera captures video of the real world and sends it to the computer.
- Software on the computer searches each video frame for square shapes.
- If a square is found, the software calculates the position of the camera relative to the black square.
- Once the position of the camera is known, a computer graphics model is drawn from that same position.
- This model is drawn on top of the video of the real world and so appears stuck to the square marker.
- The final output is shown back in the handheld display, so when the user looks through the display, they see graphics overlaid on the real world.
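The per-frame loop above can be sketched as follows. The detection and pose functions here are stubs standing in for the real image-processing code, so this shows only the control flow, not an actual implementation:

```python
# Minimal sketch of the per-frame marker-tracking loop.
# find_square and estimate_camera_pose are stand-in stubs.

def find_square(frame):
    """Stub: return the four marker corners if a square is visible, else None."""
    return frame.get("corners")  # a real system would search the image here

def estimate_camera_pose(corners):
    """Stub: derive a camera position relative to the marker from its corners."""
    cx = sum(x for x, _ in corners) / 4.0
    cy = sum(y for _, y in corners) / 4.0
    return (cx, cy)  # placeholder for a full 6-DOF pose

def process_frame(frame):
    corners = find_square(frame)
    if corners is None:
        return None  # no marker found: nothing to overlay this frame
    pose = estimate_camera_pose(corners)
    # The graphics model would be rendered from `pose` and composited
    # over the video frame so it appears fixed to the marker.
    return pose
```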
Algorithm for marker-based tracking
Figure 3: (a) original image, (b) binarization, (c) fiducial edge detection, (d) fiducial corner detection
The outer black band of the marker allows a candidate fiducial to be located in a captured image, and the interior image allows the candidate to be identified from a set of expected images. The four corners of the located fiducial allow the unambiguous determination of the position and orientation of the fiducial relative to a calibrated camera. Furthermore, in order to estimate the location of a moving camera in the world coordinate system, fiducials are placed in the fixed, physical environment.
Image binarization: the program uses an adaptive threshold to binarize the video image (figure 3-b). Binary images contain only the important information and can be processed very rapidly.
Connected regions analysis: the system looks for connected regions of black pixels (figure 3-c) and selects only the quadrilateral ones. These regions become candidates for the square marker. For each candidate found, the system segregates the contour chains (figure 3-d) into the four sides of the proposed marker and fits a straight line to each. Finally, the coordinates of the four corners are found by intersecting these lines (figure 3-e) and are stored for the next processes.
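The final corner-finding step can be sketched with plain 2D line intersection: one line is fitted to each side of the candidate quadrilateral, and adjacent lines are intersected. Lines are written here in implicit form a·x + b·y = c; in a real system a least-squares fit to the contour points would produce these coefficients:

```python
# Corners of a candidate marker found by intersecting its four fitted
# side lines, each given as coefficients (a, b, c) of a*x + b*y = c.

def intersect(l1, l2):
    """Intersect two lines a*x + b*y = c; returns None if parallel."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel or nearly parallel lines
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return (x, y)

# Four sides of a unit square: x = 0, y = 0, x = 1, y = 1.
sides = [(1, 0, 0), (0, 1, 0), (1, 0, 1), (0, 1, 1)]
corners = [intersect(sides[i], sides[(i + 1) % 4]) for i in range(4)]
```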
Algorithm for marker-based tracking
Marker extraction: the portions of the image that contain the visual code markers are detected and isolated from the rest of the image.
Marker characterization: the parameters that characterize the stance of the visual code marker within the original image are extracted at this stage. These parameters are the rotation angle of the code marker and the positions of its four corners.
Marker analysis: given the rotation angle and the locations of the four corners, the location of the code marker in the original image can be found and the binary data within can be extracted.
Position and Pose Estimation of Markers
2D Geometrical Transformations
- Translate
- Rotate
- Scale
Transformations
Translate: P' = P + T
Scale: P' = S P
Rotate: P' = R P
A point P(x, y) is translated to P'(x', y') by displacements dx and dy:
x' = x + dx, y' = y + dy
In matrix form, if we define the translation vector T = [dx, dy]^T, then P' = P + T.
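The three primitive transforms can be transcribed directly, applied to a point (x, y) about the origin:

```python
import math

# The three primitive 2D transforms applied to a point p = (x, y).

def translate(p, dx, dy):          # P' = P + T
    return (p[0] + dx, p[1] + dy)

def scale(p, sx, sy):              # P' = S P, scaling about the origin
    return (sx * p[0], sy * p[1])

def rotate(p, theta):              # P' = R P, theta in radians, about the origin
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1], s * p[0] + c * p[1])
```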
Homogeneous Coordinates
For given 2D coordinates (x, y), a third dimension is introduced: [x, y, 1].
In general, homogeneous coordinates for a 2D point have the form [x, y, W].
Two homogeneous coordinates [x, y, W] and [x', y', W'] are said to be the same (or equivalent) if x = kx', y = ky', and W = kW' for some k ≠ 0. For example, [4, 6, 12] ≡ [2, 3, 6] with k = 2.
Therefore any [x, y, W] can be normalized by dividing each element by W: [x/W, y/W, 1].
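The equivalence and normalization rules can be checked with a few lines of code:

```python
# Homogeneous 2D coordinates: normalization and equivalence testing.

def normalize(h):
    """Normalize [x, y, W] to [x/W, y/W, 1]."""
    x, y, w = h
    if w == 0:
        raise ValueError("point at infinity cannot be normalized")
    return (x / w, y / w, 1)

def equivalent(h1, h2):
    """Two homogeneous coordinates are equivalent if one is k times the other,
    i.e. they normalize to the same point."""
    return normalize(h1) == normalize(h2)
```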
Homogeneous Transformations
Now redefine the transformations using homogeneous coordinates, so that translation also becomes a matrix multiplication:
- Translation: P' = T P
- Scaling: P' = S P
- Rotation: P' = R P
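In homogeneous form all three transforms are 3x3 matrices, so they compose by matrix multiplication, which is the point of the representation:

```python
import math

# 3x3 homogeneous forms of translation, scaling, and rotation.

def T(dx, dy):
    return [[1, 0, dx], [0, 1, dy], [0, 0, 1]]

def S(sx, sy):
    return [[sx, 0, 0], [0, sy, 0], [0, 0, 1]]

def R(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    """Compose two transforms: (matmul(a, b)) applies b first, then a."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(m, p):
    """Apply a homogeneous transform to a 2D point and normalize."""
    x, y = p
    h = [m[i][0] * x + m[i][1] * y + m[i][2] for i in range(3)]
    return (h[0] / h[2], h[1] / h[2])
```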
Estimation of the Transformation Matrix
Size-known square markers are used as the base of the coordinate frame. The transformation matrix from these marker coordinates to the camera coordinates (T_m), represented in the equation, is estimated by image analysis.
- V_3x3 is the rotation component of the transformation matrix.
- From the four vertex coordinates of the marker in the marker coordinate frame and the corresponding coordinates in the camera screen coordinate frame, eight equations including the translation components Wx, Wy, Wz are generated, and the values of these translation components can be obtained from these equations.
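The equation referred to above is not reproduced on the slide; a reconstruction from the symbols it names (the rotation component V_3x3 and the translation components Wx, Wy, Wz), consistent with the standard marker-to-camera relation, would be:

```latex
\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}
=
\underbrace{\begin{bmatrix}
V_{11} & V_{12} & V_{13} & W_x \\
V_{21} & V_{22} & V_{23} & W_y \\
V_{31} & V_{32} & V_{33} & W_z \\
0 & 0 & 0 & 1
\end{bmatrix}}_{T_m}
\begin{bmatrix} X_m \\ Y_m \\ Z_m \\ 1 \end{bmatrix}
```

Here (X_m, Y_m, Z_m) is a point in the marker coordinate frame and (X_c, Y_c, Z_c) the same point in the camera coordinate frame.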
Tracking Method Employing Inertial Sensors
- The orientation of the worker is measured by gyro sensors.
- The acceleration of gravity is eliminated based on the measured orientation, and the position variation of the worker is measured by acceleration sensors.
- The drift errors of the gyro sensors and acceleration sensors come from fluctuations in environmental factors such as temperature.
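The gravity-elimination step can be sketched in 2D (one tilt angle): the gyro-derived orientation rotates the body-frame accelerometer reading into the world frame, where the known gravity contribution is subtracted. The sign convention (a stationary accelerometer reading +g on its up axis) is an assumption of this sketch:

```python
import math

# Sketch of gravity elimination: rotate the body-frame accelerometer
# reading into the world frame using the gyro-derived tilt, then subtract
# the known gravity contribution. 2D for brevity; assumes a stationary
# sensor reads +G on its up axis.

G = 9.81  # gravitational acceleration, m/s^2

def linear_acceleration(a_body, tilt):
    """a_body = (ax, az) in the body frame; tilt in radians."""
    ax, az = a_body
    c, s = math.cos(tilt), math.sin(tilt)
    # body -> world rotation
    ax_w = c * ax - s * az
    az_w = s * ax + c * az
    return (ax_w, az_w - G)  # remove gravity in the world frame
```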
Flow chart of Hybrid Tracking
- The position and orientation are measured in the hybrid method by the inertial sensors and artificial markers.
- If the vision sensor can recognize at least one marker, the output from the artificial marker method is used, because the marker method is more accurate and reliable.
- In the case of the natural feature method, three-dimensional positions are calculated.
- In case neither artificial markers nor natural features can be recognized, the output from the inertial sensors is used.
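The selection logic of the flow chart reduces to a plain fallback chain. The pose values here are placeholders; the real system would feed in actual sensor outputs:

```python
# Fallback chain of the hybrid tracking flow chart: markers first,
# then natural features, then inertial dead reckoning.

def select_pose(marker_pose, feature_pose, inertial_pose):
    """Return (source, pose) for the most reliable available source."""
    if marker_pose is not None:         # most accurate and reliable
        return ("marker", marker_pose)
    if feature_pose is not None:        # fallback when no marker is visible
        return ("feature", feature_pose)
    return ("inertial", inertial_pose)  # last resort between visual fixes
```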
Evaluation of the tracking method employing inertial sensors
The authors conducted two experiments in order to evaluate the effect of the proposed drift cancel method.
Gyro sensors: the gyro sensors were mounted on a tripod symmetrically, and the tripod was rotated by hand. The output from each sensor was recorded separately with an oscilloscope. The output was recorded for 5 seconds at 10 ms intervals, and the sensors were rotated from 1.6 seconds to 3.4 seconds. The figure shows the output from a gyro sensor. During the periods 0 - 1.6 seconds and 3.4 - 5.0 seconds, the sensors were in a steady state, so the output from the sensors should have been constant. However, because of the drift error, the output decreased as time passed.
Acceleration sensors: two acceleration sensors (C and D) were packed into a sensor unit symmetrically, and the sensor unit was moved along a straight gauge by hand. The output from each sensor was recorded separately with a digital oscilloscope. The output was recorded for 2 seconds at 10 ms intervals, and the sensors traveled from 0.3 seconds to 1.2 seconds. In this experiment, it was confirmed that the proposed drift cancel method can cancel the sensor drift.
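The slides do not spell out the drift cancel method itself; one plausible reading of the symmetric mounting (an assumption of this sketch, not stated in the slides) is that the true motion signal appears with opposite signs on the two sensors while the slow thermal drift is common to both, so half the difference of the outputs keeps the signal and cancels the drift:

```python
# Assumed model of the drift cancel method: with two symmetrically mounted
# sensors, the motion signal has opposite signs while the drift is common
# to both, so (C - D) / 2 recovers the drift-free signal.

def cancel_drift(out_c, out_d):
    """Recover the drift-free signal from a symmetric sensor pair."""
    return [(c - d) / 2.0 for c, d in zip(out_c, out_d)]

signal = [0.0, 1.0, 2.0, 1.0, 0.0]              # true motion
drift = [0.0, 0.1, 0.2, 0.3, 0.4]               # slow common-mode drift
sensor_c = [s + e for s, e in zip(signal, drift)]    # sensor C: +signal
sensor_d = [-s + e for s, e in zip(signal, drift)]   # sensor D: -signal
recovered = cancel_drift(sensor_c, sensor_d)
```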
Conclusions
In this study, the existing tracking methods employing vision sensors and inertial sensors were improved and combined into one tracking method, and experiments were conducted to evaluate each improved method.
The artificial marker method is rather accurate and stable and has been applied in several applications. It is, however, necessary to paste the artificial markers in advance, and if the work area is large, a huge number of artificial markers have to be pasted.
Concerning the natural feature method, it was confirmed that the calculation of the position and orientation of the camera using six natural features works well and that the new method can select reliable natural features. However, the number of natural features that the proposed method can select is too small, so it cannot be applied to the maintenance support system yet.
Concerning the inertial sensor method, the proposed drift cancel method worked well to improve the accuracy of the inertial sensors.