Advanced Science and Technology Letters Vol.101 (Art, Culture, Game, Graphics, Broadcasting and Digital Contents 2015), pp.23-26
http://dx.doi.org/10.14257/astl.2015.101.06

Multi-modal interface realization for scenario-based immersive virtual reality experience simulator

Jae-Hong Youn1, Byung-Rae Cha2, Eun-Seok Kim3, Yoo-Kang Ji4

1 DCRC of Dongshin Univ., 185 Geonjae-ro, Naju City, Jeonnam 520-714, Republic of Korea, jhyoun@dsu.ac.kr
2 School of Information & Communications, GIST, 123 Cheomdan Gwagi-ro, Buk-Gu, Gwangju 500-712, Republic of Korea, brcha@nm.gist.ac.kr
3 Dept. of Digital Contents, Dongshin Univ., 185 Geonjae-ro, Naju City, Jeonnam 520-714, Republic of Korea, eskim@dsu.ac.kr
4 School of Information & Communications, GIST, 123 Cheomdan Gwagi-ro, Buk-Gu, Gwangju 500-712, Republic of Korea
Corresponding Author: gistjyk@gist.ac.kr

Abstract. Virtual reality technology is widely used in areas such as data visualization, education, and medical simulation. Until now, however, most systems have used only limited interfaces, so the immersion users can experience also has its limits. This paper designs and implements a scenario-based immersive virtual reality experience simulator that uses NUI devices to improve user immersion.

Keywords: Virtual Reality, NUI Device, Sensible Media, MPEG-V, HMD

1 Introduction

Virtual reality is a technology that provides a sense of reality and immersion through a computer-simulated 3D virtual environment, and it is widely used in areas such as data visualization, education, and medical simulation. In particular, for environments that are dangerous, such as the scene of a fire, or hard to reproduce physically, virtual reality technology enables repeated training over various scenarios at low cost and without exposing trainees to danger. A virtual reality experience simulator requires technologies such as 3D modeling and simulation, real-time visualization, and interfaces for user interaction. As virtual reality systems develop rapidly, various interfaces that make users feel as if they really exist in the virtual environment are being actively studied.

* This research was supported by the 'Software Convergence Technology Development Program' through the Ministry of Science, ICT and Future Planning (No. S1004-14-1054).
However, until now most systems have let users interact with the virtual reality system only through limited interfaces, so the immersion users can experience is also limited. To enhance immersion, a system with an extended multi-modal interface therefore needs to be built. This paper proposes a scenario-based immersive virtual reality experience simulator using NUI devices, which enhances immersion by offering the user various interfaces to the simulator.

2 Scenario-based disaster and security simulator

This paper sets up emergency situations in crowded facilities such as theaters, exhibition halls, museums, and subways, combined with disasters such as fire and earthquake, and develops a virtual reality simulator with which users can master how to cope through training. Furthermore, actual reappearance effects based on the MPEG-V standard are provided to enhance the educational effect and the user's experience.

Fig. 1. Service diagram of the scenario-based disaster and security simulator

3 Immersive actual reappearance virtual simulator

This research adopted the Oculus Rift DK2, the Unity 3D engine, and the Oculus Rift SDK (Software Development Kit) to build contents that magnify the user's visual immersion. Leap Motion is used to control the contents with the user's motions and gestures. In addition, an MPEG-V based actual reappearance device is used to maximize the user's virtual immersion and to trigger actual reappearance effects on scenario events.
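The paper does not include source code; as a rough illustration only, the sketch below shows how hand pose and a simple grab gesture might be read from the Leap Motion device using the v2 desktop SDK's Python bindings (assuming the SDK's Leap module is available). The actual simulator is built on Unity with the Leap Motion and Oculus SDKs, so this only stands in for the equivalent contents-side logic; the on_grab handler is a hypothetical placeholder.

```python
# Illustrative sketch only (not the authors' implementation): poll hand pose and a
# grab gesture with the Leap Motion v2 Python bindings. The real system uses the
# Unity SDK; on_grab() is a hypothetical stand-in for a contents-side event handler.
import time
import Leap  # Leap Motion v2 desktop SDK Python bindings

GRAB_THRESHOLD = 0.8  # grab_strength ranges from 0.0 (open hand) to 1.0 (fist)

def on_grab(palm_position):
    """Hypothetical hook: the contents could treat a closed fist as 'grab object'."""
    print("grab at", palm_position)

def main():
    controller = Leap.Controller()
    while True:
        frame = controller.frame()           # latest tracking frame
        for hand in frame.hands:             # left and right hands, if visible
            if hand.grab_strength > GRAB_THRESHOLD:
                on_grab(hand.palm_position)  # palm position in millimetres
        time.sleep(0.02)                     # ~50 Hz polling is enough for a sketch

if __name__ == "__main__":
    main()
```

In the real system the corresponding Unity-side logic drives object interaction in the virtual space, as shown in Fig. 3.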
Fig. 2. Immersive virtual realistic experience simulator: system data flow and process

The Leap Motion is made to recognize the left and right hands, the fingers, and tools such as a stick in free space. Moreover, in order to extend the FOV of the Leap Motion, the Leap Motion VR Developer Mount is attached to the front of the Oculus Rift.

Fig. 3. Interaction between the virtual space and the NUI sensor

Fig. 4. Extension of the FOV (Field of View) of the Leap Motion

The proposed system is designed and implemented to control actual reappearance devices for wind, vibration, and heat in step with the contents, improving the user's experience. The scripts for interworking with the actual reappearance devices use MPEG-V based SEM format files. Wind and vibration effects come from a fan and a rumble device driven through the amBX API. Heaters are controlled in real time by a purpose-built control board to which the contents send an MPEG-V format script over RS-232 communication.
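The paper does not specify the script contents or the serial protocol, so the following is a minimal sketch under stated assumptions: a simplified, illustrative effect script loosely modeled on the MPEG-V SEM idea (not the actual MPEG-V schema) is parsed, and heat effects are forwarded to a hypothetical heater control board over RS-232 using pyserial. The device path, baud rate, and "HEAT <intensity>" command format are all assumptions made for illustration.

```python
# Minimal sketch, not the authors' implementation: parse a simplified effect script
# (loosely inspired by MPEG-V SEM, not the real schema) and forward heat effects to
# a heater control board over RS-232. The serial device path, baud rate, and the
# "HEAT <intensity>" command format are assumptions for illustration only.
import xml.etree.ElementTree as ET
import serial  # pyserial

EFFECT_SCRIPT = """
<EffectList>
  <Effect type="Heat" intensity="60" durationMs="3000"/>
  <Effect type="Wind" intensity="40" durationMs="2000"/>
</EffectList>
"""

def send_heat(port, intensity):
    """Send a hypothetical heater command; the real board's protocol may differ."""
    port.write("HEAT {}\r\n".format(intensity).encode("ascii"))

def dispatch(script, port):
    root = ET.fromstring(script)
    for effect in root.findall("Effect"):
        kind = effect.get("type")
        intensity = int(effect.get("intensity", "0"))
        if kind == "Heat":
            send_heat(port, intensity)  # heater via the RS-232 control board
        elif kind in ("Wind", "Vibration"):
            pass  # fan and rumble would go through the amBX API in the real system

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:  # assumed device
        dispatch(EFFECT_SCRIPT, port)
```

In the actual system the contents would emit such a script on each scenario event, and the user's success or failure in handling the event decides which effects are reproduced, as described next.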
The system judges success and failure events through the user's interaction with the actual reappearance traps and objects in the virtual space. Depending on the result and on environmental information, it reproduces the corresponding effects and transmits them in real time.

4 Conclusion

This research simulated disaster situations in a virtual space using multi-modal user interfaces such as gesture recognition, hand movement recognition, and head tracking, and proposed an immersive actual reappearance virtual simulator that improves the user's immersion by connecting the actual reappearance devices to the user's interactions in the virtual space. Furthermore, so that users can attain proficiency in coping with disasters, safety accidents, and emergency situations, scenario-based virtual reality experience contents were produced, which include machine of granite direction, fire extinguisher direction, CPR, etc.