Out of this World
Team Wombat Flombat Flaseys
Emily Reinherz, Nina Lozinski, Valerie Ding

Abstract
Existing virtual reality products related to outer space offer planetarium experiences, in which users stand on Earth and observe the constellations. Other apps transport users into outer space and let them move through it via touchpad input. However, there are no current VR applications that actually place the user in space and allow them to move around freely within it. Our project combines Gear VR with Kinect, allowing users to move through the solar system in virtual reality using body gestures. We used the Kinect v2 sensor, Kinect Studio, and Visual Gesture Builder to detect the user's movements and gestures. The Kinect sensing data was fed to a simple Unity broadcaster app running on the PC, then relayed over a simple socket server to the Android solar system app, letting the user move through the solar system in virtual reality with body gestures (figure 1).

[Figure 1]

Objectives
Our first goal in creating "Out of this World" was to offer a realistic experience of moving through our solar system. Current options to explore and experience the solar system are extremely limited: space travel is extremely expensive and largely inaccessible to the general public, so there is huge demand among everyday people to experience what traveling through space might be like, if only for the novelty and excitement of it. Beyond novelty, a successful virtual simulation of space could serve as a powerful educational tool. In today's classrooms, students learn about outer space and the solar system through dry lectures and traditional classroom techniques; giving students the opportunity to experience the scale, vastness, and otherworldliness of outer space might pique their interest and inspire them to become the astronauts and scientists of tomorrow.

Secondly, we aimed to develop precise, intuitive physical controls that let a player realistically direct his or her movement in a virtual reality environment. One of the primary challenges of virtual reality right now is the difficulty a user faces in interacting with the virtual environment: input options for Gear VR are limited to the trackpad. To let a player fully interact and move within the environment, we incorporated the Kinect v2 sensor, feeding the user's movements and gestures into the Gear VR application. One challenge we faced was network lag between the Kinect sensor, the PC, and the Android phone/Gear VR, since we hoped to provide a seamless, natural experience of motion. A second challenge was designing natural gestures that let users control their motion easily and with high precision.

Methods
We used Kinect v2, Kinect Studio, and Visual Gesture Builder to record and detect a user's gestures. With the Kinect connected to our PC laptop, sensing data about the user's body position, skeleton, joints, and movement streamed into Kinect Studio. In Kinect Studio we recorded video clips ("solution" clips) of subjects performing various gestures (e.g., leaning forward, or holding a hand out in a fist). We then loaded each solution clip into Visual Gesture Builder and tagged it for gestures, marking for the system the parts of the clip where a gesture was being performed. Next we recorded more clips (training videos) of subjects performing different motions and gestures, including the ones tagged in the solution clip. These training videos were loaded into Gesture Builder, and, based on the marked sections of the solution clip, the system scanned the training videos to decide if and where the gesture in question was being performed. Adding more training videos increased the system's accuracy in detecting and identifying gestures (figure 2). Ultimately, the clips and the trained decision algorithm were built into a gesture database that could be exported and used by an outside application.

[Figure 2]

Methods (continued)
Using Unity 5.5, we created a simple app that, using the imported Visual Gesture Builder database, read the sensing data from the Kinect and determined which gesture, if any, was being executed. Over the course of the project we implemented several gestures, including leaning left and right, leaning forward and backward, holding an arm out with the hand open (the "stop" gesture), and holding an arm out with the hand in a fist (the "lock" gesture).
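As a concrete illustration (not the project's actual source), a Unity script polling an imported Visual Gesture Builder database might look like the sketch below, which assumes the Kinect v2 Unity plugin covered in the Channel 9 tutorial cited in the references. The database file name, the 0.6 confidence threshold, and the SetTrackingId hook are our own stand-ins.

    using UnityEngine;
    using Windows.Kinect;
    using Microsoft.Kinect.VisualGestureBuilder;

    // Polls the gesture database every frame and reports which gesture,
    // if any, the tracked body is performing.
    public class GestureDetector : MonoBehaviour
    {
        private KinectSensor sensor;
        private VisualGestureBuilderFrameSource source;
        private VisualGestureBuilderFrameReader reader;

        void Start()
        {
            sensor = KinectSensor.GetDefault();
            if (!sensor.IsOpen) sensor.Open();

            // "SolarGestures.gbd" is a stand-in for the exported database name.
            var db = VisualGestureBuilderDatabase.Create(
                Application.streamingAssetsPath + "/SolarGestures.gbd");

            source = VisualGestureBuilderFrameSource.Create(sensor, 0);
            foreach (var gesture in db.AvailableGestures)
                source.AddGesture(gesture);

            reader = source.OpenReader();
            reader.IsPaused = false;
        }

        // Called from a BodyFrameReader callback (not shown) once a body is tracked.
        public void SetTrackingId(ulong id) { source.TrackingId = id; }

        void Update()
        {
            using (var frame = reader.CalculateAndAcquireLatestFrame())
            {
                if (frame == null) return;

                // Discrete results cover the "stop" (open palm) and "lock" (fist) poses.
                foreach (var pair in frame.DiscreteGestureResults)
                    if (pair.Value.Detected && pair.Value.Confidence > 0.6f)
                        Debug.Log("Detected: " + pair.Key.Name);

                // Continuous results cover the leans; Progress later scales speed.
                foreach (var pair in frame.ContinuousGestureResults)
                    Debug.Log(pair.Key.Name + " progress: " + pair.Value.Progress);
            }
        }
    }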
Based on two scripts we found online, we created a simple socket server and client. After the first Unity app detected a gesture, the gesture name and its progress level (i.e., how far the user was leaning forward) were sent over the server to our second Unity app. The payload was very small (a string and a float), which helped minimize network lag.
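The networking scripts we adapted are not reproduced here, but a minimal stand-in for the PC-side broadcaster could look like the following; the "name|progress" line format and port 9000 are illustrative choices, not the project's exact protocol.

    using System.Net.Sockets;
    using System.Text;

    // PC-side broadcaster: pushes "gestureName|progress" lines to the phone.
    public class GestureClient
    {
        private readonly TcpClient client;
        private readonly NetworkStream stream;

        public GestureClient(string host, int port = 9000)
        {
            client = new TcpClient(host, port);  // e.g. the phone's LAN address
            stream = client.GetStream();
        }

        public void Send(string gestureName, float progress)
        {
            // One short line per update; a string and a float keep lag minimal.
            byte[] msg = Encoding.UTF8.GetBytes(gestureName + "|" + progress + "\n");
            stream.Write(msg, 0, msg.Length);
        }

        public void Close() { stream.Close(); client.Close(); }
    }

On the phone, a matching reader splits each received line on the "|" delimiter and hands the gesture name and progress value to the Player Movement script described next.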
The second Unity app was the solar system app itself, built with the help of a Solar System package from the Asset Store, with the main camera a child of a "player" object (figure 3). A Player Movement script attached to the player moved it through the solar system according to the received gesture data. The solar system app ran on the Android phone inserted into the VR headset, allowing the user to move through space in virtual reality using gestures and movements.

[Figure 3]

Results and Conclusions
Our final product was a fully working virtual reality application that let users move through the solar system with body gestures. By leaning forward, the user moved forward in the solar system. Gesture control accounted for speed and acceleration: the further the user leaned forward, the faster he or she moved in virtual reality. The user could rotate a full 360° simply by turning around, then move in the new direction by leaning forward again. To "lock" his or her speed, the user could tap the VR headset trackpad, stand up straight, and continue moving forward at a constant speed without having to stay leaned over. To stop moving in virtual reality, the user would outstretch an arm and hold the palm open (the "stop" gesture). The player's movement was smooth with minimal lag, and gesture control was precise.
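To make the control scheme concrete, here is a hypothetical reconstruction of the Player Movement behaviour just described; the gesture names, tuning values, and OnGesture entry point are our assumptions. (A Gear VR trackpad tap registers in Unity as mouse button 0.)

    using UnityEngine;

    // Moves the "player" object (parent of the VR camera) based on
    // gesture updates received over the socket.
    public class PlayerMovement : MonoBehaviour
    {
        public Camera vrCamera;        // head-tracked camera, child of the player
        public float maxSpeed = 20f;   // illustrative tuning value

        private float speed;           // current forward speed
        private bool locked;           // true after the trackpad "lock" tap

        // Called by the network receiver whenever a gesture update arrives.
        public void OnGesture(string gestureName, float progress)
        {
            if (gestureName == "Stop")                  // open palm: halt
            {
                locked = false;
                speed = 0f;
            }
            else if (gestureName == "LeanForward" && !locked)
            {
                // The further the lean, the faster the travel.
                speed = progress * maxSpeed;
            }
        }

        void Update()
        {
            // Tapping the Gear VR trackpad locks in the current speed.
            if (Input.GetMouseButtonDown(0)) locked = true;

            // Travel along the gaze direction, so turning the body a full
            // 360 degrees changes the direction of movement.
            transform.position += vrCamera.transform.forward * speed * Time.deltaTime;
        }
    }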
References
"Making Your Body the Controller - Kinect Tutorial for Unity." Channel 9, n.d. Web. 15 Mar.
"How Do You Connect a Gear VR to Kinect?" Quora, n.d. Web.
Bielecki, Adam. "Solar System" package. Unity Asset Store.

Acknowledgements
This project was completed as part of CS 234/334 Mobile Computing (Winter 2017), taught by Prof. Andrew A. Chien with TA support from Gushu Li and Ryan Wu. We gratefully acknowledge the generous support of Samsung in providing Gear VR equipment.