Out of this World
Team Wombat Flombat Flaseys (poster: 24 in × 36 in)
Emily Reinherz, Nina Lozinski, Valerie Ding

Abstract

Existing virtual reality products related to outer space offer planetarium experiences, in which users stand on Earth and observe the constellations. Other apps transport users to outer space, but let them move only through touchpad input. However, no current VR application actually places the user in space and lets them move around within it using their own body. Our project pairs Gear VR with Kinect, allowing users to move through the solar system in virtual reality using body gestures. We used the Kinect v2 sensor, Kinect Studio, and Visual Gesture Builder to detect the user's movements and gestures. The Kinect sensing data was sent to a simple Unity broadcaster app running on the PC, and from there over a simple socket connection to the Android solar system app, letting the user move through the solar system in virtual reality with body gestures (Figure 1).

Objectives

Our first goal in creating "Out of this World" was to offer a realistic experience of what it is like to move through our solar system. Current options for exploring and experiencing the solar system are extremely limited: space travel is prohibitively expensive and largely inaccessible to the general public, so there is real demand among everyday people to experience what traveling through space might be like, if only for the novelty and excitement of it. Beyond novelty, a successful virtual simulation of space could serve as a powerful educational tool. In today's classrooms, students learn about outer space and the solar system through dry lectures and traditional classroom techniques. Giving students the chance to experience the scale, vastness, and otherworldliness of outer space might pique their interest and inspire them to become the astronauts and scientists of tomorrow.

Secondly, we aimed to develop precise and intuitive physical controls that let a player realistically control his or her movement in a virtual reality environment. One of the primary challenges of virtual reality right now is that users cannot easily interact with their virtual surroundings. Current input options for Gear VR are limited to the trackpad, so to let a player fully interact and move within the virtual environment, we incorporated the Kinect v2 sensor, feeding the user's movements and gestures into the Gear VR application. One challenge we faced was network lag between the Kinect sensor, the PC, and the Android phone/Gear VR, since we wanted motion to feel seamless and natural. A second challenge was designing natural gestures that let users control their motion easily and with high precision.

Methods

We used the Kinect v2 sensor, Kinect Studio, and Visual Gesture Builder to record and detect a user's gestures. With the Kinect connected to our PC laptop, sensing data about the user's body position, skeleton, joints, and movement streamed into Kinect Studio. In Kinect Studio, we recorded video clips ("solution" clips) of subjects performing various gestures (e.g., leaning forward, or holding a hand out in a fist). We then loaded each solution clip into Visual Gesture Builder and tagged it for gestures, marking for the system the parts of the clip where each gesture was being performed. Next, we recorded more clips (training videos) of subjects performing different motions and gestures, including the tagged ones. These training videos were loaded into Visual Gesture Builder, which used the marked sections of the solution clip to decide if and where the gesture in question was being performed. Adding more training videos increased the system's accuracy in detecting and identifying gestures (Figure 2). Ultimately, the clips and the resulting decision algorithm were compiled into a gesture database that could be exported and used by an outside application.

Methods (continued)

Using Unity 5.5, we created a simple app that, using the imported Visual Gesture Builder database, read sensing data from the Kinect and determined which gesture, if any, was being executed. Over the course of the project we implemented several gestures, including leaning left and right, leaning forward and backward, holding an arm out with the hand open (the "stop" gesture), and holding an arm out with the hand in a fist (the "lock" gesture). A sketch of this detection loop follows.
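The poster itself does not include source code, so the following C# sketch only illustrates what the detection loop might look like in Unity, using the Kinect v2 Unity plugin's Visual Gesture Builder API (the approach shown in the Channel 9 tutorial cited in the References). The class name, database filename, and gesture handling details are illustrative assumptions, not taken from the project.

using UnityEngine;
using Windows.Kinect;                          // Kinect v2 Unity plugin
using Microsoft.Kinect.VisualGestureBuilder;

public class GestureDetector : MonoBehaviour
{
    // Hypothetical path to the gesture database exported from Visual Gesture Builder.
    private const string DatabasePath = "OutOfThisWorld.gbd";

    private KinectSensor sensor;
    private VisualGestureBuilderFrameSource gestureSource;
    private VisualGestureBuilderFrameReader gestureReader;

    void Start()
    {
        sensor = KinectSensor.GetDefault();
        if (!sensor.IsOpen) sensor.Open();

        // The source is bound to a tracked body later, once a BodyFrameReader
        // elsewhere reports a valid tracking id.
        gestureSource = VisualGestureBuilderFrameSource.Create(sensor, 0);
        gestureReader = gestureSource.OpenReader();
        gestureReader.IsPaused = true;

        // Load every trained gesture (leans, stop, lock, ...) from the database.
        using (VisualGestureBuilderDatabase db =
               VisualGestureBuilderDatabase.Create(DatabasePath))
        {
            foreach (Gesture gesture in db.AvailableGestures)
                gestureSource.AddGesture(gesture);
        }
    }

    // Called once body tracking finds a user.
    public void SetTrackingId(ulong trackingId)
    {
        gestureSource.TrackingId = trackingId;
        gestureReader.IsPaused = false;
    }

    void Update()
    {
        using (VisualGestureBuilderFrame frame =
               gestureReader.CalculateAndAcquireLatestFrame())
        {
            if (frame == null || frame.DiscreteGestureResults == null) return;

            // Discrete results report whether a gesture is happening;
            // continuous results report how far along it is, which is the
            // progress value forwarded over the socket.
            foreach (var entry in frame.DiscreteGestureResults)
            {
                if (entry.Value.Detected)
                    Debug.Log(entry.Key.Name + " detected, confidence "
                              + entry.Value.Confidence);
            }
        }
    }
}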
Based on two scripts we found online, we built a simple socket server and client. After the first Unity app detected a gesture, the gesture name and its progress level (i.e., how far the user was leaning forward) were sent over the socket to our second Unity app. Only a small amount of data crossed the network (a string and a float), which helped minimize network lag. The second Unity app was the actual solar system app (built with the help of a Solar System package from the Unity Asset Store), with the main camera as a child of a "player" object (Figure 3). A Player Movement script attached to the player moved the player through the solar system according to the received gesture data. The solar system app ran on an Android phone inserted into the VR headset, letting the user move through space in virtual reality using gestures and movements. Sketches of the PC-side broadcaster and the phone-side movement script follow.
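The poster describes the networking layer only as a simple socket server and client, so this is a minimal sketch of what the PC-side broadcaster might look like. The port number and the "gestureName|progress" wire format are assumptions chosen to match the description of sending one string and one float per message.

using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Text;

// Minimal sketch of the PC-side broadcaster. One client (the phone in the
// headset) connects, and each detected gesture is forwarded as a single
// "gestureName|progress" line: just a string and a float, keeping lag low.
public class GestureServer
{
    private TcpListener listener;
    private StreamWriter writer;

    public void Start(int port)   // e.g. 9000; the real port is not documented
    {
        listener = new TcpListener(IPAddress.Any, port);
        listener.Start();
        // Blocks until the headset app connects; in Unity this would run on
        // a background thread so the main loop keeps rendering.
        TcpClient phone = listener.AcceptTcpClient();
        writer = new StreamWriter(phone.GetStream(), Encoding.ASCII) { AutoFlush = true };
    }

    public void Send(string gestureName, float progress)
    {
        writer.WriteLine(gestureName + "|" + progress.ToString("F3"));
    }
}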
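On the phone side, the Player Movement script would then parse those messages and translate the "player" object (the camera's parent). Again a hedged sketch: the gesture names, IP address, and speed constants are invented for illustration, and the lock behavior mirrors the trackpad-tap mechanic described under Results and Conclusions.

using System.IO;
using System.Net.Sockets;
using UnityEngine;

// Sketch of the phone-side Player Movement script. It drains "name|progress"
// messages from the PC each frame and moves the player object accordingly.
public class PlayerMovement : MonoBehaviour
{
    public string serverIp = "192.168.0.10";   // PC running the broadcaster
    public int serverPort = 9000;
    public float maxSpeed = 20f;               // world units/sec at full lean

    private TcpClient client;
    private StreamReader reader;
    private float speed;    // current forward speed
    private bool locked;    // true after the trackpad "lock" tap

    void Start()
    {
        client = new TcpClient(serverIp, serverPort);
        reader = new StreamReader(client.GetStream());
    }

    void Update()
    {
        // Handle every gesture message that arrived since the last frame.
        while (client.Available > 0)
        {
            string[] msg = reader.ReadLine().Split('|');
            string gesture = msg[0];
            float progress = float.Parse(msg[1]);

            if (gesture == "LeanForward" && !locked)
                speed = progress * maxSpeed;   // deeper lean = faster travel
            else if (gesture == "Stop")
            {
                speed = 0f;                    // open palm halts all motion
                locked = false;
            }
        }

        // Head orientation from the VR camera chooses the travel direction,
        // so turning around and leaning forward moves the player that way.
        transform.position += Camera.main.transform.forward * speed * Time.deltaTime;
    }

    // Wired to a Gear VR trackpad tap: hold the current speed until "Stop".
    public void LockSpeed()
    {
        locked = true;
    }
}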
Results and Conclusions

Our final product was a fully working virtual reality application that let users move through the solar system with body gestures. By leaning forward, the user moved forward through the solar system. Gesture control accounted for speed and acceleration: the further the user leaned forward, the faster he or she moved in virtual reality. The user could rotate a full 360 degrees simply by turning around, then travel in the new direction by leaning forward again. To "lock" in a speed, the user could tap the VR headset trackpad, stand up straight, and continue moving forward at a constant speed without staying leaned over. To stop moving, the user outstretched an arm and held the palm open (the "stop" gesture). The player's movement was smooth with minimal lag, and gesture control was precise.

References

"Making Your Body the Controller - Kinect Tutorial for Unity." Channel 9. Web. 15 Mar. 2017.
"How Do You Connect a Gear VR to Kinect?" Quora. Web.
Bielecki, Adam. "Solar System" package. Unity Asset Store.

Acknowledgements

This project was completed as part of CS 234/334 Mobile Computing (Winter 2017), taught by Prof. Andrew A. Chien, with TA support from Gushu Li and Ryan Wu. We gratefully acknowledge the generous support of Samsung in providing Gear VR equipment.