Out of this World A VR Experience brought to you by Team WFF:

Emily Reinherz, Nina Lozinski, and Valerie Ding

Goals

- Compelling outer space experience – allow users to feel as though they are truly flying through outer space, with realistic graphics of the solar system
- Seamless VR motion control through Kinect – provide smooth movement in VR using Kinect gestures, while aiming to minimize lag and latency

The Development Process

Milestones

Week 6 – Started building gesture database using Kinect and Visual Gesture Builder
Week 7 – Built simple Unity app, recorded training videos, and tested gesture detection in Unity
Week 8 – Built solar system app in Unity and continued implementing gestures
Week 9 – Implemented networking between PC and Android by sending Kinect data through a socket server
Week 10 – Tested VR movement through Kinect; added and modified gestures

Hardware and Software

Kinect Studio
- Record clips of people performing various gestures (leaning forwards and backwards, swaying left and right) in front of the Kinect

Visual Gesture Builder
- "Tag" the clips, identifying which parts of each clip should be recognized as specific gestures
- Add the training videos to a database, and Gesture Builder will "learn" to recognize the specific gestures
- Build and export the gesture database

Unity
- Imports our gesture database
- Attached scripts grab sensing data from the Kinect, compare it to our database, and determine whether a gesture is being executed
- Based on which gesture is detected, the player object moves around in the solar system app

Gear VR and Android
- Build the Unity app on Android and plug the phone into the Gear VR headset
- Kinect data is sent from the Kinect to the PC to the Android phone
- The user can move through the solar system in VR using gestures!
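As a rough illustration of the gesture-to-movement logic described above, here is a sketch in Python rather than the project's Unity C#; the gesture names, directions, and speed constant are all hypothetical, not taken from the project.

```python
# Sketch of the gesture -> player-movement mapping described above.
# Gesture names and the speed constant are hypothetical; the real
# project uses Unity C# scripts driven by the gesture database.

MOVE_SPEED = 5.0  # units per second (hypothetical)

# Map each gesture to a movement direction on the (x, z) plane.
GESTURE_DIRECTIONS = {
    "lean_forward": (0.0, 1.0),
    "lean_backward": (0.0, -1.0),
    "sway_left": (-1.0, 0.0),
    "sway_right": (1.0, 0.0),
}

def move_player(position, gestures, dt):
    """Advance the player position given detected gesture progress.

    `gestures` maps a gesture name to its progress in [0, 1], i.e.
    how strongly the detector thinks the user is performing it.
    """
    x, z = position
    for name, progress in gestures.items():
        dx, dz = GESTURE_DIRECTIONS.get(name, (0.0, 0.0))
        # Scale movement by how far the user is leaning/swaying.
        x += dx * progress * MOVE_SPEED * dt
        z += dz * progress * MOVE_SPEED * dt
    return (x, z)
```

Scaling movement by the progress value gives the "lean further to move faster" feel; a gesture with zero progress contributes no movement at all.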

Networking

2 Unity apps:

Kinect manager and broadcaster (PC side):
- Gets sensing data from the Kinect and detects gestures
- Sends the detected gesture (e.g. "leaning forward with 0.8 progress") over a socket server

Solar system app and player movement (Android/VR side):
- Receives the Kinect data sent over the server
- Based on the received gesture data, moves the player in the app

Design Decisions

Finding Natural Gestures

Basic requirements:
- The majority of users should be able to use the app
- Gestures must work equally well in any direction (i.e. even if the user is not facing the Kinect)
- Stop/lock gestures

Further requirements:
- Gestures should feel natural to the user and translate into movement such that user nausea/dizziness is uncommon
- Fine-tuning speed, etc.

Finding Natural Gestures (pt 2)

Because movement had to work regardless of whether the player was facing the Kinect, we user-tested multiple versions of gesture control:
- Version 1: Lean and Sway
- Version 2: Multilean
- Version 3: Multilean + Stop
- Version 4: Multilean + Lock
- Version 5: Multilean + Tap

Reducing Network Latency

Timing needs to be precise because of the planets' small size relative to space.

- Send as little data as possible between the PC and the Android phone: send an array of gestures, where each gesture is simply one float, its progress level (e.g. how far someone is leaning forward)
- Have player movement be controlled mostly in Unity: based on the gesture detected, a Unity script moves the player correspondingly, which uses less data and incurs less lag than body tracking and sending every joint
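The "array of floats" payload described above could be packed like this (a Python sketch; the fixed gesture order and the little-endian float32 layout are assumptions, not the project's actual wire format):

```python
import struct

# Hypothetical fixed gesture order, agreed on by both sides in advance.
# Sending only the progress floats (4 bytes each) keeps the per-frame
# payload tiny compared to streaming full joint positions.
GESTURE_ORDER = ["lean_forward", "lean_backward", "sway_left", "sway_right"]

def pack_progress(progress):
    """Pack one progress float per gesture into a compact binary payload."""
    values = [progress.get(name, 0.0) for name in GESTURE_ORDER]
    return struct.pack(f"<{len(values)}f", *values)

def unpack_progress(payload):
    """Recover the gesture -> progress mapping on the receiving side."""
    values = struct.unpack(f"<{len(GESTURE_ORDER)}f", payload)
    return dict(zip(GESTURE_ORDER, values))
```

With four gestures this is a fixed 16 bytes per update, versus hundreds of bytes per frame for the 25 three-dimensional joints of full Kinect body tracking.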

Interesting and Fun VR Experience

Problem: Rendering the solar system to scale would make it difficult and boring to navigate in VR (the user would move through empty space for minutes before seeing anything), and automating player movement would defeat the purpose of the Kinect gestures.

Solution: The distances between planets are not to scale, making the solar system more enjoyable and easier for users to move through. However, planet rotation and revolution are to scale.
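One way to keep revolution to scale while compressing distances is to drive only the orbit angle from the real periods, as in this hypothetical Python sketch (the orbital periods are real values in Earth days, but the orbit radii are made-up "game units", not true distances):

```python
import math

# Sidereal orbital periods in Earth days (real values); the radii are
# hypothetical compressed "game units", not to-scale distances.
PLANETS = {
    "Mercury": {"period_days": 88.0, "radius_units": 4.0},
    "Earth": {"period_days": 365.25, "radius_units": 10.0},
    "Mars": {"period_days": 687.0, "radius_units": 15.0},
}

def planet_position(name, elapsed_days):
    """Orbit angle advances at the true rate; only the radius is scaled."""
    planet = PLANETS[name]
    angle = 2.0 * math.pi * (elapsed_days / planet["period_days"])
    r = planet["radius_units"]
    return (r * math.cos(angle), r * math.sin(angle))
```

Because the angle depends only on the real period, each planet completes an orbit in the correct relative time even though the distances between orbits are compressed.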

Demo

Questions?