Yingcai Xiao: Interactive Visualization with NUI and Game Engines


Interactive Visualization: allows the user to control how data are represented as graphics and how the graphics are viewed. Human-Computer Interaction (HCI) is critical to Interactive Visualization (IV).

HCI
Three types of HCI:
- CLI: command-line interface (keyboard)
- GUI: graphical user interface (mouse)
- NUI: natural user interface with A/V input (e.g., Kinect)

NUI
Three parts of NUI:
- Hardware: e.g., Kinect
- Software: drivers (e.g., OpenNI) and middleware
- Application: integration of the hardware-enabling software with applications

OpenNI

OpenNI Production Nodes: a set of components that have a productive role in the A/V data-creation process required for Natural Interaction based applications. The API of the production nodes only defines the language; the logic of A/V data generation must be implemented by the modules that plug into OpenNI. For example, for a production node that represents the functionality of generating hand-point data, the logic of hand-point data generation must come from an external middleware component that is both plugged into OpenNI and knows how to produce such data.

OpenNI demo images: (1) body imaging, (2) joint recognition, (3) hand waving.

OpenNI: Sensor-Related Production Nodes
- Device: represents a physical device (a depth sensor or an RGB camera). Its main role is to enable device configuration.
- Depth Generator: generates a depth map. Must be implemented by any 3D sensor that wishes to be certified as OpenNI compliant.
- Image Generator: generates colored image maps. Must be implemented by any color sensor that wishes to be certified as OpenNI compliant.
- IR Generator: generates IR image maps. Must be implemented by any IR sensor that wishes to be certified as OpenNI compliant.
- Audio Generator: generates an audio stream. Must be implemented by any audio device that wishes to be certified as OpenNI compliant.

OpenNI: Middleware-Related Production Nodes
- Gestures Alert Generator: generates callbacks to the application when specific gestures are identified.
- Scene Analyzer: analyzes a scene, including separating the foreground from the background, identifying figures in the scene, and detecting the floor plane. The Scene Analyzer's main output is a labeled depth map, in which each pixel holds a label stating whether it represents a figure or is part of the background.
- Hand Point Generator: supports hand detection and tracking. This node generates callbacks that provide alerts when a hand point (meaning a palm) is detected, and when a hand point currently being tracked changes its location.
- User Generator: generates a representation of a (full or partial) body in the 3D scene.

OpenNI: Recording Production Nodes
- Recorder: implements A/V data recordings.
- Player: reads A/V data from a recording and plays it.
- Codec: used to compress and decompress data in recordings.

OpenNI: Capabilities
OpenNI supports the registration of multiple middleware components and devices. It is released with a specific set of capabilities, with the option of adding further capabilities in the future. Each module can declare the capabilities it supports. Currently supported capabilities:
- Alternative View: enables any type of map generator to transform its data to appear as if the sensor were placed in another location.
- Cropping: enables a map generator to output a selected area of the frame.
- Frame Sync: enables two sensors producing frame data (for example, depth and image) to synchronize their frames so that they arrive at the same time.

OpenNI: Capabilities
Currently supported capabilities:
- Mirror: enables mirroring of the data produced by a generator.
- Pose Detection: enables a user generator to recognize when the user is posed in a specific position.
- Skeleton: enables a user generator to output the skeletal data of the user. This data includes the locations of the skeletal joints, the ability to track skeleton positions, and the user-calibration capabilities.
- User Position: enables a Depth Generator to optimize the output depth map that is generated for a specific area of the scene.

OpenNI: Capabilities
Currently supported capabilities:
- Error State: enables a node to report that it is in an "Error" status, meaning that on a practical level the node may not function properly.
- Lock Aware: enables a node to be locked outside the context boundary.
- Hand Touching FOV Edge: alerts when the hand point reaches the boundaries of the field of view.

OpenNI: Generating and Reading Data
Production nodes that produce data are called Generators. Once created, they do not immediately start generating data, to enable the application to set the required configuration first. The xn::Generator::StartGenerating() function is used to begin generating data; xn::Generator::StopGenerating() stops it. Data Generators "hide" new data internally until explicitly requested to expose the most updated data to the application, using the UpdateData request function. OpenNI enables the application to wait for new data to be available and then update it, using the xn::Generator::WaitAndUpdateData() function.

Interactive Visualization with a Game Engine

Video Game
Interactive animation: user -> interface -> game-object action -> feedback (A/V, haptic). Game objects can represent data.

Video Game (block diagram): User -> Controller -> Game (Software) -> Display -> User.

Video Game (block diagram): Input Device Driver -> Game (Software) -> Display Device Driver (GDI).

Software for Kinect-based game development
- OpenNI: a general-purpose framework for obtaining data from 3D sensors
- SensorKinect: the driver for interfacing with the Microsoft Kinect
- NITE: a skeleton-tracking and gesture-recognition library
- Unity3D: a game engine
- ZigFu: Unity package for Kinect (assets and scripts)

Unity3D
- IDE
- Assets (graphics, scripts)
- Scripts (algorithms)
- Engine
- Tutorial
- Examples

Game Assets for Interactive Visualization
Game assets: game objects and animation scripts. Game objects can be real-world objects, artistic virtual objects, or data objects. Scripts can be used to make them interactive. For interactive visualization, we just create assets; the game engine takes care of everything else (including physics).

Game Asset Creators
- Blender: blender.org
- Maya: AutoDesk.com
- 3ds Max: AutoDesk.com
- MotionBuilder: AutoDesk.com
- Visualization and Animation at the AutoDesk Student Center
- Jmol: for visualization of chemical and biological structures
All free for students.

Unity 3D IDE
IDE: Integrated Development Environment
Project: directory and files for a specific game project, e.g.:
C:\Users\xiao\Documents\New Unity Project 1
- \Assets (anything you can reuse)
- \Library (binary files)

Unity 3D: Assets
C:\Users\xiao\Documents\New Unity Project 1\Assets (anything you can reuse)
- \Standard Assets
- \OpenNI
- \Scripts
- \_Scenes
- \Materials\Artwork

Unity 3D: Standard Assets
C:\Users\xiao\Documents\New Unity Project 1\Assets\Standard Assets
- Objects (look): \Tree, \Terrain, \Character
- Lights (look): \Light Flares, \Light Cookies
- Code (feel: control, interaction, animation, ...): \Scripts

Unity 3D: Objects
C:\Users\xiao\Documents\New Unity Project 1\Assets\Standard Assets\Character:
- Prefab (predefined objects): First Person, 3rd Person
- \Source:
  - \Prototype (look): Constructor.FBX
  - \Materials (properties)
  - \Textures (images)
  - \Scripts (feel: actions)
    - JavaScript: ThirdPersonController.js
    - C#: MouseLook.cs

Unity 3D: Scripts
Languages:
- Interpreted: JavaScript
- Compiled: C#
Usages:
- General: under Project\Scripts, e.g., ExitOnEscape.cs
- Objects: attached to objects, e.g., ThirdPersonController.js

Unity 3D: Library
- cache: for speeding up processing
- metadata: data that describes data
- previews: for previewing scenes
- ScriptAssemblies: compiled object assemblies for scripts

Unity 3D: GUI
- Start Unity
- File -> Create Project
- Select Assets (Character, Lights, Scripts, Sky, Terrain, Tree)
- Assets -> Import Package -> Custom Package: UnityOpenNIBindings-v1.4.unitypackage
- File -> New Scene
- File -> Save Scene

Unity 3D: GUI
- Terrain -> Create Terrain
- Terrain -> Set Resolution: width = 300; height = 300; length = 300
- GameObject -> Create Other -> Directional Light
- Adjust the light direction to the terrain

Unity 3D: Game Objects
Background objects: terrain and sky.
- Terrain: elevation grid; adjustable height and texture; add-ons: trees, stones, ...
- Sky: texture, static view
Foreground objects: objects that can be animated.

Unity 3D: Game Objects
- Rigid objects: non-deformable, with physical properties (gravity, inertia).
- Non-rigid objects:
  - Deformable: changeable geometry
  - Breakable: changeable topology
- Intangible objects: no predefined shape (fire, clouds, ...).

Summary
- Interaction: NUI, Kinect
- Visualization: game objects, game engine