CINEMAcraft: Virtual Minecraft Presence Using OPERAcraft


Ico Bukvic, Amira Youssef, Siddharth Narayanan, Brittany Barnes, Elsi Godolja, Marina Kiseleva, Lynn Abbott
Department of Computer Science, Department of Electrical and Computer Engineering, SOPA, ICAT, Virginia Tech, Blacksburg, VA

OBJECTIVES

Interactive virtual realities encourage students to pursue and explore STEM fields. CINEMAcraft engages users by mirroring their movements through a virtual presence in the Minecraft world. The system expands upon OPERAcraft, a modification of Minecraft developed at Virginia Tech that allows K-12 students to create opera in real time within the Minecraft world. CINEMAcraft aims to change how real-time productions are produced, filmed, and viewed. Future development includes reintegrating the system back into OPERAcraft, gender-responsive avatars that reflect user genders, and the ability to mirror multiple users at once through several avatars.

BACKGROUND

The system uses two Microsoft Kinect motion-tracking sensors that infer the locations of the user's joints and facial points, as shown in Figure 1. Angles of movement are calculated from the processed spatial data and sent through a middleware layer, Pd-L2Ork (the Pure Data variant of L2Ork developed at Virginia Tech), to the adapted OPERAcraft system. OPERAcraft's existing keyboard-driven avatar manipulation was adapted to respond to the continual real-time stream of body and facial movement data from the Kinect devices. Legs, arms, torso, shoulders, facial expressions, and jumping status are each monitored and computed separately on the Kinect side, with their corresponding values normalized through Pd-L2Ork before being sent to the OPERAcraft system. The result is a realistic reflection of the user in the form of an avatar in the Minecraft world, as seen in Figure 2.
RESULTS

Figure 1: The Kinect devices infer the 3D locations of joints on the body of the user (panels: Skeletal Tracking, Facial Tracking).

Figure 2: These pairs of images depict the Kinect skeletal representations of a user and their corresponding avatar mirroring within OPERAcraft. Facial expressions are not pictured.

The avatar's movement is based on joint angles inferred from the Kinect spatial skeletal data (note the circled points in Figure 1: wrist, elbow, shoulder, torso, knee, and ankle). 3-D vectors are computed from the X, Y, and Z coordinates of each joint. For arm and leg movement, the angles between corresponding vectors are computed to determine the degree of rotation of the limbs, as seen in Figure 3. To detect facial expressions, thresholds were set on animation units (AUs), provided by the Kinect SDK, which represent measured changes from the average positions of specific facial points. Key measurements included points around the mouth and eyebrows.

Figure 3: There are three main axes of rotation for limbs: around the Z axis (vertically, in the Y direction), around the Y axis (horizontally, in the X direction), and around the X axis (vertically face-on, in the Z direction).

Figure 4: Above are some of the Minecraft facial-expression helmets created to represent specific facial expressions detected by the Kinect.
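The limb-angle computation described above amounts to building a vector for each bone segment from the joint coordinates and taking the angle between adjacent segments via the dot product. A minimal sketch, with hypothetical joint positions:

```python
import math

def vector(a, b):
    """Vector from joint a to joint b, each an (x, y, z) tuple."""
    return (b[0] - a[0], b[1] - a[1], b[2] - a[2])

def angle_between(u, v):
    """Angle in degrees between two 3-D vectors, via the dot product.
    cos(theta) is clamped to [-1, 1] to guard against rounding error."""
    dot = sum(ui * vi for ui, vi in zip(u, v))
    nu = math.sqrt(sum(ui * ui for ui in u))
    nv = math.sqrt(sum(vi * vi for vi in v))
    cos_theta = max(-1.0, min(1.0, dot / (nu * nv)))
    return math.degrees(math.acos(cos_theta))

# Hypothetical joint positions for one arm: shoulder, elbow, wrist.
shoulder, elbow, wrist = (0, 0, 0), (0, -1, 0), (1, -1, 0)
upper_arm = vector(shoulder, elbow)  # points straight down
forearm = vector(elbow, wrist)       # points sideways
print(angle_between(upper_arm, forearm))  # -> 90.0
```

The same pattern applies to the legs (hip-knee and knee-ankle segments); only the joint triples change.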
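The AU-thresholding step for facial expressions can likewise be sketched as a simple set of comparisons. The AU names below echo those exposed by the Kinect SDK's face tracking (e.g. jaw lower, lip corner depressor, outer brow raiser), but the exact threshold values and the mapping from AUs to expression labels here are illustrative assumptions, not the tuned values used in CINEMAcraft.

```python
# Sketch of expression detection by thresholding Kinect animation
# units (AUs). AU keys and the 0.5 thresholds are illustrative;
# the Kinect SDK reports AUs as floats roughly in [-1, 1].

def classify_expression(aus):
    """Map a dict of AU values to one coarse expression label,
    e.g. to pick a facial-expression helmet for the avatar."""
    if aus.get("lip_corner_depressor", 0.0) > 0.5:
        return "frown"
    if aus.get("jaw_lower", 0.0) > 0.5:
        return "mouth_open"
    if aus.get("outer_brow_raiser", 0.0) > 0.5:
        return "surprised"
    return "neutral"

print(classify_expression({"jaw_lower": 0.8}))  # -> mouth_open
```

A per-frame classifier like this is deliberately coarse: each label only needs to select one of the discrete facial-expression helmets shown in Figure 4.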