VR/AR Project Progress Report 2016/07/14



Live Reality Fusion Stitch the live video feeds from two or more rooms together into a single view. ◦ Observer room + remote room(s) (diagram: remote room and observer room)

Short-term Goal “See-Through Wall” ◦ Visually connect rooms R106 and R107 in the IIS building. ◦ Static => dynamic => real-time

Plan Construct a model of the remote room. ◦ View + depth of items. Determine the distance and angle between the observer and the target wall in the observer room. Replace the target wall in the live video with the view of the remote room.

Stereo Camera ZED stereo camera ◦ A 3D camera for depth sensing and motion tracking.

Plan Construct a model of the remote room. ◦ View + depth of items. ◦ Offline, construct a 3D model of the remote room with the ZED stereo camera. Determine the distance and angle between the observer and the target wall in the observer room. Replace the target wall in the live video with the view of the remote room.
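Building a room model from a view + depth comes down to back-projecting each depth pixel into a 3D point with the pinhole camera model. A minimal NumPy sketch; the intrinsics `fx, fy, cx, cy` are placeholders that would in practice come from the ZED camera calibration:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (in meters) into camera-frame 3D points
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)  # shape (h, w, 3)
```

Aggregating these per-frame point clouds over several camera poses would yield the offline room model.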

Plan Construct a model of the remote room. ◦ View + depth of items. Determine the distance and angle between the observer and the target wall in the observer room. ◦ Stereo camera + HMD. Replace the target wall in the live video with the view of the remote room.
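The distance and angle to the target wall can be estimated by fitting a plane to the wall's depth points. This is only an illustrative least-squares sketch (SVD on the centered points), not the project's actual implementation; the default view direction along +Z is an assumption:

```python
import numpy as np

def wall_distance_angle(points, view_dir=np.array([0.0, 0.0, 1.0])):
    """Fit a plane to wall points (N, 3) by SVD, then return the observer's
    (origin's) perpendicular distance to the wall and the angle in degrees
    between the viewing direction and the wall normal."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                      # right vector of the smallest singular value
    distance = abs(normal @ centroid)    # origin-to-plane distance
    cosang = abs(normal @ view_dir) / np.linalg.norm(view_dir)
    angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return distance, angle
```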

Stereo Camera + HMD HMDs such as the Oculus Rift DK2 do not have a built-in camera. Mount the ZED stereo camera on top of the HMD. ◦ It then serves as a camera, range finder, and tracker.
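With the camera rigidly mounted on the HMD, its world pose is the tracked HMD pose composed with a fixed mount offset. A sketch with homogeneous transforms; the 5 cm vertical offset is a made-up example value, not a measured one:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical fixed offset: camera sits 5 cm above the HMD origin.
T_hmd_cam = make_transform(np.eye(3), [0.0, 0.05, 0.0])

def camera_world_pose(T_world_hmd):
    """Chain the tracked HMD pose with the fixed camera mount offset."""
    return T_world_hmd @ T_hmd_cam
```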

Plan Construct a model of the remote room. ◦ View + depth of items. Determine the distance and angle between the observer and the target wall in the observer room. Replace the target wall in the live video with the view of the remote room. ◦ Use the depth map to determine which part of the current image is the wall to be replaced.
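The replacement step above can be sketched as a per-pixel depth test: pixels whose depth matches the previously estimated wall distance are swapped for the remote-room pixels. A naive NumPy version, assuming live and remote frames are already aligned; `wall_depth` and `tol` are illustrative parameters:

```python
import numpy as np

def replace_wall(live_rgb, depth, remote_rgb, wall_depth, tol=0.10):
    """Replace pixels whose depth lies within `tol` meters of the
    estimated wall depth with the corresponding remote-room pixels."""
    mask = np.abs(depth - wall_depth) < tol  # True where the wall is
    out = live_rgb.copy()
    out[mask] = remote_rgb[mask]
    return out
```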

Remaining Issues How to implement the previously mentioned functionality with the stereo camera? Tracking dynamic objects in the remote room. Handling changes in the remote room. ◦ Lighting, new static objects, etc.

Current Status Waiting for the ZED stereo camera to arrive. Studying the APIs and sample code. Implementing ◦ Wall/background identification in images from the stereo camera.
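One naive way to start on wall/background identification from a depth image is a histogram heuristic: treat the dominant far depth as the wall. This is only a placeholder sketch, not the project's method (a plane fit would be more robust); `n_bins` and `tol` are made-up parameters:

```python
import numpy as np

def background_mask(depth, n_bins=40, tol=0.1):
    """Heuristic wall/background detector: histogram the valid depths,
    pick the deepest well-populated bin, and mask pixels near its center."""
    valid = depth[np.isfinite(depth) & (depth > 0)]
    hist, edges = np.histogram(valid, bins=n_bins)
    candidates = np.where(hist >= 0.5 * hist.max())[0]  # well-populated bins
    k = candidates[-1]                                   # deepest of them
    center = 0.5 * (edges[k] + edges[k + 1])
    return np.abs(depth - center) <= tol
```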

Discussion