Depth Perception in Medical Augmented Reality

Presentation transcript:

Depth Perception in Medical Augmented Reality
Computer Aided Medical Procedures
Fabian Prada, Javad Fotouhi, Sunyan Lee

Problem Statement
- Intuitive perception of depth
- The goal is not to implement different depth cues, but to develop an interactive evaluation/training setup
- Depth perception using different visualization depth cues: shadow, fog, reflection, reference, heat

Notes: The problem we are addressing is the need for intuitive depth perception and compelling graphics for proper integration of AR into clinical routines. Most proposed techniques suffer from poor depth perception; without such depth cues, the virtual objects (here, the lymph nodes) appear to float on top of the other anatomical tissue. The goal is not to come up with novel depth cues, but to propose a framework that can be used both for evaluating and for training the use of depth cues. Our observation, which we return to later, is that good graphics alone cannot solve the problem unless there is proper training during which the user becomes comfortable with the system.
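The shadow cue, for example, can be produced by projecting each virtual object onto a ground plane along the light direction and drawing the footprint as a dark blob. A minimal sketch of that projection follows; the plane, light direction, and coordinates are illustrative assumptions, not the presenters' actual renderer.

```python
import numpy as np

def project_onto_plane(points, light_dir, plane_y=0.0):
    """Project 3D points onto the horizontal plane y = plane_y along light_dir.

    The projected footprint can be rendered as a dark, semi-transparent blob
    to act as a drop-shadow depth cue for an otherwise "floating" object.
    """
    points = np.asarray(points, dtype=float)
    light_dir = np.asarray(light_dir, dtype=float)
    # Parameter t such that (p + t * light_dir).y == plane_y for each point p.
    t = (plane_y - points[:, 1]) / light_dir[1]
    return points + t[:, None] * light_dir

# Example: a virtual lymph-node sphere centre 12 cm above the table,
# lit by a light slanting slightly along x (all values made up).
shadow = project_onto_plane([[0.03, 0.12, 0.05]], light_dir=[0.2, -1.0, 0.0])
print(shadow)  # footprint of the sphere centre on the table plane
```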

Related Work
- Three different visualizations with time-dependent animations: transparency, fog, glowing

Notes: This is the work of Felix and Anja, who demonstrated three visualization techniques (transparency, fog, and glowing) on three models of lymph nodes in the context of breast cancer. In every trial, a different time-varying animation was shown to the user.

Motivation and Limitations of a Gaming Framework
- Motivation: use the framework not just for evaluation, but for training purposes as well.
- Limitation: many games leak additional depth information.
- Blowing bubbles: good for training purposes, but not suitable for evaluation.
- Ring toss: the initial idea was to have the user carry the rings and place them on top of the poles; however, the rings moving with the tool through space provide additional depth information, so this method was likewise found suitable only for training.

Notes: Our intention was to develop an interactive gaming framework, which is, first, more engaging from the user's perspective and, second, usable for training at the same time. The bottleneck is that many gaming scenarios give additional information to the user, which inflates the success rate. Blowing bubbles: suppose the task is to move the tool toward virtual spheres (representing the lymph nodes) that pop like bubbles when hit. The user then knows they have not yet reached the object until it pops, so the method is suitable for training, until the user becomes comfortable judging the depth of the objects, but not for evaluation. Ring toss: similarly, imagine a game in which a set of rings substitutes for the spheres and the user is asked to place them correctly on a set of poles. The rings must be steered according to the motion of the tool carried by the user, so once the rings are synced to the motion of the hand they give the user additional depth information.
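A hedged sketch of the hit test behind the bubble game: the bubble pops only when the tracked tool tip enters the sphere, so the user gets no graded depth feedback before the hit. Names, coordinates, and the radius are illustrative, not taken from the presentation.

```python
import numpy as np

def update_bubbles(tool_tip, bubbles, radius=0.015):
    """Pop (remove) every bubble whose centre lies within `radius` of the tool tip.

    Returns the surviving bubbles and the number of pops this frame. Note the
    binary nature of the feedback: until the tip is inside a bubble, the user
    gets no hint about how far off in depth they are, which is why this game
    suits training but not evaluation.
    """
    tool_tip = np.asarray(tool_tip, dtype=float)
    survivors, pops = [], 0
    for centre in bubbles:
        if np.linalg.norm(np.asarray(centre) - tool_tip) <= radius:
            pops += 1              # hit: the bubble "blows out"
        else:
            survivors.append(centre)
    return survivors, pops

bubbles = [[0.00, 0.10, 0.05], [0.04, 0.12, 0.02]]
bubbles, pops = update_bubbles([0.001, 0.101, 0.049], bubbles)
print(pops, len(bubbles))  # -> 1 1
```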

Requirements
- Leap Motion: senses the hand's movements; composed of 2 IR cameras; similar to the Kinect, but with a smaller observation area and higher resolution
- An elongated tool to perform the task

Notes: The Leap Motion controller senses how you naturally move your hands and provides hand (or tool) tracking for the system. It is a small USB peripheral designed to be placed on a physical desktop, facing upward. Using two monochromatic IR cameras, the device observes a roughly hemispherical area to a distance of about 1 meter. The cameras capture close to 300 frames per second of reflected data, which is sent over USB to the host computer, where the Leap Motion software synthesizes 3D position data from the 2D frames of the two cameras. The smaller observation area and higher resolution differentiate the device from the Kinect.
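A minimal polling sketch for reading the tracked tool tip, written against the legacy Leap Motion V2 Python bindings (the `Leap` module shipped with that SDK); exact module and attribute names may differ in other SDK versions, and this is an assumption about the setup rather than the presenters' code.

```python
import time
import Leap  # legacy Leap Motion V2 Python bindings (assumed available)

controller = Leap.Controller()
time.sleep(0.5)  # give the tracking service a moment to connect

for _ in range(100):
    frame = controller.frame()
    for tool in frame.tools:       # elongated pointables, e.g. a pen or stylus
        tip = tool.tip_position    # millimetres, in the controller's coordinate frame
        print("tool tip: x=%.1f y=%.1f z=%.1f" % (tip.x, tip.y, tip.z))
    time.sleep(0.01)
```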

Demo

Precision Test
- Maximum precision: 2 mm
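One plausible way to arrive at such a precision figure, not necessarily how it was measured here, is to hold the tool still, record the reported tip positions, and take the spread of the samples. The sketch below uses made-up sample values.

```python
import numpy as np

def precision_mm(samples):
    """Estimate tracking precision as the RMS distance of repeated
    measurements of a stationary tool tip from their mean position."""
    samples = np.asarray(samples, dtype=float)   # shape (N, 3), millimetres
    deviations = samples - samples.mean(axis=0)
    return float(np.sqrt((np.linalg.norm(deviations, axis=1) ** 2).mean()))

# Illustrative stationary recording (values are made up):
samples = [[10.0, 200.1, -5.0], [10.4, 199.8, -4.7], [9.7, 200.3, -5.2]]
print("precision ~ %.1f mm" % precision_mm(samples))
```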

Evaluation
- Results obtained from previous experiments [table: Glow, Transparency, and Fog scores across Animation 0 through Animation 3]
- Current evaluation on 5 separate depth cues

Conclusion
- All of these depth cues are necessary, but on their own they still cannot convey the virtual world.
- Participants found the heat map the most helpful cue.
- Participants found shadow and reflection almost equally helpful.

Future Development
- Segmentation
- Improved environment (e.g. Oculus Rift)
- From VR to AR
- Educating the user
- Appealing UI
- Improved smoothing
- Frequency-based sampling: zero crossings, changes of gradient direction, and use of the normal information
- State of the art: the video stabilization technique used on the iPhone [1]

Notes: It makes more sense for the user to observe their own hands and tools in the virtual environment. The Oculus Rift is a virtual reality head-mounted display that provides a more realistic environment than a computer display, since the user can move their head and observe a 3D scene. From VR to AR: to provide a more realistic evaluation, the evaluation environment should be as close as possible to the target application; for medical purposes, an AR environment rather than a fully virtual one will lead to a better evaluation. Educating the user: like any medical technology, the user has to be trained; very realistic depth cues are valuable, but without training the user remains prone to error. iPhone-style stabilization: automatic post-processing that adds constraints to mimic natural hand motion, avoiding sudden changes in acceleration and favoring constant velocity.

[Figures: the raw Leap signal, and the Leap signal zoomed into a millisecond window]

[1] Grundmann, Matthias, Vivek Kwatra, and Irfan Essa. "Auto-directed video stabilization with robust L1 optimal camera paths." In Computer Vision and Pattern Recognition (CVPR), 2011 IEEE Conference on, pp. 225-232. IEEE, 2011.
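The "improved smoothing" item could start from something as simple as an exponential moving average on the Leap signal before moving to the constrained, L1-optimal path idea of [1]. A minimal sketch of that baseline, with an illustrative smoothing factor, is shown below; it is not the method of [1] and not the presenters' filter.

```python
import numpy as np

def ema_smooth(positions, alpha=0.3):
    """Exponentially smooth a stream of 3D tool-tip positions.

    A small alpha suppresses jitter (sudden acceleration changes) at the cost
    of extra latency; this is only a simple baseline, not the L1-optimal
    camera-path optimization of Grundmann et al. [1].
    """
    positions = np.asarray(positions, dtype=float)
    out = np.empty_like(positions)
    out[0] = positions[0]
    for i in range(1, len(positions)):
        out[i] = alpha * positions[i] + (1.0 - alpha) * out[i - 1]
    return out

raw = [[0, 0, 0], [1.0, 0.2, 0], [0.4, -0.1, 0], [1.2, 0.3, 0]]
print(ema_smooth(raw))
```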