Experiences in Extemporaneous Incorporation of Real Objects in Immersive Virtual Environments

Benjamin Lok, University of Florida
Samir Naik, Disney VR Studios
Mary Whitton and Frederick P. Brooks Jr., University of North Carolina at Chapel Hill

March 28, 2004
Objects in Immersive VEs

- Most objects in immersive VEs are virtual (tools, parts, users' limbs).
- They are not registered with a corresponding real object, so shape and motion information is limited.
- Example: unscrewing a virtual oil filter from a car engine model.
- Ideally, users would handle real objects: improved look, feel, affordance, and interaction.
- Solution: track and model dynamic real objects.
Extemporaneously Incorporating Real Objects

- Motivation
- The object reconstruction system
- Experiences in creating three different VE projects
- Goal: motivate others to explore this approach

Benefits:
- More efficient exploration of designs and scenarios
- Faster development
- More natural interaction
- More like the actual task

Negatives:
- Accuracy and fidelity
- Poor haptic response from virtual objects
- Hard to get input (e.g., button presses) from real objects
Real-Time Object Reconstruction System

- Inputs: outside-looking-in camera images of the real objects in a scene
- Outputs: an approximation of the real objects (their visual hull)
- Interactive rates (15-18 fps, ~1 cm error)
- Handles dynamic objects, generating a virtual representation on the fly
- Runs on an SGI Reality Monster, which handles up to 7 video feeds (a PC solution is possible)
Reconstruction Algorithm

1. Start with live camera images.
2. Perform image subtraction to extract object silhouettes.
3. Use the silhouette images to calculate the volume intersection (the visual hull).
4. Composite the reconstruction with the VE.

A minimal sketch of steps 2 and 3 appears below.
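The slides give only the step names, not an implementation. The following is a minimal C++ sketch of steps 2 and 3, assuming fixed, calibrated cameras with a known projection function and a coarse voxel grid; the actual system intersected the silhouettes directly in image space on graphics hardware, so this voxel-carving version and the type names (Image, Camera, carveVisualHull) are illustrative assumptions only.

```cpp
// Sketch of steps 2-3: background subtraction per camera, then volume intersection.
#include <array>
#include <cstdint>
#include <cstdlib>
#include <functional>
#include <vector>

struct Pixel { uint8_t r, g, b; };

struct Image {
    int width = 0, height = 0;
    std::vector<Pixel> data;                              // row-major pixel buffer
    const Pixel& at(int x, int y) const { return data[y * width + x]; }
};

struct Camera {
    Image background;                                     // empty-scene reference image
    Image current;                                        // live frame
    // Projects a 3D world point to pixel coordinates; returns false if off-screen.
    std::function<bool(const std::array<float, 3>&, int&, int&)> project;
};

// Step 2: image subtraction -- a pixel belongs to a real object if it differs
// from the background reference by more than a threshold.
bool isObjectPixel(const Camera& cam, int x, int y, int threshold = 30) {
    const Pixel& c = cam.current.at(x, y);
    const Pixel& b = cam.background.at(x, y);
    int diff = std::abs(c.r - b.r) + std::abs(c.g - b.g) + std::abs(c.b - b.b);
    return diff > threshold;
}

// Step 3: volume intersection -- keep a voxel only if it projects onto an object
// silhouette in every camera. The surviving voxels approximate the visual hull.
std::vector<std::array<float, 3>> carveVisualHull(
        const std::vector<Camera>& cams,
        const std::vector<std::array<float, 3>>& voxelCenters) {
    std::vector<std::array<float, 3>> hull;
    for (const auto& v : voxelCenters) {
        bool insideAll = true;
        for (const auto& cam : cams) {
            int px, py;
            if (!cam.project(v, px, py) || !isObjectPixel(cam, px, py)) {
                insideAll = false;
                break;
            }
        }
        if (insideAll) hull.push_back(v);                 // part of the real-object avatar
    }
    return hull;
}
```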
Managing Collisions Between Virtual and Dynamic Real Objects

- We want virtual objects to respond to real-object avatars.
- This requires detecting when real and virtual objects intersect.
- If intersections exist, determine plausible responses.
- Only virtual objects can move or deform at a collision; the real objects are unaffected.
- The result merges the real and virtual spaces (see the sketch below).
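To illustrate the "only virtual objects move" rule, here is a minimal sketch that tests a virtual object against the reconstructed hull points from the previous sketch and pushes only the virtual object out of the intersection. Approximating the virtual object by a bounding sphere is my simplification, not the actual system's collision response.

```cpp
// Detect hull points inside the virtual object's bounds and, because only virtual
// objects may move, push the virtual object out along the average penetration direction.
#include <array>
#include <cmath>
#include <vector>

struct VirtualObject {
    std::array<float, 3> center;   // bounding-sphere center
    float radius;                  // bounding-sphere radius
};

bool resolveCollision(VirtualObject& obj,
                      const std::vector<std::array<float, 3>>& hullPoints) {
    std::array<float, 3> push = {0.f, 0.f, 0.f};
    int hits = 0;
    for (const auto& p : hullPoints) {
        float dx = obj.center[0] - p[0];
        float dy = obj.center[1] - p[1];
        float dz = obj.center[2] - p[2];
        float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (dist < obj.radius && dist > 1e-6f) {
            float depth = obj.radius - dist;          // penetration depth for this point
            push[0] += dx / dist * depth;
            push[1] += dy / dist * depth;
            push[2] += dz / dist * depth;
            ++hits;
        }
    }
    if (hits == 0) return false;                      // no intersection with the real object
    obj.center[0] += push[0] / hits;                  // move only the virtual object
    obj.center[1] += push[1] / hits;
    obj.center[2] += push[2] / hits;
    return true;
}
```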
Case Studies

- User Study (VR 2003)
  - Interaction: manipulate real objects in VR
  - Development time: virtual version, 3 weeks; hybrid version, 1 day
- Physics Simulation (I3D 2003 / SIGGRAPH 2003)
- NASA (ACM 8/04)
Real Space Environment

- The user manipulated blocks to replicate a pattern.
- Traditional approaches:
  - Real blocks with camera or magnetic tracking
  - A purely virtual approach
Purely Virtual Environment

- The participant manipulated virtual objects.
- The participant was presented with a generic avatar.
Hybrid Environment

- The participant manipulated real objects.
- The participant was presented with a generic avatar.
Visually Faithful Hybrid Environment

- The participant manipulated real objects.
- The participant was presented with a visually faithful avatar.
Virtual vs. Hybrid

Virtual:
- Incorporate pinch gloves and a magnetic tracking system
- Register with the avatar model
- Model the virtual blocks

Hybrid:
- Two-handed interaction
- Motion constraints
- Improved task performance
- Increased visual (reconstruction) errors
- More like the actual task
Case Studies

- User Study (VR 2003)
- Physics Simulation (I3D 2003 / SIGGRAPH 2003)
  - Combined the hybrid system with a collision detection package (SWIFT++) and other simple simulations
  - Added a new 'collision' function call (see the sketch below)
  - Development time: 1 day
- NASA (ACM 8/04)
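The slide says the hybrid system was exposed to the physics simulation through a single new 'collision' call. The sketch below shows, as an assumption, how a simple physics step might use such a call; it reuses the hypothetical resolveCollision from the earlier collision sketch and does not use the actual SWIFT++ API, which the slides do not show.

```cpp
// One physics step: integrate motion, then make a single 'collision' call per object
// against the reconstructed real-object hull. Gravity, damping, and the bounding-sphere
// approximation are illustrative choices, not the authors' SWIFT++-based implementation.
#include <array>
#include <vector>

// Declared here so the sketch stands alone; defined in the collision sketch above.
struct VirtualObject { std::array<float, 3> center; float radius; };
bool resolveCollision(VirtualObject& obj,
                      const std::vector<std::array<float, 3>>& hullPoints);

struct SimObject {
    VirtualObject body;                                       // collision proxy
    std::array<float, 3> velocity;
};

void stepSimulation(std::vector<SimObject>& objects,
                    const std::vector<std::array<float, 3>>& hullPoints, float dt) {
    const float gravity = -9.8f;
    for (auto& o : objects) {
        o.velocity[1] += gravity * dt;                        // integrate gravity
        for (int axis = 0; axis < 3; ++axis)
            o.body.center[axis] += o.velocity[axis] * dt;     // integrate position
        if (resolveCollision(o.body, hullPoints)) {           // the single 'collision' query
            for (float& v : o.velocity) v *= -0.3f;           // crude restitution on contact
        }
    }
}
```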
Merging real and virtual spaces
Case Studies

- User Study (VR 2003)
- Physics Simulation (I3D 2003 / SIGGRAPH 2003)
- NASA (ACM 8/04)
  - Development time: reading in the NASA models, 3 days; creating and implementing the task, 1 day
NASA Langley Research Center (LaRC) Payload Assembly Task

Given payload models, designers and engineers want to evaluate:
- Assembly feasibility
- Assembly training
- Repairability

Current approaches:
- Measurements
- Design drawings
- Step-by-step assembly instruction lists
- Low-fidelity mock-ups
Task

- We wanted a plausible task given common assembly jobs.
- We abstracted a payload layout task:
  - Screw in a tube
  - Attach a power cable
  - Determine how much space should be allocated between the top of the PMT and the bottom of Payload A
Videos of Task

Having a hybrid environment provides substantial benefits in prototype design.
Results (columns are participants #1-#4)

                                                               #1       #2                     #3          #4
(Pre-experience)  How much space is necessary?                 14 cm    14.2 cm                15-16 cm    15 cm
(Pre-experience)  How much space would you actually allocate?  21 cm    16 cm                  20 cm       15 cm
Actual space required in the VE                                15 cm    22.5 cm                22.3 cm     23 cm
(Post-experience) How much space would you actually allocate?  18 cm    16 cm (modify tool)    25 cm       23 cm

The tube was 14 cm long and 4 cm in diameter.
Case Study Conclusions

Benefits of object-reconstruction VEs:
- Specialized tools and parts require no modeling.
- Short development time makes it practical to try multiple designs.
- Shows promise for early testing of subassembly integration from multiple suppliers.
- Identifying assembly, design, and integration issues early could yield considerable savings in time and money.
Evaluation of System

Benefits:
- Incorporates objects that are traditionally difficult to track and model (cables, hands, tools, parts)
- Rapid development
- Easy to make system changes
- Handling more real objects improves task performance
- Could enhance interaction enough that applications not typically helped by VEs could benefit

Negatives:
- Accuracy and fidelity issues
- Lack of active haptic response
- Complex input from real objects is difficult
Collaborators

Benjamin Lok, Kyle Johnsen, Cyrus Harrison (University of Florida)

Graduate students: Cathy Zanbaka, Jonathan Jackson, Sabarish Babu, Dan Xiao, Amy Ulinski

Jee-In Kim, Min Shin, Larry Hodges (University of North Carolina at Charlotte)
Avatars in VEs

- Most virtual environments do not provide an avatar (a self-representation of the user).
- Why? Tracking the human body is difficult.
- Solution: use simple computer vision to track colored markers and generate an avatar (see the sketch below).
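The slides do not show the vision code. The following minimal sketch tracks one colored marker by thresholding a live frame in HSV and taking the centroid of the matching pixels; OpenCV and the specific color range are my assumptions for illustration, not necessarily what the authors used.

```cpp
// Minimal single-marker color tracker: threshold in HSV, take the centroid of the mask.
#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    cv::VideoCapture cap(0);                         // default camera
    if (!cap.isOpened()) return 1;

    const cv::Scalar lower(40, 80, 80);              // rough HSV range for a green marker
    const cv::Scalar upper(80, 255, 255);

    cv::Mat frame, hsv, mask;
    while (cap.read(frame)) {
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        cv::inRange(hsv, lower, upper, mask);        // binary mask of marker-colored pixels

        cv::Moments m = cv::moments(mask, true);
        if (m.m00 > 500) {                           // enough pixels to trust the detection
            cv::Point2d marker(m.m10 / m.m00, m.m01 / m.m00);
            std::cout << "marker at " << marker << "\n";   // would drive the avatar here
        }
        cv::imshow("mask", mask);
        if (cv::waitKey(1) == 27) break;             // Esc to quit
    }
    return 0;
}
```

In practice one marker per tracked body point (hands, elbows, head) would be detected this way, and the resulting positions used to pose a simple avatar model each frame.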
Collaborators

Benjamin Lok (University of Florida)
Danette Allen (NASA Langley Research Center)
- Evaluate a compromise: wait 10 minutes to get an object into the VE.
- Collaboration with the Mars Airplane project (NASA Langley Research Center).
- Get tools, parts, and other (possibly distributed) collaborators into a shared space.
Scan and Track

(1) A tool is scanned with a 3D laser scanning device.
(2) A 3D model of the tool is generated.
(3) Markers are affixed to the tool for camera-based tracking.
(4) The user handles the real objects (left: third-person view) while interacting with the virtual models in the hybrid environment (right: first-person view).

A sketch of the marker-based pose estimation step appears below.
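The slides do not detail how the affixed markers drive the scanned model. A minimal sketch follows, assuming the markers' 3D positions on the tool are known from the scan, their 2D detections come from a tracker like the one above, and the camera is calibrated; OpenCV's solvePnP is an illustrative choice here, not necessarily the authors' method.

```cpp
// Recover the scanned tool's pose from the affixed markers by solving the
// perspective-n-point problem (pose of the tool relative to the camera).
#include <opencv2/opencv.hpp>
#include <vector>

bool estimateToolPose(const std::vector<cv::Point3f>& markerModelPts,   // markers in tool coordinates (from the scan)
                      const std::vector<cv::Point2f>& markerImagePts,   // detected markers in the camera image
                      const cv::Mat& cameraMatrix,                      // 3x3 intrinsics
                      const cv::Mat& distCoeffs,                        // lens distortion coefficients
                      cv::Mat& rvec, cv::Mat& tvec) {                   // output rotation and translation
    if (markerModelPts.size() < 4 || markerModelPts.size() != markerImagePts.size())
        return false;                                                   // need at least 4 correspondences
    return cv::solvePnP(markerModelPts, markerImagePts, cameraMatrix, distCoeffs, rvec, tvec);
}

// Each frame, the resulting rvec/tvec would place the scanned 3D model of the tool
// in the virtual environment, so the virtual model follows the real tool in the user's hand.
```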
Final Thoughts

- Fewer generic devices: fewer gloves, wands, joysticks, and mice.
- More relevant real objects as interfaces:
  - Object reconstruction
  - Scan and Track

Benefits:
- More efficient exploration of designs and scenarios
- Faster development
- More natural interaction
- More like the actual task

Negatives:
- Accuracy and fidelity
- Poor haptic response from virtual objects
- Hard to get input (e.g., button presses) from real objects
Thanks

Collaborators: Danette Allen (NASA LaRC), the UF Virtual Experiences Group, the UNC Charlotte Virtual Environments Group, and the UNC Chapel Hill Effective Virtual Environments group

Funding agencies: The LINK Foundation, NIH (Grant P41 RR02170), National Science Foundation, Office of Naval Research

For more information: http://www.cise.ufl.edu/~lok
Previous Work

Incorporating real objects into VEs:
- Virtualized Reality (Kanade et al.)
- Image-Based Visual Hulls [Matusik00, Matusik01]
- 3D Tele-Immersion [Daniilidis00]

Interaction:
- Commercial solutions (tracked mice, gloves, joysticks)
- Augmenting specific objects for interaction
  - Doll's head with trackers [Hinckley94]
  - Plate [Hoffman98]
- Virtual object to real object interaction: a priori modeling and tracking [Breen96]