Gesture-Based Interactive Beam-Bending

Presentation transcript:

1 Gesture-Based Interactive Beam-Bending. By Justin Gigliotti. Mentored by Professor Tarek El Dokor and Dr. David Lanning. Arizona/NASA Space Grant Symposium, University of Arizona, April 19, 2008.

2 Introduction to gesture-based interactive beam-bending
A teaching tool that allows students to interact with a three-dimensional virtual I-beam (the full per-frame pipeline is sketched below):
–User input is captured with a visual sensor
–Markerless tracking and the associated gesture recognition are performed on the captured frames
–The tracked point is interpreted as a force applied to the beam
–The I-beam deforms according to the applied force
–The resulting beam stresses are computed and shown color-coded on the beam
–The 3D model is rendered in real time
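A minimal sketch of that per-frame loop in Python. Every name here (TrackedPoint, track_hand, the 50 N-per-meter gain) is a hypothetical stand-in; the presentation does not show the project's actual code.

```python
# Hypothetical per-frame pipeline sketch: capture/track, convert the tracked
# point to a force, then simulate and render (slides 5-7 cover those steps).

from dataclasses import dataclass

@dataclass
class TrackedPoint:
    x: float  # left/right
    y: float  # up/down
    z: float  # depth toward/away from the visual sensor

def track_hand(frame) -> TrackedPoint:
    """Stand-in for the markerless hand-tracking step."""
    return TrackedPoint(0.10, -0.05, 0.02)

def point_to_force(p: TrackedPoint, gain: float = 50.0) -> tuple:
    """Interpret the tracked point as a force at the beam tip: displacement
    from the rest position scales linearly into a force (assumed gain)."""
    return (gain * p.z, gain * p.y, gain * p.x)  # (axial, vertical, horizontal)

def frame_step(frame):
    """One iteration of the loop: track, convert, simulate, render."""
    p = track_hand(frame)        # capture + markerless tracking
    force = point_to_force(p)    # tracked point -> applied force
    # The deformation model, stress field, and color-coded render would
    # run here, many times per second.
    return force

print(frame_step(frame=None))  # (1.0, -2.5, 5.0)
```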

3 Advances made to the project
Added a third degree of freedom to the gesture-based interaction (a sketch of the depth-to-load mapping follows this list):
–The third degree of freedom allows the user to apply tension and compression forces to the beam
–Applying these forces requires the user to move the hand toward or away from the visual sensor
–Specialized machine vision algorithms analyze the gesture
The user interface was updated to reflect the extra degree of freedom:
–The tension and compression forces respectively increase or decrease the length of the beam
–The corresponding axial load and axial displacement are displayed to the user in the statistics window
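A sketch of how hand depth could map to an axial load and the resulting change in length. The linear gain and the beam property values below are illustrative assumptions, not values from the project; the displacement formula is the elementary axial relation delta = P*L/(A*E).

```python
# Assumed depth-to-axial-load mapping: moving the hand away from its rest
# depth applies tension, moving toward the sensor applies compression.

def axial_load_from_depth(depth_m: float, rest_depth_m: float,
                          gain_n_per_m: float = 2000.0) -> float:
    """Positive result = tension (hand pulled away), negative = compression."""
    return gain_n_per_m * (depth_m - rest_depth_m)

def axial_displacement(load_n: float, length_m: float,
                       area_m2: float, youngs_modulus_pa: float) -> float:
    """Elementary axial deformation: delta = P*L / (A*E)."""
    return load_n * length_m / (area_m2 * youngs_modulus_pa)

# Example: hand moved 5 cm away from its rest depth, steel-like I-beam values.
P = axial_load_from_depth(0.65, 0.60)            # 100 N of tension
dL = axial_displacement(P, 1.0, 2.0e-3, 200e9)   # 1 m beam, 20 cm^2 section
print(f"axial load {P:.0f} N -> elongation {dL*1e6:.2f} micrometers")
```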

4 Gesture-based interaction
A single point of motion determines the direction of beam deformation:
–The point serves as a reference for beam bending
–The point is treated as if the user reached out and grabbed the end of the virtual beam
The user has three degrees of freedom (one way to stabilize the tracked point across frames is sketched below):
–The hand-tracking algorithm tracks the depth of the user's hand
–This allows movement forward/backward in addition to up/down and left/right
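The slides do not say how the single reference point is kept stable between frames; one common choice would be exponential smoothing of the raw detections, sketched here with assumed names and values.

```python
# Assumed jitter filter for the tracked 3D point (not from the slides):
# blend each new detection into a running estimate.

def smooth_point(prev: tuple, raw: tuple, alpha: float = 0.3) -> tuple:
    """Smaller alpha damps sensor jitter at the cost of a little lag."""
    return tuple(alpha * r + (1 - alpha) * p for p, r in zip(prev, raw))

# Example: noisy detections around a nearly still hand.
estimate = (0.0, 0.0, 0.5)
for raw in [(0.02, -0.01, 0.51), (-0.01, 0.02, 0.49), (0.01, 0.00, 0.50)]:
    estimate = smooth_point(estimate, raw)
print(estimate)
```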

5 Beam bending
Once the gesture is tracked, an applied force is simulated from it.
Beam bending algorithm (a deflection-and-stress sketch follows this list):
–The simulated applied force drives a beam deformation model for the X, Y, and Z directions
–The individual stresses across the entire beam are calculated
–The beam is rendered in virtual reality and color-coded according to the stresses distributed across it
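A sketch of the per-point deflection and stress pass, assuming a cantilever with a tip load. The presentation does not state the boundary conditions or cross-section values; these are the standard Euler-Bernoulli expressions for that case, with illustrative numbers.

```python
# Assumed cantilever-with-tip-load field: deflection and outer-fiber bending
# stress evaluated at n stations along the beam, ready for color coding.

import numpy as np

def cantilever_field(F, L, E, I, c, n=100):
    """v(x) = F x^2 (3L - x) / (6 E I),  sigma(x) = F (L - x) c / I."""
    x = np.linspace(0.0, L, n)
    v = F * x**2 * (3*L - x) / (6 * E * I)   # transverse deflection
    sigma = F * (L - x) * c / I              # bending stress at the flange
    return x, v, sigma

x, v, sigma = cantilever_field(F=500.0, L=1.0, E=200e9, I=4.0e-6, c=0.05)
print(f"tip deflection {v[-1]*1e3:.3f} mm, max stress {sigma.max()/1e6:.2f} MPa")
```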

6 Deformation model
[The slide's equations were images and are not preserved in the transcript; its labels were: Y-direction deflection, corresponding change in length, Z-direction deflection, and point stress, with a "where" list defining the symbols.]
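For orientation only, the standard small-deflection formulas that match those labels for a cantilever under a tip load would read as follows. This is an assumption, since the slide's actual equations are not in the transcript.

```latex
% Assumed standard Euler-Bernoulli cantilever relations matching the slide
% labels (the original equation images are missing from the transcript), where
%   F_x, F_y, F_z : axial and transverse components of the applied force
%   L, A, E       : beam length, cross-section area, modulus of elasticity
%   I_y, I_z      : second moments of area;  c : distance to the outer fiber
\begin{align*}
  \delta_y &= \frac{F_y L^3}{3 E I_z}        &&\text{(Y-direction deflection)}\\
  \delta_z &= \frac{F_z L^3}{3 E I_y}        &&\text{(Z-direction deflection)}\\
  \Delta L &= \frac{F_x L}{A E}              &&\text{(change in length)}\\
  \sigma   &= \frac{M c}{I} + \frac{F_x}{A}  &&\text{(point stress: bending + axial)}
\end{align*}
```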

7 Real-time rendering in virtual reality
Using the calculated deformation model, the beam is mapped into 3D space.
The 3D space allows the beam to be rendered point by point, so each point is colored according to its stress value (one possible stress-to-color mapping is sketched below).
Vertical load and displacement, horizontal load and displacement, axial load and displacement, and the beam's modulus of elasticity are all updated from the beam's physical parameters and the force being applied.
Real-time update:
–Tracking, applied-force calculation, deflection calculation, and stress color coding are all performed many times every second
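A sketch of one possible per-point color coding: normalize each point's stress against the current maximum and ramp from blue (low) to red (high). The scheme and the 0-255 RGB range are assumptions; the slides only say the beam is color coded.

```python
# Assumed stress-to-color ramp for the point-by-point render.

def stress_to_rgb(sigma: float, sigma_max: float) -> tuple:
    """Linear blue (zero stress) -> red (max stress) ramp; 0-255 RGB."""
    t = 0.0 if sigma_max == 0 else min(abs(sigma) / sigma_max, 1.0)
    return (int(255 * t), 0, int(255 * (1 - t)))

print(stress_to_rgb(0.0, 6.25e6))     # (0, 0, 255): unstressed -> blue
print(stress_to_rgb(6.25e6, 6.25e6))  # (255, 0, 0): max stress -> red
```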

8 Beam bending environment

9 Demonstration

10 Conclusion
The beam bending project:
–User motion is tracked and a reference point is calculated
–An applied force is simulated from the point tracked by the machine vision algorithms
–A deformation model, complete with statistics and stresses, is created
–The beam is rendered in 3D and the statistics are displayed
Result: a low-cost teaching tool.
One conference paper and one journal paper were submitted on this work.

