1
Haptic Rendering. COMP 259. Jingdan Zhang. March 31, 2004
Good afternoon, everybody. Today I will give a presentation about haptic rendering. My name is Jingdan Zhang. In this presentation I will give a rough overview of haptic rendering: its basic ideas, challenges, and techniques. Then I will focus on texture rendering and discuss in more detail the new algorithm developed here by Miguel.
2
Haptic Interaction “… process of applying forces representing a virtual environment to the user of a haptic interface, or force-feedback device” First, what is haptic rendering? It is the process of applying forces to the user through a force-feedback device. The major difference between haptic devices and other input interfaces, such as the mouse and keyboard, is that they give force feedback based on the virtual environment. Why should we graphics people care about this technique? First, it is cool: just as graphics can generate cool visual effects, haptics can generate cool tactile sensations. Second, touch is another sensory channel alongside vision, so haptics has many similarities to graphics, and many well-developed graphics techniques can be reinvented and applied to haptics. So if you like hacking on graphics, you can also do some hacking on haptics.
3
Goal of haptic rendering
Enable a user to touch, feel and manipulate virtual objects Enhance a user’s experience in a synthetic environment Provide a natural, intuitive interface Using haptic rendering, we can enable a user to touch, feel, and manipulate virtual objects, and enhance the user’s experience in a virtual environment. Imagine that in twenty years, when we play computer games, we can wear a haptic device and get some exercise while playing: the more computer games, the healthier.
4
Applications of Haptics
Entertainment Education and training Scientific visualization Surgical simulation Robot-assisted surgery – Tele-operation – Cooperative manipulation The potential applications of haptic rendering are huge. It can be used in many human-computer interaction areas, such as entertainment, education and training, scientific visualization, surgical simulation, and robot-assisted surgery.
5
Human Haptics Two complementary channels: Tactile Kinesthetic
Strictly responsible for the variation of the cutaneous stimuli Presents the spatial distribution of forces Kinesthetic Refers to the human perception of one’s own body position and motion Presents only the net force information Now let us take a basic look at human haptics. A person feels forces through two different channels. The first is the tactile sense. It is strictly responsible for the variation of cutaneous (skin) stimuli, and it gives information about the spatial distribution of forces. When you stroke a surface with your fingertip, you get this kind of feeling; through it, you can recognize the roughness and other properties of the surface. The other channel is the kinesthetic sense. This kind of haptic information is related to how much force is exerted on the muscles, and it presents only net force information; through it, a person perceives their own body position and motion. For example, if you push on a wall, you can feel how much force the wall exerts back on you. Currently, haptic rendering focuses more on presenting kinesthetic forces; I think simulating the tactile feeling is more difficult.
6
Main components Haptic interface Graphical objects
Electro-mechanical system Graphical objects Contain shape and other properties Haptic rendering algorithm Joins the first two components In a haptic rendering system, there are three main components. The first is the haptic interface, the hardware part of the system. The second is the graphical objects; as in graphics, this component represents the virtual objects and their properties in the scene. In fact, in most haptic rendering systems, visual and haptic rendering can share the same data structure, such as a mesh. The third part is the haptic rendering algorithm, which calculates the feedback force based on the user’s actions and the virtual objects.
7
Haptic interface Sensable Technologies Inc. PHANTOM Desktop Immersion Co. Impulse Stick Virtual Technologies Inc. CyberForce
Here are some pictures of haptic interface devices. After several years of development, some commercial products are already available on the market. The left one is the PHANTOM Desktop from Sensable Technologies Inc.; you can see these in the G-lab. The right one is the Impulse Stick from Immersion Co.; this kind of device has already become a standard accessory for game consoles. The middle one is the CyberForce from Virtual Technologies Inc. It offers whole-hand and arm interaction with force feedback, so a user can grab virtual objects with it.
8
Haptic rendering algorithm
Two parts: collision detection and collision response Here is the framework of the haptic rendering algorithm. It is mainly composed of two parts: one is collision detection, the other is collision response. As the user moves their hand, the position and orientation of the probe are acquired. Collision detection is performed to determine whether the probe is colliding with any objects. If a collision occurs, the force and torque are calculated based on some force model. This loop should update forces at around 1 kHz in order to give the user a smooth feeling, so the update rate in haptic rendering is much higher than in visual rendering.
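The loop described above can be sketched as follows. This is a minimal sketch, not a real driver API: the callback names (read_probe_pose, detect_collision, compute_force, send_force) are hypothetical placeholders for whatever the device library provides.

```python
import time

def haptic_loop(read_probe_pose, detect_collision, compute_force, send_force,
                rate_hz=1000.0, steps=1000):
    """One pass of the two-part algorithm per iteration: collision
    detection, then collision response, targeting a ~1 kHz update rate."""
    dt = 1.0 / rate_hz
    for _ in range(steps):
        pose = read_probe_pose()                  # probe position/orientation
        contact = detect_collision(pose)          # part 1: collision detection
        if contact:
            force = compute_force(pose, contact)  # part 2: collision response
        else:
            force = (0.0, 0.0, 0.0)               # free space: no feedback
        send_force(force)
        time.sleep(dt)  # a real servo loop would pace this more precisely
```

A real implementation would run this in a dedicated high-priority thread, since missing the 1 kHz deadline is felt immediately as vibration or instability.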
9
Example: Hard sphere Here is a simple example of rendering a hard sphere. When the probe is moving in free space, the feedback force is zero. When the probe penetrates the sphere, a feedback force is generated. A simple spring model is used: the magnitude of the force is proportional to the penetration depth, and the direction points outward from the center.
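A minimal sketch of this spring model (the stiffness value k is illustrative, not from the slides):

```python
import math

def sphere_penalty_force(probe, center, radius, k=500.0):
    """Hard-sphere rendering: zero force in free space; inside the sphere,
    magnitude k * penetration depth, directed outward from the center."""
    d = math.dist(probe, center)
    if d >= radius:                    # probe in free space
        return (0.0, 0.0, 0.0)
    if d == 0.0:                       # degenerate: probe exactly at the center
        return (0.0, 0.0, k * radius)  # pick an arbitrary outward direction
    depth = radius - d                 # penetration depth
    return tuple(k * depth * (p - c) / d for p, c in zip(probe, center))
```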
10
Collision Detection Computationally fast collision detection: update rate 1 kHz H-COLLIDE [Gregory et al. 1999] Spatial decomposition Bounding volume hierarchy based on OBBTrees Frame-to-frame coherence Sensation preserving simplification [Otaduy & Lin, 2003] Multiresolution representation For general objects, a fast collision detection algorithm is necessary to keep the 1 kHz update rate. The GAMMA group here has done several successful research projects in this area. One is H-COLLIDE. It uses spatial decomposition, a bounding volume hierarchy based on OBBTrees, and frame-to-frame coherence to accelerate the detection process. Another was published at last year’s SIGGRAPH; it uses a multiresolution representation, and during mesh simplification, surface detail is filtered out when it cannot be perceived by the sense of touch.
11
The models of the probe a point a 3D object a line segment
For the haptic probe, several models can be used. The simplest is the point-based model: the end point of the probe is treated as a virtual point in the scene. In this case torque is not considered; the user can only feel force. The more complicated models are line-based and object-based. In both of these cases, torque and force are both considered, and a six-degree-of-freedom haptic interface is needed. Also, the probe may collide with multiple objects at the same time, and then the net force and torque should be displayed.
12
Handle the collision: Penalty methods
Determine the feedback force directly from the penetration depth Subdivide the object volume and associate a sub-volume with each surface Works well for simple objects Now let us see how to handle the collision in the point-based model. The simplest way is to determine the feedback force directly from the penetration depth, as shown in the previous example. To accelerate this process, we can subdivide the object volume and associate a sub-volume with each surface. This method works well for simple objects.
13
Limitations of penalty methods
Lack of locality Force discontinuity “Pop-thru” of thin objects But this method has several limitations. The first is lack of locality: in this figure, when the end point of the probe is at the center of the object, it is difficult to determine which surface should be the exterior one. The second is force discontinuity: when the probe penetrates the block deeply enough, the nearest exterior surface changes abruptly, and the force direction changes with it. The third problem is that it cannot prevent the probe from penetrating thin objects: if the method does not provide sufficient force, the probe can keep moving through the object, and then the force suddenly disappears.
14
Handle the collision: Constraint-based methods
Constrain a virtual proxy of the haptic interface to remain on the surfaces God-object [Zilles et al. 1995] Virtual proxy [Ruspini et al. 1997] IHIP & HIP [Ho et al. 1999] Constraint-based methods were proposed to handle these problems. The basic idea is that although we cannot keep the end point from penetrating the surface, we can create a virtual proxy of the end point and keep the proxy on the surface. Several papers have been based on this idea.
15
Virtual Proxy (1) Here is a more detailed explanation. The blue point is the real position of the end point, and the green one is the proxy position. When the end point is moving in free space, the proxy position and the real position coincide. When the end point penetrates the object, the proxy stays on the surface and tries to approach the end point as closely as possible. When the end point passes through the object, the proxy still stays on the surface and holds onto the probe.
16
Virtual Proxy (2) Here is the formulation of the constraints on the proxy. The first one tries to minimize the distance from the proxy to the end point; the following ones constrain the proxy to stay on the surface, where n is the normal of each neighboring surface.
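The idea can be illustrated with a simplified sketch: take the probe position as the unconstrained goal (minimizing the proxy-to-probe distance), then push the goal back onto any violated constraint plane. The published methods solve a constrained optimization; projecting onto the planes one at a time, as below, is only an approximation when several planes are active.

```python
def update_proxy(probe, planes):
    """Simplified virtual-proxy step. The unconstrained goal minimizes the
    proxy-to-probe distance; each violated half-space constraint
    n . (x - p0) >= 0 then pushes the goal back onto its plane.
    Each plane is (p0, n), with n a unit outward normal."""
    goal = probe
    for p0, n in planes:
        # signed distance of the goal from this plane; negative = penetrating
        s = sum((g - q) * ni for g, q, ni in zip(goal, p0, n))
        if s < 0.0:
            goal = tuple(g - s * ni for g, ni in zip(goal, n))  # project onto plane
    return goal
```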
17
Surface properties Normal Contact Impedance Friction Texture
Let us discuss the force models used in haptic rendering. The basic elements we should consider are the surface normal, contact impedance, friction, and texture. We will discuss each of them in detail.
18
Force Shading Render object surfaces as smooth and continuous, even when the underlying representation is not Analogous to Phong shading Surface curvature sensations can be convincingly displayed by controlling the normal force vector. Like Phong shading, force shading renders object surfaces as smooth and continuous even when the underlying representation is not. The implementation is quite simple: just interpolate the surface normal from the neighboring vertices.
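A sketch of that interpolation, analogous to Phong normal interpolation; the barycentric weights of the contact point are assumed to come from the collision query.

```python
def shaded_normal(bary, vertex_normals):
    """Force shading: blend the three vertex normals of the contacted
    triangle with barycentric weights, then renormalize, so the feedback
    force direction varies smoothly across a flat facet."""
    n = [sum(w * vn[i] for w, vn in zip(bary, vertex_normals)) for i in range(3)]
    mag = sum(c * c for c in n) ** 0.5
    return tuple(c / mag for c in n)
```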
19
Contact Impedance Occurs when the user contacts the surface
Perpendicular to the surface Spring force and viscous damping force This kind of force occurs when the user contacts the surface, as shown in the earlier example. It acts along the normal direction. Usually a spring model is used: the magnitude of the force is proportional to the penetration depth. Sometimes a viscous damping term is added.
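A sketch of this spring-plus-damper model; the stiffness k and damping b are illustrative values, not from the slides.

```python
def contact_impedance_force(depth, depth_rate, k=800.0, b=2.0):
    """Normal contact force: a spring term proportional to the penetration
    depth plus a viscous damping term proportional to the penetration rate.
    Returns the scalar magnitude along the outward surface normal."""
    if depth <= 0.0:
        return 0.0                    # no contact, no force
    f = k * depth + b * depth_rate    # damping resists pushing further in
    return max(f, 0.0)                # never pull the probe into the surface
```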
20
Friction Occurs when the user strokes the surface
The lateral force, opposite to motion A function of the coefficient of friction and the normal force Coulomb friction: static and dynamic friction This kind of force occurs when the user strokes the surface. It is the tangential force, opposite to the direction of motion, and proportional to the normal force. A good force model is the Coulomb friction model, which has a static friction coefficient and a dynamic friction coefficient: a larger force is needed to move the end point from the static state into the dynamic state, while a smaller force can sustain the dynamic state.
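A sketch of this stick-slip behavior; the coefficients are illustrative values.

```python
def coulomb_friction(tangential_force, normal_force, slipping,
                     mu_static=0.6, mu_dynamic=0.4):
    """Coulomb friction: while sticking, friction cancels the applied
    tangential force up to mu_static * N; once that limit is exceeded the
    contact slips and friction drops to mu_dynamic * N, opposing motion.
    Returns (friction_force, now_slipping)."""
    limit = mu_static * normal_force
    if not slipping and abs(tangential_force) <= limit:
        return -tangential_force, False   # static: cancels the applied force
    mag = mu_dynamic * normal_force       # dynamic: constant magnitude
    sign = -1.0 if tangential_force > 0 else 1.0
    return sign * mag, True
```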
21
Texture Varied ways to represent and display texture
Deterministic textures Stochastic models Haptic recordings Texture consists of the small geometric elements on a surface. It determines basic characteristics of the surface, such as roughness. Haptic rendering of textured surfaces gives the user more information about the virtual objects. Three categories of methods are used. The first is deterministic textures: these algorithms calculate the force based on surface geometry and some physical model. The second is stochastic models, which try to use a stochastic process to model the texture and the feedback force. The third is haptic recordings, a record-and-playback technique similar to image-based modeling.
22
Lateral-force gradient algorithm(1)
Minsky’s PhD thesis 1995 Texture is represented as a 2D height field 2 DOF, render the texture as a tangential vibration force Assume the surface is frictionless Here is a simple deterministic texture rendering technique proposed by Minsky in her PhD thesis. In this algorithm, texture is represented as a 2D height field, and a two-degree-of-freedom haptic interface is used; two degrees of freedom means the force direction is restricted to a plane. The algorithm renders the texture as a tangential vibration force, and friction is ignored.
23
Lateral-force gradient algorithm(2)
Moving from x1 to x2 : ΔPE = work = F·D At the same time: ΔPE = PE2 − PE1 = (l2 − l1)mg So: F = mg·tanθ Here is how the algorithm works. Suppose the end point of the probe moves from x1 to x2, covering horizontal distance D, while its height changes from l1 to l2. If the lateral force F acts over this motion, the total work done by this force is F·D. At the same time, the potential energy of the probe increases by (l2 − l1)mg. These two must be equal, because there is no friction. From these two equations we get F = mg·tanθ, where tanθ = (l2 − l1)/D: the force is proportional to the gradient of the height field. This model produces a vibration force that expresses the roughness of the surface. The basic idea is to determine the potential energy change from one state to another, and then figure out how much force is needed over that motion. This idea will be used again later.
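The F = mg·tanθ rule can be sketched for a one-dimensional height field; here height is any callable h(x), and the slope is estimated by a central difference.

```python
def lateral_force(height, x, dx=1e-4, mg=1.0):
    """Lateral-force gradient algorithm: the tangential texture force is
    proportional to the local slope of the height field, F = mg * tan(theta).
    The sign pushes the probe downhill, so uphill motion is resisted."""
    slope = (height(x + dx) - height(x - dx)) / (2.0 * dx)  # tan(theta)
    return -mg * slope
```

Sweeping x across a bumpy height field yields the alternating (vibration) force that conveys roughness.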
24
What’s new? Haptic rendering of interaction between textured objects [Otaduy, Jain, Sud, Lin 2004] Use texture (height field) to encode surface details A new force model Fast calculation of directional penetration depth (implemented on the GPU) This year, a new texture rendering algorithm was developed here in the GAMMA group. It can render the interaction between textured objects. It has three main features: it uses a coarse mesh to represent the object’s shape and a height field to encode surface details; it uses a new force model; and it proposes a fast algorithm to calculate the directional penetration depth, implemented on the GPU.
25
Preprocessing Parameterize & Create atlas Simplification
Here is the preprocessing stage of the algorithm. The model with fine details is parameterized, and the fine details are stored in a height-field atlas; the texture coordinates are recorded. Then the model is simplified to represent the basic shape of the object, with the texture coordinates retained during simplification.
26
Rendering pipeline Step 1: Perform collision detection at each step; determine pairs of contact points and penetration directions based on the coarse model. Step 2: Calculate the force and torque for each contact based on the texture. Step 3: Compute the net force and torque. The algorithm performs haptic rendering in these three steps. Step 1, collision detection, uses the technique proposed in an earlier paper. Step 3 is simple. We will focus on step 2.
27
Penetration depth Intersection of A & B Global penetration depth
First, let us define the penetration depth. Objects A and B are intersecting. The global penetration depth is defined as the minimal translational distance required to separate the two objects; the directional penetration depth is defined along a given direction n.
28
Penetration depth of height field
For the penetration
29
Calculate height function on surface
Flatten the surface along the direction n and approximate the height function h(u, v).
30
The force model (1) Penalty-based force, defined as an elastic potential field Define the force F and torque T as
31
The force model(2) In the local reference system {u, v, n}
In the global reference system
32
The force model(3) The partial derivatives
can be obtained by translating the object Δu along the u axis and computing the directional penetration depth
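That finite-difference idea can be sketched directly. Here directional_depth stands in for the (hypothetical) query that returns the directional penetration depth after a small translation; in the paper this query runs on the GPU.

```python
def depth_gradient(directional_depth, u, v, du=1e-3, dv=1e-3):
    """Approximate the partial derivatives of the directional penetration
    depth delta(u, v) by translating the object a small step along the u
    and v axes, recomputing the depth, and taking central differences."""
    d_du = (directional_depth(u + du, v) - directional_depth(u - du, v)) / (2.0 * du)
    d_dv = (directional_depth(u, v + dv) - directional_depth(u, v - dv)) / (2.0 * dv)
    return d_du, d_dv
```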
33
Experiment results Textured blocks File and CAD part Hammer and torus Block and gear
Here are the experiments shown in the paper.
34
Stochastic models(1) What is texture? From observation
an image that looks approximately the same, to humans, from neighborhood to neighborhood From a stochastic view: a multidimensional signal obeying some statistical properties Next, let us look at the stochastic approach. First, what is texture? The picture on the right is the texture of concrete. From observation, we can say that a texture is an image that looks approximately the same, to humans, from neighborhood to neighborhood. In the stochastic view, a texture can be described as a multidimensional signal obeying some statistical properties.
35
Stochastic models(2) Basic assumption: Several models:
the feedback forces from texture can be represented by stochastic models Several models: Gaussian distributions Markov random fields Stochastic input parameters derived by analyzing actual force data Because the structure of a texture can be seen as a certain stochastic process, the feedback forces it produces can be modeled stochastically as well.
36
The Gaussian distribution
The roughness of many surfaces can be approximated as Gaussian:
37
A Stochastic Approach(1)
Siira & Pai 1996 For texture rendering, it suffices to give a convincing perceptual illusion Two assumptions: The normal force is proportional to the height of the asperities The lateral texture force is proportional to the normal force
38
A Stochastic Approach(2)
If the user is in contact and moving tangentially to the surface, calculate the texture output
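A sketch of this rule under the two assumptions above; sigma is an illustrative roughness parameter, and the rng argument is only there to make the sketch deterministic for testing.

```python
import random

def texture_force(normal_force, tangential_speed, sigma=0.2, rng=None):
    """Stochastic texture output: while the user is in contact and stroking
    the surface, emit a zero-mean Gaussian perturbation scaled by the
    normal force; at rest or out of contact, emit nothing."""
    if normal_force <= 0.0 or tangential_speed <= 0.0:
        return 0.0
    rng = rng or random.Random()
    return normal_force * rng.gauss(0.0, sigma)
```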
39
Haptic recordings Analogous to image-based rendering Two sources
Pre-sample the forces occurring at different locations Play them back in the simulation Two sources: From off-line physical simulation, to reduce the real-time computation burden From real-world textures, to avoid the modeling complexity Challenge: how to parameterize the recorded forces
40
Conclusion Haptic rendering is cool. You cannot see it, but you can feel it. It is the next frontier for CG research.
41
References
Allison Okamura. "Haptics for Virtual Reality." Course note, /lectures/lecture09.pdf
Basdogan, C. and Srinivasan, M.A. "Haptic rendering in virtual environments." network.ku.edu.tr/~cbasdogan/Tutorials/VRbookChapter.pdf
Gregory, A., Lin, M., Gottschalk, S. and Taylor, R. "H-Collide: A Framework for Fast and Accurate Collision Detection for Haptic Interaction." In Proceedings of IEEE Virtual Reality Conference, 1999.
Ho, C.H., Basdogan, C. and Srinivasan, M.A. "Efficient point-based rendering techniques for haptic display of virtual objects." Presence 8(5), 1999.
Siira, J. and Pai, D.K. "Haptic Texturing - A Stochastic Approach." IEEE International Conference on Robotics and Automation, Minnesota, 1996.
Max Smolens. "Haptic Rendering." Course note.
42
References (2)
Otaduy, M.A. and Lin, M.C. "Sensation Preserving Simplification for Haptic Rendering." In Proceedings of ACM SIGGRAPH 2003 / ACM Transactions on Graphics, Vol. 22, San Diego, CA, 2003.
Otaduy, M.A., Jain, N., Sud, A. and Lin, M.C. "Haptic Rendering of Interaction between Textured Models." Submitted to ACM SIGGRAPH 2004.
Minsky, M. "Computational Haptics: The Sandpaper System for Synthesizing Texture for a Force-Feedback Display." PhD dissertation, Program in Media Arts and Sciences, MIT, 1995.
Ruspini, Kolarov and Khatib. "The haptic display of complex graphical environments." Proc. ACM SIGGRAPH 1997.
Zilles, C.B. and Salisbury, J.K. "A constraint-based god-object method for haptic display." Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 3, 1995.
Salisbury, J.K. et al. "Haptic rendering: programming touch interaction with virtual objects." Proc. ACM SIGGRAPH 1995.
43
Thank you.