1
Touch-Enabled Interfaces
Ming C. Lin Department of Computer Science University of North Carolina at Chapel Hill
2
Human Computer Interaction
3
Human Computer Interaction
Visual (graphics, vision, etc.)
Auditory (sound)
Haptic (touch-enabled)
Others
4
Common Touch-based Interfaces
5
Other Touch-based Interfaces
6
Benefits
Augment other senses
Inherent 3D interfaces
Physically-based interaction
Assistive technology
Natural & intuitive
7
What Is Haptic Rendering?
[Diagram: master-slave system, human -> haptic device -> robot, with force and tactile feedback returned to the human; in virtual reality, a simulation replaces the robot]
In the beginning, haptics was conceived for master-slave systems; now it is also used in virtual reality. Outputs to the user are tactile feedback (in the skin) and force feedback (in the tendons), an extra sensory input to the human. However, the fact that the human closes the control loop adds strong requirements (stability, etc.) to the 'graphics' application. Note that the arrows in the diagram are bi-directional: the data flows both ways.
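The closed loop described above can be sketched in a few lines. The device and simulation classes below are hypothetical stand-ins, not a real device API, and the virtual wall is a simple penalty spring with an illustrative stiffness:

```python
# Bidirectional haptic loop: the user's motion flows into the simulation,
# and a force flows back out. StubDevice and StubSimulation are illustrative
# stand-ins, not a real haptics API.

class StubDevice:
    def __init__(self):
        self.x = 0.0   # position reported by the device (the user's motion)
        self.f = 0.0   # last force command sent to the device

    def read_position(self):
        return self.x

    def send_force(self, f):
        self.f = f

class StubSimulation:
    """A virtual wall at x = 0, rendered as a linear penalty spring."""
    def __init__(self, stiffness=500.0):
        self.k = stiffness

    def compute_force(self, x):
        # Push back only while the user penetrates the wall (x > 0).
        return -self.k * x if x > 0.0 else 0.0

def haptic_step(device, sim):
    """One loop iteration: read motion in, compute force, send force out."""
    x = device.read_position()
    f = sim.compute_force(x)
    device.send_force(f)
    return f
```

In a real system this step runs continuously at roughly 1 kHz, which is what makes the human part of the control loop.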
8
Inter-disciplinary Research
[Diagram: haptic rendering draws on computer science, electrical engineering (control and actuators), and mechanical engineering (mechanical design)]
I like haptics because it joins together different engineering disciplines. We'll focus on haptic rendering: the computation of the forces output through the device.
9
Control of Haptic Devices
Impedance devices (e.g., the 6-DOF Phantom) sense the user's motion and display force; admittance devices sense applied force and display motion. COBOTs (collaborative robots) are a related class of passive devices that physically guide the user's motion along software-defined constraints.
10
Engine Close-Up Boeing VPS System
11
Collaborative Haptic Design Review
12
Other Examples
A Haptic Hybrid Controller for Virtual Prototyping of Vehicle Mechanisms (Ford, BMW, etc.)
3-DOF Cobot for Engineering Design (Northwestern University and Ford Automobile)
13
Medical Simulators
Endoscopy simulator - bronchoscopy and upper and lower gastrointestinal procedures on a single platform
Endovascular simulator - percutaneous coronary and peripheral interventions and cardiac rhythm management
Hysteroscopy simulator - skills assessment and myomectomy
Laparoscopy simulator - skills, cholecystectomy, sterilization, ectopic pregnancy, and myomectomy suturing
Vascular access simulator - adult, geriatric, and pediatric IV; PICC; phlebotomy; and skills assessment
14
Virtual Endoscopic Surgery Training
VEST System One (VSOne) technology:
3 haptic (force-feedback) devices as mock-up endoscopic instruments
1 virtual endoscopic camera
Three new Basic Task Training (BTT) exercises - find tubes / touch points / follow path
15
Laparoscopic Surgery MIT Touch Lab
16
Molecular Dynamics
VMD: Visual Molecular Dynamics (Humphrey, 1996)
17
Haptic Vector Field
Lawrence, Lee, Pau, Roman, Novoselov (University of Colorado at Boulder)
5 D.O.F. in, 5 D.O.F. out
Lawrence, 2000
18
dAb: Haptic Painting System
19
inTouch: 3D Haptic Painting
Painted Butterfly (~80k triangles)
20
inTouch: Multiresolution Modeling with Haptic Interface
21
ArtNova: Touch-Enabled 3D Model Design
Interactive texture painting
User-centric viewing
Realistic force response
22
FreeForm Design Model Gallery
23
Manipulating Gears This clip shows an example of 6-DoF haptic rendering. In the clip I move a model of an upper jaw using a 6-DoF haptic device. Contact force and torque are computed in the simulation, and displayed using the same haptic device. Haptic feedback enables a more intuitive interaction with virtual objects.
24
Basic Pipeline
[Diagram: User -> Haptic Device -> Grasped Object -> Scene, inside the Simulated Environment; motion flows in]
The basic pipeline for 6-DoF haptic rendering proceeds as follows: the haptic device serves as a tracker of the motion of the user.
25
Basic Pipeline
[Diagram: as before, now with a position arrow from the Haptic Device into the simulation]
The position and orientation of the device are read by the program ...
26
Basic Pipeline
[Diagram: the device position is applied to the grasped object]
... and applied to the so-called grasped object.
27
Basic Pipeline
[Diagram: the grasped object interacting with the scene]
The next step is to perform collision detection and force computation between objects in the simulated environment.
28
Basic Pipeline
[Diagram: a force-command arrow from the simulation back to the haptic device]
A force command is sent to the haptic device.
29
Basic Pipeline
[Diagram: forces flow from the haptic device to the user]
... and finally forces are applied to the user.
30
Basic Pipeline
[Diagram: the full loop, motion and position in, force command and forces out]
This basic pipeline is what some authors call DIRECT RENDERING.
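The direct rendering pipeline just described can be sketched step by step. All classes below are illustrative stand-ins (a sphere dropped onto a floor plane with a penalty contact force), not a real haptics API:

```python
# Sketch of the direct-rendering pipeline, one comment per pipeline step.
# StubDevice, GraspedSphere, and FloorScene are illustrative stand-ins.

class StubDevice:
    def __init__(self, pos):
        self.pos = pos        # pose tracked from the user's motion
        self.wrench = None    # last (force, torque) command received

    def read_pose(self):      # step 1: read the device pose
        return self.pos

    def send_wrench(self, f, t):  # step 4: force command to the device
        self.wrench = (f, t)

class GraspedSphere:
    def __init__(self, radius):
        self.radius = radius
        self.center = 0.0

    def set_pose(self, pos):  # step 2: apply the pose to the grasped object
        self.center = pos

class FloorScene:
    """Penalty contact against a floor at height 0 (illustrative stiffness)."""
    def __init__(self, k=800.0):
        self.k = k

    def contact_response(self, obj):  # step 3: collision detection + forces
        penetration = obj.radius - obj.center
        f = self.k * penetration if penetration > 0.0 else 0.0
        return f, 0.0                 # upward force, no torque in 1-D

def direct_rendering_step(device, grasped, scene):
    grasped.set_pose(device.read_pose())
    f, t = scene.contact_response(grasped)
    device.send_wrench(f, t)          # step 5: forces reach the user
    return f, t
```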
31
Haptic Rendering Loop
[Diagram: position flows from the device into the simulation; a force command flows back (F_user, F_device)]
32
Problem of Haptic Rendering
The user becomes part of the simulation loop. An update rate of about 1 kHz is necessary so that the whole system doesn't suffer from disturbing oscillations. Think of the analogy with numerical integration of a spring-mass-damper system, where the frequency of the haptic loop sets the integration step. The Phantom haptic devices run their control loop at 1 kHz. Consequence: we are very limited in the amount of computation we can do per frame.
33
Haptic Rendering Loop
Human-in-the-loop
High sensitivity to instabilities!!
High update rates required!! (kHz for high stiffness)
34
Key Challenges
Collision Detection: choice of representation and algorithm
Interaction Paradigm: penalty forces vs. constraint-based optimization; virtual coupling vs. direct rendering; Newtonian dynamics vs. quasi-static approximation; single user vs. collaboration
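Of the paradigms named above, virtual coupling is easy to sketch: instead of applying the device pose to the grasped object directly (direct rendering), a virtual spring-damper connects the device to a simulated proxy, and the user feels the bounded coupling force. This is a 1-D sketch with illustrative gains, not a production implementation:

```python
# Virtual coupling: a spring-damper between the device position and the
# simulated proxy object. The simulation applies this force to the proxy;
# the device is commanded the equal-and-opposite force. Illustrative gains.

def virtual_coupling_force(device_x, proxy_x, device_v=0.0, proxy_v=0.0,
                           k_c=300.0, b_c=2.0):
    """Force the coupling exerts on the proxy (user feels the opposite)."""
    return k_c * (device_x - proxy_x) + b_c * (device_v - proxy_v)
```

The coupling stiffness k_c bounds the stiffness the user can ever feel, which is exactly the trade-off versus direct rendering: better stability at the price of slightly softer contact.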
35
Additional Issues
Decouple haptic and simulation loops?
Use intermediate representations?
Force type and quality: How hard does hard contact feel? How free does free-space feel? Repulsive forces?
Force artifacts / stability considerations
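One classic intermediate representation, sketched below under simplified assumptions: the slow simulation loop publishes a local planar approximation of the contact (point, normal, stiffness), and the fast kHz haptic loop renders a penalty force against that plane until the next update. The values and class names are illustrative:

```python
# Intermediate representation for decoupled loops: the ~100 Hz simulation
# publishes a ContactPlane; the kHz haptic loop evaluates haptic_force()
# against it between simulation updates. Illustrative 3-D sketch.

class ContactPlane:
    """Local contact approximation published by the slow simulation loop."""
    def __init__(self, point, normal, stiffness):
        self.point = point          # a point on the plane
        self.normal = normal        # unit outward normal
        self.stiffness = stiffness  # penalty stiffness

def haptic_force(plane, device_pos):
    """Penalty force evaluated at kHz rates in the fast haptic loop."""
    # Penetration depth: how far the device sits below the plane,
    # measured along the plane normal.
    penetration = sum(n * (p - q) for n, p, q in
                      zip(plane.normal, plane.point, device_pos))
    if penetration <= 0.0:
        return (0.0, 0.0, 0.0)          # free space: no force
    f = plane.stiffness * penetration   # push out along the normal
    return tuple(f * n for n in plane.normal)
```

Because the fast loop only evaluates a plane equation, it stays cheap enough for kHz rates even when the full simulation cannot.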