Touch-Enabled Interfaces Ming C. Lin Department of Computer Science University of North Carolina at Chapel Hill http://www.cs.unc.edu/~lin lin@cs.unc.edu
Human-Computer Interaction
Human-Computer Interaction Visual (graphics, vision, etc.) Auditory (sound) Haptic (touch-enabled) Others
Common Touch-based Interfaces
Other Touch-based Interfaces
Benefits Augment other senses Inherent 3D interfaces Physically-based interaction Assistive Technology Natural & Intuitive
What Is Haptic Rendering? Master-Slave Systems Human-in-the-Loop Robot Force Feedback Haptic Device Human Simulation Tactile Feedback Haptics was originally conceived for master-slave teleoperation systems; now it is used in virtual reality. Outputs to the user: tactile feedback (in the skin) and force feedback (in the tendons), an extra sensory input to the human. However, the fact that the human closes the control loop adds strong requirements (stability, etc.) to the 'graphics' application. Look at the arrows: the data flow is bidirectional. Virtual Reality
Inter-disciplinary Research Computer Science Electrical Engineering Mechanical Engineering Haptic Rendering Control and actuators Mechanical design I like haptics because it joins together different engineering disciplines. We'll focus on haptic rendering: the computation of the forces that are output through the device.
Control of Haptic Devices Impedance Devices Admittance Devices 6-DOF Phantom COBOTs Impedance devices sense motion and output force (e.g., the 6-DOF PHANTOM); admittance devices sense force and output motion. COBOTs (collaborative robots) guide the user along programmable constraint surfaces using passive transmissions rather than active actuators.
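The impedance/admittance distinction fixes the shape of the control loop: an impedance device reads motion and commands force, while an admittance device reads force and commands motion. Below is a minimal C++ sketch contrasting the two loops; the device I/O functions and all numeric values are hypothetical stand-ins, not any vendor's actual API.

```cpp
// Minimal sketch contrasting impedance and admittance control.
// Device I/O calls are hypothetical stubs, not a real vendor API.
#include <cstdio>

struct Vec3 { double x, y, z; };

// --- hypothetical device stubs -------------------------------------------
Vec3 readDevicePosition() { return {0.0, -0.002, 0.0}; } // 2 mm into a wall
Vec3 readUserForce()      { return {0.0, -1.5, 0.0};   } // user pushes down
void commandForce(Vec3 f)    { std::printf("force    (%g, %g, %g) N\n",   f.x, f.y, f.z); }
void commandVelocity(Vec3 v) { std::printf("velocity (%g, %g, %g) m/s\n", v.x, v.y, v.z); }

// Impedance device (e.g., PHANTOM-class): sense motion, display force.
// Here: a virtual floor at y = 0 rendered as a penalty spring.
void impedanceStep(double k) {
    Vec3 p = readDevicePosition();
    Vec3 f = {0, 0, 0};
    if (p.y < 0) f.y = -k * p.y;   // spring pushes out of the surface
    commandForce(f);
}

// Admittance device (e.g., a COBOT-style system): sense force, command motion.
// Here: the device yields in proportion to the applied force (admittance 1/b).
void admittanceStep(double b) {
    Vec3 f = readUserForce();
    Vec3 v = {f.x / b, f.y / b, f.z / b};
    commandVelocity(v);
}

int main() {
    impedanceStep(1000.0); // k = 1000 N/m -> 2 N pushing back out
    admittanceStep(50.0);  // b = 50 N*s/m -> slow compliant motion
}
```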
Engine Close-Up Boeing VPS System
Collaborative Haptic Design Review
Other Examples A Haptic Hybrid Controller for Virtual Prototyping of Vehicle Mechanisms (Ford, BMW, etc.) 3-DOF Cobot for Engineering Design (Northwestern University and Ford Motor Company)
Medical Simulators Endoscopy simulator - Bronchoscopy and upper and lower gastrointestinal procedures on a single platform Endovascular simulator - Percutaneous coronary and peripheral interventions and cardiac rhythm management Hysteroscopy simulator - Skills assessment and myomectomy Laparoscopy simulator - Skills, cholecystectomy, sterilization, ectopic pregnancy, and myomectomy suturing Vascular access simulator - Adult, geriatric, and pediatric IV; PICC; phlebotomy; and skills assessment
Virtual Endoscopic Surgery Training VEST System One (VSOne) Technology 3 haptic (force-feedback) devices as mock-up endoscopic instruments 1 virtual endoscopic camera Three new Basic Task Training (BTT) exercises - find tubes / touch points / follow path
Laparoscopic Surgery MIT Touch Lab
Molecular Dynamics VMD: Visual Molecular Dynamics Humphrey, 1996
Haptic Vector Field Lawrence, Lee, Pau, Roman, Novoselov University of Colorado at Boulder 5 D.O.F. in / 5 D.O.F. out Lawrence, 2000
dAb: Haptic Painting System
inTouch: 3D Haptic Painting Painted Butterfly (~80k triangles) http://gamma.cs.unc.edu/inTouch
inTouch: Multiresolution Modeling with Haptic Interface http://gamma.cs.unc.edu/inTouch
ArtNova: Touch-Enabled 3D Model Design Interactive texture painting User-centric viewing Realistic force response http://gamma.cs.unc.edu/ArtNova
FreeForm Design Model Gallery http://www.sensable.com/freeform-models.htm
Manipulating Gears This clip shows an example of 6-DoF haptic rendering. In the clip I move a model of an upper jaw using a 6-DoF haptic device. Contact force and torque are computed in the simulation, and displayed using the same haptic device. Haptic feedback enables a more intuitive interaction with virtual objects.
Basic Pipeline [diagram: User → Haptic Device → Grasped Object → Scene, all inside the Simulated Environment; motion flows inward, forces flow back] The basic pipeline for 6-DoF haptic rendering proceeds as follows. The haptic device serves as a tracker of the motion of the user; the position and orientation of the device are read by the program and applied to the so-called grasped object. Next, collision detection and force computation are performed between the grasped object and the other objects in the simulated environment. A force command is sent to the haptic device, and finally forces are applied to the user. Some authors call this basic pipeline DIRECT RENDERING.
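As a rough illustration, one tick of this direct-rendering pipeline could look like the C++ sketch below. The device and scene types, the penalty-force rule, and all constants are hypothetical placeholders, not the actual VSOne or PHANTOM APIs.

```cpp
// One tick of the direct-rendering pipeline described above.
// All types and device/scene calls are hypothetical placeholders.
#include <cstdio>

struct Pose   { double pos[3];   double quat[4]; };
struct Wrench { double force[3]; double torque[3]; };

// --- hypothetical stubs ---------------------------------------------------
struct HapticDevice {
    Pose readPose() { return {{0, 0, 0.01}, {1, 0, 0, 0}}; } // tracks the user
    void commandWrench(const Wrench& w) {                    // force to user
        std::printf("Fz = %g N\n", w.force[2]);
    }
};
struct Scene {
    // Collision detection + force computation between the grasped
    // object and the environment (penalty forces in this sketch).
    Wrench collideAndComputeForces(const Pose& grasped) {
        Wrench w = {};
        double penetration = 0.02 - grasped.pos[2];   // plane at z = 0.02
        if (penetration > 0) w.force[2] = 1000.0 * penetration; // k = 1 kN/m
        return w;
    }
};

// Direct rendering: the grasped object follows the device pose exactly,
// and the computed contact wrench is sent straight back -- no coupling.
void directRenderingTick(HapticDevice& dev, Scene& scene) {
    Pose p = dev.readPose();                      // read position/orientation
    Wrench w = scene.collideAndComputeForces(p);  // collide grasped object
    dev.commandWrench(w);                         // command force to device
}

int main() {
    HapticDevice dev; Scene scene;
    directRenderingTick(dev, scene);  // in practice: loop at ~1 kHz
}
```

The rigid attachment is the defining feature: whatever wrench the simulation computes reaches the user unfiltered, which is exactly why stability becomes a concern on the next slides.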
Haptic Rendering Loop [diagram: the device position is read in, a force command F_device is computed and rendered against the force F_user applied by the user, closing the loop]
Problem of Haptic Rendering The user becomes part of the simulation loop. 1 kHz update rates are necessary so that the whole system doesn't suffer from disturbing oscillations. Think of the analogy with numerical integration of a spring-mass-damper system, where the frequency of the haptic loop sets the integration step. The PHANTOM haptic devices run their control loop at 1 kHz. Consequence: we are very limited in the amount of computation we can do.
Haptic Rendering Loop High sensitivity to instabilities!! High update rates required!! (kHz for high stiffness) Human-in-the-loop
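The spring-mass-damper analogy can be made concrete with a small experiment. The sketch below (my own illustration; the mass, damping, and stiffness values are made up but plausible) integrates m x'' + b x' + k x = 0 with explicit Euler, using the haptic update period as the integration step. For this underdamped oscillator the scheme is stable only when h <= b/k, so the same 1000 N/m contact that rings down quietly at 1 kHz diverges violently at 100 Hz.

```cpp
// Why the haptic loop needs ~1 kHz: the update period acts as the
// integration step of a stiff spring-damper contact. Illustrative
// parameters only (not from the slides).
#include <cmath>
#include <cstdio>

// Integrate m*x'' + b*x' + k*x = 0 with explicit Euler, step h, and
// return an energy-like amplitude after one simulated second.
double finalAmplitude(double m, double b, double k, double h) {
    double x = 0.01, v = 0.0;             // 1 cm initial penetration
    int steps = static_cast<int>(1.0 / h);
    for (int i = 0; i < steps; ++i) {
        double a = (-k * x - b * v) / m;  // contact force / mass
        x += h * v;                       // explicit Euler: both updates
        v += h * a;                       // use the *old* state
    }
    return std::sqrt(x * x + (m / k) * v * v);
}

int main() {
    double m = 0.1, b = 2.0, k = 1000.0;  // kg, N*s/m, N/m
    // Explicit Euler on this oscillator is stable only for h <= b/k = 2 ms.
    std::printf("1 kHz loop  (h =  1 ms): %g m\n", finalAmplitude(m, b, k, 1e-3));
    std::printf("100 Hz loop (h = 10 ms): %g m\n", finalAmplitude(m, b, k, 1e-2));
}
```

At h = 1 ms the amplitude decays to well under a millimeter; at h = 10 ms it grows by roughly 34% per step and ends up astronomically large, which is exactly the "disturbing oscillation" the 1 kHz requirement prevents.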
Key Challenges Collision Detection Choice of representation and algorithm Interaction Paradigm Penalty forces vs. constraint-based optimization Virtual coupling vs. direct rendering Newtonian dynamics / quasi-static approximation Single user vs. collaboration
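Among these paradigms, virtual coupling is the standard remedy for the stability problems of direct rendering: a spring-damper connects the device pose to a simulated tool, so the user feels the moderate, tunable coupling force rather than raw penalty forces from a stiff environment. A minimal 1-D sketch, with parameters of my own choosing, follows.

```cpp
// Virtual coupling (1-D sketch): a spring-damper connects the device
// position to a simulated tool, instead of the rigid attachment used
// in direct rendering. Parameters are illustrative, not from the slides.
#include <cstdio>

struct CouplingState {
    double toolX = 0.0, toolV = 0.0;   // simulated tool state
};

// One simulation step. Returns the force to display to the user
// (the reaction of the coupling spring).
double virtualCouplingStep(CouplingState& s, double deviceX,
                           double kc, double bc, double mTool, double h) {
    // Coupling force pulls the tool toward the device position.
    double fCoupling = kc * (deviceX - s.toolX) - bc * s.toolV;
    // Environment force on the tool: a stiff penalty wall at x = 0.
    double fEnv = (s.toolX < 0.0) ? -5000.0 * s.toolX : 0.0;
    // Semi-implicit Euler on the tool only -- the user never feels the
    // stiff wall directly, only the softer, stable coupling spring.
    s.toolV += h * (fCoupling + fEnv) / mTool;
    s.toolX += h * s.toolV;
    return -fCoupling;  // equal and opposite force rendered to the user
}

int main() {
    CouplingState s;
    // Device held 5 mm inside the wall; run the 1 kHz loop for 0.2 s.
    for (int i = 0; i < 200; ++i)
        virtualCouplingStep(s, -0.005, 500.0, 5.0, 0.05, 1e-3);
    double f = virtualCouplingStep(s, -0.005, 500.0, 5.0, 0.05, 1e-3);
    std::printf("steady force on user: %g N\n", f);  // about 2.3 N outward
}
```

The design trade-off: the coupling caps the stiffness the user can feel (hard contact feels slightly softer), but it keeps the loop stable even when the simulation produces stiff or noisy forces.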
Additional Issues Decouple haptic and simulation loops? Use intermediate representations? Force type and quality How hard does hard contact feel? How free does free-space feel? Repulsive forces? Force artifacts / stability considerations
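One classic way to decouple the two loops is an intermediate representation: the slow simulation loop runs full collision detection and publishes a simple local model of the contact (here, a plane), and the 1 kHz haptic loop computes forces only against that cached model. The sketch below is my own minimal version of the idea; the double-buffering scheme and all names are illustrative.

```cpp
// Decoupling the loops with an intermediate representation: the slow
// simulation loop publishes a local plane approximation of the contact,
// and the fast 1 kHz haptic loop computes forces against that plane.
// Names and parameters are illustrative, not from the slides.
#include <atomic>
#include <cstdio>

struct Plane { double n[3]; double d; };  // contact plane: dot(n, x) = d

// Double-buffered plane so the haptic thread never blocks on the
// simulation thread (a minimal sketch of the synchronization).
Plane planes[2] = {{{0, 1, 0}, 0.0}, {{0, 1, 0}, 0.0}};
std::atomic<int> current{0};

// Slow loop (~30 Hz): full collision detection, then refit the plane.
void simulationUpdate(const Plane& newFit) {
    int next = 1 - current.load();
    planes[next] = newFit;
    current.store(next);      // publish atomically
}

// Fast loop (1 kHz): penalty force against the cached plane only.
void hapticForce(const double p[3], double k, double f[3]) {
    const Plane& pl = planes[current.load()];
    double dist = pl.n[0]*p[0] + pl.n[1]*p[1] + pl.n[2]*p[2] - pl.d;
    double mag = (dist < 0) ? -k * dist : 0.0;   // below plane: push out
    for (int i = 0; i < 3; ++i) f[i] = mag * pl.n[i];
}

int main() {
    simulationUpdate({{0, 1, 0}, 0.0});     // floor at y = 0
    double pos[3] = {0.0, -0.003, 0.0};     // probe 3 mm below the plane
    double f[3];
    hapticForce(pos, 800.0, f);             // k = 800 N/m
    std::printf("f = (%g, %g, %g) N\n", f[0], f[1], f[2]);  // (0, 2.4, 0)
}
```

Because the fast loop touches only a tiny cached model, hard contact can be rendered stiffly at 1 kHz even when the full collision query takes tens of milliseconds; the price is a brief lag before the plane catches up with the true geometry.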