The Frontiers of HCI: Haptics, Smell and Brain Interaction
Jim Warren, derived from many sources and with thanks to Beryl Plimmer

Learning Outcomes
Describe haptics in terms of:
– Human perception
– Applications
– Devices
Describe applications of eye tracking and visual gesture recognition
Describe the exploration of:
– Olfactory detection and production
– Brain wave detection

The Human Perceptual System
Physical aspects of perception:
– Touch (tactile/cutaneous): located in the skin, it enables us to feel texture, heat, pain
– Movement (kinesthetic/proprioceptive): the location of your body and its appendages, and the direction and speed of your movements

Physical Aspects of Perception
Proprioception:
– We use sensations from our joints (e.g. their angles) and our muscles (e.g. strain) to determine the position of our limbs and perceive body position
– Combined with the vestibular system (inner ear, balance), this lets us perceive motion, orientation and acceleration
– This combination is sometimes called the kinaesthetic sense

Mobile devices
Phone output:
– Vibrate: a silent alert
– Vibration patterns can be used like earcons: different signals for different events
Does your phone have different alerts? Can you tell the difference?
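To make the earcon idea concrete, here is a minimal sketch of assigning distinct vibration patterns to events, in the style of Android's timing-array vibration patterns. The event names and durations are invented for illustration:

```python
# Hypothetical vibration "earcons": distinct on/off patterns per event,
# in the style of Android's Vibrator timing arrays.
# Event names and durations are invented for illustration.
VIBRATION_EARCONS = {
    # pattern = [initial pause, buzz, pause, buzz, ...] in milliseconds
    "new_message":   [0, 200],                      # one short buzz
    "incoming_call": [0, 500, 250, 500],            # two long buzzes
    "calendar":      [0, 100, 100, 100, 100, 100],  # three quick pulses
}

def pattern_duration(pattern):
    """Total playing time of a pattern, in milliseconds."""
    return sum(pattern)
```

For the patterns to work as earcons, each must be distinguishable from the others by feel alone, just as different ringtones are by ear.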

Mobile devices
Phone input:
– Touch screens (see previous lecture)
– Accelerometer: shaking actions (inconsistent interactions, high error rates)
– Passive input: GPS; altimeter, temperature and humidity sensors; specialised fitness or medical monitors (e.g. the Fitbit Flex with its sleep tracker)
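Shake input is typically detected by thresholding the accelerometer's magnitude; the high error rates come from picking thresholds that separate deliberate shakes from ordinary movement. A minimal sketch (the threshold and peak count are illustrative, not from any particular phone API):

```python
import math

def is_shake(samples, threshold_g=2.0, min_peaks=3):
    """Classify a 'shake' gesture from raw accelerometer samples.

    samples: sequence of (x, y, z) accelerations in g.
    Counts how often the acceleration magnitude exceeds threshold_g;
    enough peaks in the window means the device was shaken.
    """
    peaks = sum(1 for (x, y, z) in samples
                if math.sqrt(x * x + y * y + z * z) > threshold_g)
    return peaks >= min_peaks
```

A device at rest reads about 1 g (gravity alone), so the threshold must sit well above that but below what vigorous walking produces, which is exactly where the inconsistency creeps in.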

Using Haptics in Interaction Design
ImmersiveTouch™ high-fidelity surgical simulators:
– 3D view with high-resolution graphics
– Realistic surgical instrument attached to a force feedback controller

Using Haptics in Interaction Design
Medical uses:
– The surgeon controls a 'robot' with a zoomed view and automated enhancements over manual surgery (e.g. greater range of motion than the human wrist, tremor reduction)
– The most famous is the da Vinci Surgical System: over 200,000 operations in 2012, mostly prostate, uterine and heart valve procedures (i.e. delicate work)

Using Haptics in Interaction Design
The GuideCane (Ulrich and Borenstein, 2001)

Force Feedback Displays
Manipulator gloves (CyberGlove II, CyberGrasp, CyberForce):
– Motion capture
– Force feedback: you can 'hang your hand' on a virtual steering wheel
– Feel the size and shape of a virtual object

Desktop Haptic Devices
SensAble PHANTOM (now by GeoMagic):
– The closest thing to a commodity force feedback tool
– PHANTOM Premium: 6 degrees of freedom (3 translational, 3 rotational)
– McSig: Beryl's work with visually impaired users
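Force feedback devices of this kind commonly render contact with a virtual surface using a penalty method: when the stylus tip penetrates the surface, the device pushes back with a spring force proportional to penetration depth. A minimal one-dimensional sketch (the stiffness value is illustrative):

```python
def haptic_force(tip_z, surface_z=0.0, stiffness=800.0):
    """Penalty-method haptic rendering in one dimension.

    When the stylus tip penetrates the virtual surface (tip_z below
    surface_z), push back with a spring force F = k * penetration;
    otherwise the device renders free space (zero force).
    Force is in newtons if stiffness is in N/m and positions in metres.
    """
    penetration = surface_z - tip_z
    return stiffness * penetration if penetration > 0 else 0.0
```

In a real device this loop runs at around 1 kHz; higher stiffness makes surfaces feel harder but risks instability if the update rate cannot keep up.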

Eye tracking
(e.g. the GP3 eye tracker by Gazepoint)
The most consumer-friendly modern method uses infrared (IR) light reflected off different parts of the eye to detect the angle of gaze:
– The 1st to 4th Purkinje images come from the outer and inner surfaces of the cornea (1, 2) and the outer and inner surfaces of the lens (3, 4)
– Measurements of these angles from multiple locations, combined with a measure of head position, allow estimation of the gaze point on the screen*

* Chi Jian-nan, Zhang Peng-yi, Zheng Si-yi, Zhang Chuang and Huang Ying, 'Key Techniques of Eye Gaze Tracking Based on Pupil Corneal Reflection', IEEE Intelligent Systems, 2009
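Once the pupil-to-corneal-reflection vector is measured, turning it into a screen position is a calibration problem. Real trackers fit a higher-order polynomial over many calibration points and compensate for head movement; the linear sketch below, with invented calibration values, just conveys the idea:

```python
def gaze_to_screen(vec, calib):
    """Map a pupil-to-corneal-reflection vector to screen coordinates.

    calib = ((v_topleft, v_bottomright), (width, height)): the vectors
    measured while the user fixated the two opposite screen corners.
    Linear interpolation between those corners estimates the gaze point.
    """
    (v0, v1), (w, h) = calib
    sx = (vec[0] - v0[0]) / (v1[0] - v0[0]) * w
    sy = (vec[1] - v0[1]) / (v1[1] - v0[1]) * h
    return (sx, sy)
```

This is why eye trackers make you stare at a sequence of dots before use: those fixations supply the known (vector, screen point) pairs that the mapping is fitted to.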

Eye tracking applications
Using eye tracking to estimate the gaze point over time provides rich insight into how users consume a visual presentation:
– E.g. what do they look at on a Web page, and for how long?
It is rather more limited as an input method:
– Careful control of eye gaze to act as a pointer can result in eye strain*
– But it is still useful for people with disabilities
– Dwell time can be used to indicate a click, but this is error-prone
There is room for further research in combining it with other input methods.

* edc.eng.cam.ac.uk/~pb400/Papers/4_pbiswas_JAT11a.pdf
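A dwell-time click can be sketched as follows: fire a click whenever the gaze stays within a small radius for long enough. The radius, dwell time and sample period below are illustrative; tuning them is exactly the error-prone part, since too short a dwell clicks on everything the user merely reads (the 'Midas touch' problem):

```python
import math

def dwell_clicks(gaze, radius=30.0, dwell_ms=500, sample_ms=50):
    """Turn a gaze stream into 'clicks' via dwell time.

    gaze: list of (x, y) screen samples taken every sample_ms.
    A click fires when successive samples stay within `radius` pixels
    of an anchor point for at least dwell_ms.
    """
    clicks = []
    anchor, held = None, 0
    for p in gaze:
        if anchor is not None and math.dist(p, anchor) <= radius:
            held += sample_ms
            if held >= dwell_ms:
                clicks.append(anchor)
                anchor, held = None, 0   # require a fresh dwell for the next click
        else:
            anchor, held = p, 0          # gaze moved: restart the dwell timer
    return clicks
```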

Visual gesture recognition
OpenKinect: a loosely organised ('pirate') community building an API to let the Xbox Kinect hardware be used widely
It builds a skeleton model from video and learns gestures for control:
– Works well with broad gestures and good contrast against the background
– Kinect was reasonably well received as a video game enhancer
– Other applications are being explored (e.g. allowing a surgeon to work a computer with sterile hands)
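Once a skeleton is available, a broad gesture such as a swipe can be classified from joint trajectories. A minimal sketch using only the wrist's x-coordinate (the thresholds are illustrative, not from any Kinect SDK):

```python
def is_swipe_right(wrist_x, min_travel=0.4):
    """Classify a 'swipe right' from a skeleton's wrist x-coordinates.

    wrist_x: per-frame wrist positions in metres (skeleton space).
    The hand must travel at least min_travel to the right, moving
    rightwards (or pausing) in at least 80% of frame-to-frame steps.
    """
    if len(wrist_x) < 2:
        return False
    travel = wrist_x[-1] - wrist_x[0]
    forward = sum(1 for a, b in zip(wrist_x, wrist_x[1:]) if b >= a)
    return travel >= min_travel and forward >= 0.8 * (len(wrist_x) - 1)
```

Requiring both large net travel and mostly monotonic motion is what makes broad gestures robust, and also why subtle gestures are hard: small movements fall below any threshold that still rejects noise.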

Olfactory: Odour/Smell
Smell is essentially our ability to detect specific chemical particles in the air:
– We can detect about 4,000 different smells
– These can be combined in millions of different ways
Smell is rooted very deep in our animal brain.

Smell: current research
Using sounds and smell signatures to aid recall of, and affinity with, individuals:
– A wire in the glasses heats 8 perfumes to release a scent

Yongsoon Choi, Rahul Parsani, Xavier Roman, Anshul Vikram Pandey and Adrian David Cheok, 'Sound Perfume: Building Positive Impression during Face-to-face Communication', SIGGRAPH Asia 2012 Emerging Technologies (SA '12), ACM, New York, NY, USA, Article 22, 3 pages.

Technology of Odour
Input:
– Detecting particular chemicals is possible (e.g. drug/explosive sniffers)
– Detecting the range of smells in anything like human terms is an extremely difficult task
Output:
– Manufacturing particular smells is possible (e.g. 'freshly baked cookies')
– Active generation of a range of smells is very difficult, but choosing a single smell to assert branding and positive association for a retail outlet or the like is already done
– This is actually not that different from the conventional use of perfume to create an almost-subliminal association for one's partner
– It is also similar to branding with corporate colours

Brain Computer Interaction
Detecting brain waves and interpreting them:
– From outside the skull: not very accurate
– From inside the skull: accurate, but invasive

Reading nerve signals from brain to muscles
Application: motor disabilities
– HAL-5 (Hybrid Assistive Limb), CYBERDYNE Inc.


EEG and Visually Evoked Potentials (VEP)
Electroencephalography (EEG) is the recording of electrical activity along the scalp
Patterns (in shape, or as strobing) coming into the eye can translate to measurable signals on the EEG (the VEP):
– However, there are many sources of noise, including blinking
– And it is not a rapid-response channel (one usually analyses a period of milliseconds following the onset of the visual stimulus)
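The standard way to pull a small evoked response out of noisy EEG is stimulus-locked epoch averaging: slice out the window after each stimulus onset and average the slices pointwise. A minimal sketch:

```python
def average_vep(eeg, stimulus_onsets, epoch_len):
    """Stimulus-locked averaging of EEG epochs.

    eeg: one channel as a list of samples.
    stimulus_onsets: sample indices where the visual stimulus appeared.
    Slices epoch_len samples after each onset and averages pointwise:
    background EEG and blink noise average toward zero, while the
    evoked response, being time-locked to the stimulus, survives.
    """
    epochs = [eeg[t:t + epoch_len] for t in stimulus_onsets
              if t + epoch_len <= len(eeg)]
    n = len(epochs)
    return [sum(e[i] for e in epochs) / n for i in range(epoch_len)]
```

The need to average over many stimulus repetitions is one reason VEP-based interaction is not a rapid-response channel.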

Summary
Describe haptics in terms of:
– Human perception: touch, proprioception, kinaesthetics
– Applications: surgery (training or actual), assistive technology
Describe applications of eye tracking and visual gesture recognition:
– Eye tracking: user studies, assistive technology; visual gesture recognition: games
Describe the exploration of:
– Olfactory detection and production: detection of specific chemicals is possible; production of a limited range of scents
– Brain wave detection: awkward to set up and use via EEG, but a boon to those who need it; fairly limited interaction without surgery