Partner presentation, COGAIN camp 2007, 5th September, De Montfort University, Leicester, UK. Faculty of Mathematics and Natural Sciences, Institute of Psychology.

Presentation transcript:

Partner presentation, COGAIN camp 2007, 5th September, De Montfort University, Leicester, UK
Faculty of Mathematics and Natural Sciences, Institute of Psychology III
Applied Cognitive Research Unit, TU Dresden
Leader: Prof. Boris M. Velichkovsky

Neurocognitive investigations of human eye movements: Interaction between levels
Applied Cognitive Research Unit
Professor: Boris M. Velichkovsky
Lab Leader: Dr. Sebastian Pannasch
Members: Sven-Thomas Graupner, Jens Helmert, Michael Heubner, Robert Lange, Johannes Marx, Romy Müller, Fiona Mulvey, Franziska Schrammel, Sascha Weber

Neurocognitive investigations of human eye movements: Interaction between levels
Current Research Interests
1. Theoretical and Basic Research
Two visual systems and their eye movements
– Parameters of eye movements as a means of identifying processing in ambient and focal modes.
– Modes of processing in static and dynamic stimuli.

Neurocognitive investigations of human eye movements: Interaction between levels
Functional differences of fixations
1. Ambient
– Processing is related to simple visual properties such as motion, contrast and location.
– Characterised by short fixations accompanied by long saccades.
2. Focal
– Processing is responsible for object identification and its semantic categorisation.
– Characterised by long fixations accompanied by short saccades.
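As a rough illustration of this distinction, the sketch below labels each fixation as ambient or focal from its duration and the amplitude of the following saccade. The data structure and the 180 ms / 5° thresholds are assumptions chosen for the example, not values stated on the slide.

```python
# Illustrative sketch: label fixations as "ambient" or "focal" from fixation
# duration and the amplitude of the saccade that follows each fixation.
# The 180 ms and 5 deg thresholds are assumptions for this example only.

from dataclasses import dataclass

@dataclass
class Fixation:
    duration_ms: float        # fixation duration
    next_saccade_deg: float   # amplitude of the following saccade

def classify_mode(fix: Fixation,
                  dur_threshold_ms: float = 180.0,
                  amp_threshold_deg: float = 5.0) -> str:
    """Short fixations followed by large saccades -> ambient;
    long fixations followed by small saccades -> focal;
    anything in between is left unclassified."""
    if fix.duration_ms < dur_threshold_ms and fix.next_saccade_deg > amp_threshold_deg:
        return "ambient"
    if fix.duration_ms >= dur_threshold_ms and fix.next_saccade_deg <= amp_threshold_deg:
        return "focal"
    return "unclassified"

if __name__ == "__main__":
    scanpath = [Fixation(120, 8.2), Fixation(310, 1.4), Fixation(95, 6.0)]
    print([classify_mode(f) for f in scanpath])  # ['ambient', 'focal', 'ambient']
```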

Neurocognitive investigations of human eye movements: Interaction between levels
PERCEPT – a European NEST PATHFINDER project (led by our team)
– Objectification of how we perceive complex visual material.
– Integrates information from fMRI, NIRS, EEG/ERP and MEG with new approaches to parallel eye-movement analysis using EFRP and FIBER techniques.
– Attentive, intentional and emotional states are visualised as an interpretation map.
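To illustrate the EFRP part of this pipeline (eye fixation-related potentials), the sketch below averages EEG epochs time-locked to fixation onsets. The array layout, argument names and epoch window are assumptions; this is not the project's actual analysis code.

```python
import numpy as np

def fixation_related_potential(eeg: np.ndarray,
                               fixation_onsets: np.ndarray,
                               sfreq: float,
                               tmin: float = -0.1,
                               tmax: float = 0.4) -> np.ndarray:
    """Average EEG epochs time-locked to fixation onsets (a basic EFRP).

    eeg             : (n_channels, n_samples) array
    fixation_onsets : sample indices of fixation onsets
    sfreq           : sampling rate in Hz
    tmin, tmax      : epoch window in seconds (tmin assumed negative)
    """
    pre, post = int(round(tmin * sfreq)), int(round(tmax * sfreq))
    epochs = []
    for onset in fixation_onsets:
        start, stop = int(onset) + pre, int(onset) + post
        if start < 0 or stop > eeg.shape[1]:
            continue                          # skip epochs running off the recording
        epoch = eeg[:, start:stop]
        # baseline-correct with the pre-fixation interval
        epoch = epoch - epoch[:, :-pre].mean(axis=1, keepdims=True)
        epochs.append(epoch)
    if not epochs:
        raise ValueError("no complete epochs around the given fixation onsets")
    return np.mean(epochs, axis=0)
```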

Neurocognitive investigations of human eye movements: Interaction between levels
Visual distractors
– A visual change leads to a prolongation of fixation duration.
– Distractor effect: oculomotor reflex or orienting response?
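The basic comparison behind this effect can be sketched as the mean duration of fixations during which a visual change occurred versus undisturbed fixations. The record format and the numbers below are invented purely for illustration.

```python
from statistics import mean

# Toy fixation records (duration in ms, distractor present during the fixation?).
fixations = [(245, True), (198, False), (310, True), (205, False), (182, False)]

with_distractor = [dur for dur, hit in fixations if hit]
without_distractor = [dur for dur, hit in fixations if not hit]

# A positive difference reflects the prolongation of fixation duration.
prolongation_ms = mean(with_distractor) - mean(without_distractor)
print(f"Mean prolongation under distractors: {prolongation_ms:.1f} ms")
```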

Neurocognitive investigations of human eye movements: Interaction between levels
Eye Movements and Social Interaction
– Virtual characters are capable of inducing physiological reactions and emotional experience in human observers.
– Visual attention was more intensely allocated to direct gaze and potentially threat-relevant facial configurations.
– The relationship between the character's expression, the observer's facial movements and their subjective feeling suggests an empathic response.

Neurocognitive investigations of human eye movements: Interaction between levels
Investigation of individual differences and eye movements
– Individual differences in eye movements in the normal population.
– Eye movements of clinical groups, e.g. Williams Syndrome.
– Levels of processing in Williams Syndrome.
– Physiological evidence of two visual systems and their eye movements.

Neurocognitive investigations of human eye movements: Interaction between levels
2. Practical Applications of Eye Tracking
Therapeutic applications: COGAIN, amblyopia.
Amblyopia (in conjunction with AIF):
– eye tracking while viewing a 3D environment
– shutter glasses to present visual information selectively to each eye
– therapeutic effect for sufferers of amblyopia

Neurocognitive investigations of human eye movements: Interaction between levels
Industrial Partners
– BMW: investigation of attention in dynamic environments; attention and hazardous events.
– DaimlerChrysler: virtual environment for video conferencing.

Neurocognitive investigations of human eye movements: Interaction between levels
Two visual systems in dynamic environments
Are there changes in eye movement parameters relative to hazard events?
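One way to frame this question in analysis terms is sketched below: compare mean fixation duration in a short window before and after each hazard event. The variable names and the 2 s window are assumptions for the example, not the study's actual parameters.

```python
import numpy as np

def fixation_durations_around_hazards(fix_onsets_s: np.ndarray,
                                      fix_durations_ms: np.ndarray,
                                      hazard_times_s: np.ndarray,
                                      window_s: float = 2.0):
    """Mean fixation duration in the window before vs. after each hazard event."""
    before, after = [], []
    for t in hazard_times_s:
        pre_mask = (fix_onsets_s >= t - window_s) & (fix_onsets_s < t)
        post_mask = (fix_onsets_s >= t) & (fix_onsets_s < t + window_s)
        before.extend(fix_durations_ms[pre_mask])
        after.extend(fix_durations_ms[post_mask])
    return float(np.mean(before)), float(np.mean(after))
```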

Neurocognitive investigations of human eye movements: Interaction between levels
Videoconferencing (DaimlerChrysler)
– Gaze-based selection
– Joint attention

Neurocognitive investigations of human eye movements: Interaction between levels
ZOOM – Future Optical Microsystems
– Development of a gaze-controlled, see-through head-mounted display (HMD) based on a CMOS-OLED interface.
– HMD by Microvision and the EyeLink I eye tracker.
– Development of a simulation system integrating a commercially available HMD and eye tracker.
– Specification of hardware and software requirements of the gaze-controlled HMD and assessment of its usability.
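A gaze-controlled display of this kind ultimately needs a loop that reads tracker samples, smooths them, and repositions what is rendered. The sketch below shows only that idea; read_gaze_sample and render_overlay are hypothetical stand-ins, not the Microvision or EyeLink APIs.

```python
import time

def read_gaze_sample():
    """Hypothetical stand-in for the eye tracker API.
    Returns gaze position (x, y) in normalised display coordinates [0, 1]."""
    return 0.5, 0.5

def render_overlay(x, y):
    """Hypothetical stand-in for the HMD rendering call."""
    print(f"overlay at ({x:.2f}, {y:.2f})")

def gaze_controlled_loop(n_frames=600, alpha=0.3, hz=60.0):
    """Move a display element with the gaze, smoothing samples to reduce jitter."""
    sx = sy = 0.5                            # start at the display centre
    for _ in range(n_frames):
        x, y = read_gaze_sample()
        sx = alpha * x + (1 - alpha) * sx    # exponential smoothing
        sy = alpha * y + (1 - alpha) * sy
        render_overlay(sx, sy)
        time.sleep(1.0 / hz)
```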

Neurocognitive investigations of human eye movements: Interaction between levels
Collaborative Research Possibilities
– We are open to collaboration on basic research into the theoretical aspects of attention and cognition, both in the normal population and in clinical groups.
– We also welcome collaboration on practical applications of eye-tracking technology, particularly usability and human-computer interfaces.
– We particularly invite COGAIN partners to collaborate with us on establishing a Technical Committee (TC) within the CIE to investigate safety issues in eye tracking.

Neurocognitive investigations of human eye movements: Interaction between levels
Thanks for your attention!