Sketch-based interface on a handheld augmented reality system
Rhys Moyne, Honours Minor Thesis
Supervisor: Dr. Christian Sandor

TINT

Augmented Reality (AR)
- Adding virtual information to the real world
- Aids the user in understanding the world
- Merges the real world (camera image) and virtual objects
- Tracking is needed to calculate the location of objects and the user in the environment

AR Display Technologies
- Head-mounted
- Handheld

Motivation
- Handheld AR systems require different ways of interaction due to size constraints
- Existing methods such as pinch gloves or wrist pads are not suitable
- Future AR devices such as mobile phones are likely to have touch screens
- Sketch-based input in handheld augmented reality is largely unexplored

Direct Manipulation and Sketch-based Interface

Direct Manipulation

- User actions affect the object immediately
- Example: driving a car. The user turns the steering wheel left and the wheel moves left; there is no "TURN LEFT" command
- "Matching user's gestures with the observed virtual motion" (Dragicevic et al., p. 2)
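
To make the idea concrete, here is a minimal sketch of direct manipulation on a touch screen (the class and event names are hypothetical, not the TINT API): every touch-move event updates the virtual object straight away, so there is no separate command step.

    # A minimal direct-manipulation sketch (hypothetical names, not the TINT API):
    # each touch-move event updates the virtual object's position immediately,
    # so the object follows the finger with no explicit "move" command.

    class DraggableObject:
        def __init__(self, x=0.0, y=0.0):
            self.x = x
            self.y = y

    class DirectManipulation:
        def __init__(self, obj):
            self.obj = obj
            self.last = None                 # last touch position, or None

        def on_touch_down(self, x, y):
            self.last = (x, y)

        def on_touch_move(self, x, y):
            if self.last is None:
                return
            dx, dy = x - self.last[0], y - self.last[1]
            self.obj.x += dx                 # the object moves as the finger moves
            self.obj.y += dy
            self.last = (x, y)

        def on_touch_up(self, x, y):
            self.last = None

    # Example: dragging from (10, 10) to (15, 12) leaves the cube at (5.0, 2.0).
    cube = DraggableObject()
    drag = DirectManipulation(cube)
    drag.on_touch_down(10, 10)
    drag.on_touch_move(15, 12)
    drag.on_touch_up(15, 12)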

Sketch-based interface

- Allows the user to interact directly, similar to pen and paper
- More natural
- Has been explored in areas such as modelling, animation, and user interface prototyping
- Gestures
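
As a rough illustration of how such an interface might capture input (generic code, not a specific toolkit's API), a pen or finger stroke can be recorded as an ordered list of (x, y) points; a point list of this kind is exactly the input the $1 gesture recognizer described later operates on.

    # Generic stroke capture for a sketch-based interface (illustrative only;
    # the event names are assumptions, not a specific toolkit's API).

    class StrokeRecorder:
        def __init__(self):
            self.strokes = []      # finished strokes, each a list of (x, y) points
            self.current = None    # stroke currently being drawn, or None

        def pen_down(self, x, y):
            self.current = [(x, y)]

        def pen_move(self, x, y):
            if self.current is not None:
                self.current.append((x, y))

        def pen_up(self, x, y):
            if self.current is not None:
                self.current.append((x, y))
                self.strokes.append(self.current)
                self.current = None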

Research Question
Is a sketch-based interface a suitable interaction method in a handheld augmented reality system?

Research Approach
- Literature review
- Iterative prototypes
- Informal qualitative feedback on techniques (user study)

Development Approach
- Create a demo illustrating interactive exploration in a simple test scene
- Make use of the TINT framework
  - Add tracking
  - Add animation
- Direct manipulation interface (complete by 18 September)
  - More complicated physics (18 September)
  - Gesture recognition (18 September)
- Port to mobile augmented reality (30 September)
- Projection of camera image onto virtual objects

Demo
- This can be used as a base to control visualisations in TINT
- Illustrates how objects can be controlled in order to explore them

TINT (This is not TINMITH)
- TINMITH: HMD prototyping platform
- TINT: handheld augmented reality prototyping platform
- Used to prototype AR applications that may be possible on future mobile phones
- Written in Python, allowing fast development

Implementation

ARToolkit Tracking
- Calculates where the marker is relative to the camera position
- Uses computer vision techniques to identify the marker
- Allows the virtual objects to be placed in the correct location
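
One way to picture the tracker's role (an illustrative NumPy-only sketch, not the ARToolkit API itself) is as producing a 4x4 marker-to-camera transform each frame; applying that transform to a point defined in marker coordinates gives the camera-space position at which the corresponding virtual geometry should be drawn.

    import numpy as np

    # Illustrative placement of a virtual point using a marker pose, assumed to be
    # delivered by the tracker as a 4x4 homogeneous marker-to-camera transform.
    # This is NumPy-only sketch code, not the ARToolkit API.

    def place_in_camera(marker_to_camera, point_on_marker):
        """Transform a 3D point from marker coordinates into camera coordinates."""
        p = np.append(point_on_marker, 1.0)       # homogeneous coordinates
        return (marker_to_camera @ p)[:3]

    # Example: marker detected 0.5 m straight in front of the camera, unrotated.
    marker_to_camera = np.array([
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.5],
        [0.0, 0.0, 0.0, 1.0],
    ])

    # A virtual cube corner 5 cm above the marker centre ends up 5 cm above the
    # marker in camera space, so it can be rendered at the correct image location.
    corner = place_in_camera(marker_to_camera, np.array([0.0, 0.05, 0.0]))
    print(corner)    # prints something like [0.   0.05 0.5 ]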

Compiz Physics

$1 Gesture Recognizer
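
The $1 recognizer (Wobbrock et al. 2007) is simple enough to sketch in plain Python. The following is a simplified version of its core idea, not the sleepygeek.org implementation used in this project: resample the stroke to a fixed number of points, rotate so the indicative angle is zero, scale to a reference square, translate the centroid to the origin, and return the stored template with the smallest mean point-to-point distance (the full algorithm additionally searches over rotations with a golden section search).

    import math

    N = 64                    # points per resampled gesture
    SQUARE_SIZE = 250.0       # side of the reference square used for scaling

    def path_length(points):
        return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

    def resample(points, n=N):
        # Space n points evenly along the stroke's path.
        interval = path_length(points) / (n - 1)
        pts = list(points)
        new_points = [pts[0]]
        accumulated = 0.0
        i = 1
        while i < len(pts):
            d = math.dist(pts[i - 1], pts[i])
            if d > 0 and accumulated + d >= interval:
                t = (interval - accumulated) / d
                q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                     pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
                new_points.append(q)
                pts.insert(i, q)          # q becomes the start of the next segment
                accumulated = 0.0
            else:
                accumulated += d
            i += 1
        while len(new_points) < n:        # guard against floating-point shortfall
            new_points.append(pts[-1])
        return new_points[:n]

    def centroid(points):
        return (sum(x for x, _ in points) / len(points),
                sum(y for _, y in points) / len(points))

    def rotate_to_zero(points):
        # Rotate about the centroid so the angle to the first point is zero.
        cx, cy = centroid(points)
        angle = math.atan2(points[0][1] - cy, points[0][0] - cx)
        cos_a, sin_a = math.cos(-angle), math.sin(-angle)
        return [((x - cx) * cos_a - (y - cy) * sin_a + cx,
                 (x - cx) * sin_a + (y - cy) * cos_a + cy) for x, y in points]

    def scale_and_translate(points):
        # Scale to the reference square, then move the centroid to the origin.
        xs, ys = [x for x, _ in points], [y for _, y in points]
        w = (max(xs) - min(xs)) or 1.0
        h = (max(ys) - min(ys)) or 1.0
        scaled = [(x * SQUARE_SIZE / w, y * SQUARE_SIZE / h) for x, y in points]
        cx, cy = centroid(scaled)
        return [(x - cx, y - cy) for x, y in scaled]

    def normalise(points):
        return scale_and_translate(rotate_to_zero(resample(points)))

    def recognise(stroke, templates):
        # Return the template name with the smallest mean point-to-point distance.
        candidate = normalise(stroke)
        best_name, best_dist = None, float("inf")
        for name, template in templates.items():
            dist = sum(math.dist(a, b) for a, b in zip(candidate, template)) / N
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name

    # Templates are built from recorded example strokes, e.g.:
    # templates = {"circle": normalise(circle_stroke), "pigtail": normalise(pigtail_stroke)}
    # recognise(new_stroke, templates)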

Summary
How to interactively control visualisations in handheld augmented reality?

Thanks, Questions?

References
Compiz.org 2009, 'Compiz', accessed 3 September.
Dragicevic, P, Ramos, G, Bibliowitcz, J, Nowrouzezahrai, D, Balakrishnan, R & Singh, K 2008, 'Video browsing by direct manipulation', in Proceedings of the Twenty-Sixth Annual SIGCHI Conference on Human Factors in Computing Systems, ACM, Florence, Italy, pp. 237-246.
Igarashi, T, Matsuoka, S & Tanaka, H 1999, 'Teddy: a sketching interface for 3D freeform design', in Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques, ACM, pp. 409-416.
Kato, H & Billinghurst, M 1999, 'Marker tracking and HMD calibration for a video-based augmented reality conferencing system', in Proceedings of the International Workshop on Augmented Reality (IWAR '99), IEEE, San Francisco, CA, pp. 85-94.
Piekarski, W & Thomas, BH 2001, 'Tinmith-Metro: new outdoor techniques for creating city models with an augmented reality wearable computer', in Proceedings of the Fifth International Symposium on Wearable Computers, IEEE, Zurich, pp. 31-38.
Sandor, C, Cunningham, A, Eck, U, Urquhart, D, Jarvis, D, Dey, A, Barbier, S, Marner, M & Rhee, S 2009, 'Egocentric space-distorting visualizations for rapid environment exploration in mobile mixed reality', in 8th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR '09), ACM, Orlando, Florida.
Shneiderman, B 1983, 'Direct manipulation: a step beyond programming languages', Computer, vol. 16, no. 8.
sleepygeek.org n.d., '$1 gesture recognizer in Python', accessed 3 September.
Wobbrock, JO, Wilson, AD & Li, Y 2007, 'Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes', in Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology, ACM, Newport, Rhode Island, pp. 159-168.