Multi-device Organic 3D Sculpting through Natural User Interface Gestures
BSci Honours – Bradley Wesson
Supervisor – Brett Wilkinson



Research focus
Investigate an efficient way to navigate and manipulate a 3D scene using natural user interface gestures.

Product – Organic 3D Sculptor
Real-life analogue:
- Familiar interaction modes
- Transferable skills
- Pick up and play

Prior Research
Natural user interfaces and multi-interface systems:
- JerkTilts: using accelerometers for eight-choice selection on mobile devices
- Grab-carry-release: manipulating physical objects in a real scene through a smart phone
- Pointable: an in-air pointing technique to manipulate out-of-reach targets on tabletops
- Virtual sensors: rapid prototyping of ubiquitous interaction with a mobile phone and a Kinect
Technical research:
- Marching Cubes (polygonal mesh extraction)
- Meta-balls
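As background for the Marching Cubes and meta-ball items above: a meta-ball model defines a scalar field as the sum of per-ball contributions, and marching cubes extracts the iso-surface where that field crosses a threshold. A minimal sketch of the field evaluation, with the ball positions, radii, threshold, and inverse-square falloff chosen purely for illustration (not taken from the project):

```python
import math

# Illustrative metaball set: (centre, radius) pairs.
BALLS = [((0.0, 0.0, 0.0), 1.0), ((1.5, 0.0, 0.0), 1.0)]
THRESHOLD = 0.5  # iso-level at which marching cubes would extract the surface

def field(p):
    """Sum of inverse-square contributions from every metaball."""
    total = 0.0
    for (cx, cy, cz), r in BALLS:
        d2 = (p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2
        if d2 > 1e-12:  # skip the singularity at a ball's exact centre
            total += (r * r) / d2
    return total

def inside(p):
    """A point is 'inside' the blended surface when the field exceeds the iso-level."""
    return field(p) >= THRESHOLD
```

Because the field is a sum, nearby balls blend smoothly into a single organic surface, which is why meta-balls suit organic sculpting.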

Devices and Interfaces
Kinect:
- Skeletal tracking
Phone:
- Hand orientation
- Direct input through touch-screen
- Viewport into 3D world
“using a stylus is faster and more accurate than using finger touch” (An Investigation of Finger versus Stylus Input in Medical Scenarios)

Implementation
Sensor fusion [1]:
- Use the accelerometer, gyroscope, and compass together
- Twitch movements tracked by the phone; drift correction from the Kinect
Oct-tree of voxels:
- Volume pixels (voxels) – well suited to complex, dynamic models
- Variable level of detail
GPU ray tracing:
- Well suited to voxels
- Efficient skipping of empty oct-tree nodes
1. Sensor Fusion on Android Devices: A Revolution in Motion Processing (33:20)
Figure 1: Oct-tree
Figure 2: Metaballs
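The sensor-fusion idea above (fast gyroscope integration for twitch movements, slow correction toward an absolute reference such as the compass or a Kinect-derived estimate) is commonly sketched as a complementary filter. A minimal one-axis sketch, with the class name and the alpha constant chosen for illustration rather than taken from the project:

```python
class ComplementaryFilter:
    """One-axis complementary filter: fast gyro integration, slow drift
    correction toward an absolute heading (e.g. compass or Kinect estimate)."""

    def __init__(self, alpha=0.98):
        self.alpha = alpha  # weight on the integrated gyro path (near 1.0)
        self.angle = 0.0    # fused heading estimate, degrees

    def update(self, gyro_rate, absolute_angle, dt):
        # Gyro integration is responsive but drifts over time.
        integrated = self.angle + gyro_rate * dt
        # Blend: mostly trust the gyro, but let the absolute reference
        # slowly pull accumulated drift back out.
        self.angle = self.alpha * integrated + (1.0 - self.alpha) * absolute_angle
        return self.angle
```

With a high alpha, a sudden wrist twitch shows up almost immediately through the gyro term, while a constant bias is bled off by the absolute reference over a second or two.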

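The "efficient skipping of empty oct-tree nodes" can be illustrated with a small sparse-octree sketch: empty children are simply absent, so any traversal (a GPU ray marcher included) rejects an empty region in a single test instead of visiting its voxels. All names and the CPU-side recursive form are illustrative assumptions, not the project's implementation:

```python
class OctreeNode:
    """Sparse voxel octree node: up to eight children; empty children stay
    None so a traversal can skip whole empty regions in O(1)."""

    def __init__(self):
        self.children = [None] * 8
        self.filled = False  # leaf payload: voxel occupancy

def insert_voxel(node, x, y, z, depth):
    """Descend to `depth` levels, creating children along the path,
    then mark the leaf as filled. Coordinates lie in [0, 2**depth)."""
    if depth == 0:
        node.filled = True
        return
    half = 1 << (depth - 1)
    index = ((x >= half) << 2) | ((y >= half) << 1) | (z >= half)
    if node.children[index] is None:
        node.children[index] = OctreeNode()
    insert_voxel(node.children[index], x % half, y % half, z % half, depth - 1)

def occupied(node, x, y, z, depth):
    """Point query: a None child means 'nothing anywhere below here',
    so the whole subtree is skipped without descending."""
    if node is None:
        return False
    if depth == 0:
        return node.filled
    half = 1 << (depth - 1)
    index = ((x >= half) << 2) | ((y >= half) << 1) | (z >= half)
    return occupied(node.children[index], x % half, y % half, z % half, depth - 1)
```

The same early-out applies per ray step in a ray tracer: an empty node lets the ray advance past the node's whole bounding box at once, which is what makes voxel ray tracing practical at variable levels of detail.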
Outcomes
Qualitative measures:
- Model quality
- User comfort
- User evaluation
Quantitative measures:
- Speed
- Errors made
- NASA Task Load Index (TLX)
- Accuracy
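For the NASA TLX measure above: the commonly used "raw TLX" variant scores a task as the unweighted mean of six 0–100 subscale ratings (the full TLX adds pairwise-comparison weights on top). A minimal scoring sketch, with function and key names chosen for illustration:

```python
# Raw TLX (RTLX): unweighted mean of the six 0-100 subscale ratings.
SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def raw_tlx(ratings):
    """ratings: dict mapping each subscale name to a 0-100 rating."""
    missing = [s for s in SUBSCALES if s not in ratings]
    if missing:
        raise ValueError(f"missing subscales: {missing}")
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)
```

A per-participant, per-technique RTLX score then allows the interaction techniques to be compared on perceived workload alongside the speed and error measures.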

Success/Failure Criteria
- An accurate and natural way to manipulate 3D scenes
- More accurate and less laggy than existing approaches
- Exploration of the complexity of the interaction techniques: does the combination of orientation, touch, gestural, and voice interactions provide greater control of an application, or is it too complex to be usable?

Demo

Timeline

Fall-back
Require users to orient and position a pre-generated 3D model in a scene.
Kinect gestures:
- Two hands to rotate or dolly
Kinect + phone gestures:
- Model attached to the hand node, oriented with the phone
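The "two hands to rotate or dolly" gesture can be sketched from tracked hand positions: the change in the angle of the inter-hand vector gives a rotation delta, and the change in hand separation gives a dolly (zoom) factor. A minimal 2-D sketch, with the frame and axis conventions assumed for illustration rather than taken from the project:

```python
import math

def two_hand_gesture(prev_left, prev_right, left, right):
    """Derive (yaw_delta in radians, dolly scale factor) from the previous
    and current (x, y) positions of two tracked hands."""
    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    # Rotation: how much the left->right vector has turned between frames.
    yaw_delta = angle(left, right) - angle(prev_left, prev_right)
    # Dolly: ratio of hand separations; spreading hands apart zooms in.
    prev_sep = dist(prev_left, prev_right)
    dolly = dist(left, right) / prev_sep if prev_sep > 1e-6 else 1.0
    return yaw_delta, dolly
```

Feeding these deltas into the scene camera each frame gives the rotate/dolly behaviour without any sculpting logic, which is what makes this a low-risk fall-back.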

Questions
Multi-device Organic 3D Sculpting through Natural User Interface Gestures