Virtual Image Peephole By Kyle Patience Supervisor: Reg Dodds Co-Supervisor: Mehrdad Ghaziasgar

Quick Recap
Development of an interface that does the following: a very large virtual screen is imagined to exist, and the mobile phone screen is used as a "peephole" into that interface.
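The peephole idea can be summarised as a small viewport that is translated across a much larger virtual canvas. The sketch below is only a minimal illustration of that idea; the Viewport type, the screen and canvas sizes, and the clamping behaviour are assumptions for illustration, not the project's actual design.

```cpp
#include <algorithm>

// Minimal sketch of the "peephole" concept: the phone screen is a small,
// movable viewport into a much larger virtual screen. All names and sizes
// here are illustrative assumptions.
struct Viewport {
    float x = 0.0f, y = 0.0f;                    // top-left corner of the peephole (virtual-screen pixels)
    float width = 1080.0f, height = 1920.0f;     // assumed phone screen size
    float canvasW = 8000.0f, canvasH = 8000.0f;  // assumed size of the virtual screen

    // Pan the peephole by (dx, dy), clamped so it stays inside the virtual screen.
    void pan(float dx, float dy) {
        x = std::clamp(x + dx, 0.0f, canvasW - width);
        y = std::clamp(y + dy, 0.0f, canvasH - height);
    }
};
```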

Overview
Design Decisions and System Changes
Implementation
Finding and Tracking Features
User Interface
Tools

Design Decisions and System Changes
Previously: no NDK. Currently: uses the NDK.
Previously: used the Android camera API. Currently: uses the device's camera natively.
Previously: did not use sensor readings. Currently: uses both the accelerometer and the gyroscope.
Previously: the display was made of modified camera frames. Currently: a 3D interface.

NDK Implementation
Get frames → Process frames → Get sensor readings → Output
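As a rough illustration of what the "process frames" step could look like on the native side, the sketch below follows a common OpenCV-for-Android pattern in which the native address of a cv::Mat holding the camera frame is passed from Java through JNI into C++ code built with the NDK. The package, class, and method names are hypothetical; this is not the project's actual code.

```cpp
#include <jni.h>
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

// Hypothetical JNI entry point: the Java side passes the native address of a
// cv::Mat containing the current camera frame; the C++ side converts it to
// grayscale so it can be handed to the feature-tracking step.
extern "C" JNIEXPORT void JNICALL
Java_com_example_peephole_FrameProcessor_processFrame(JNIEnv* /*env*/, jobject /*self*/,
                                                      jlong frameAddr) {
    cv::Mat& frame = *reinterpret_cast<cv::Mat*>(frameAddr);
    cv::Mat gray;
    cv::cvtColor(frame, gray, cv::COLOR_RGBA2GRAY);
    // ... feature detection / tracking on 'gray' would follow here ...
}
```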

Finding and Tracking Features
The Shi-Tomasi corner detector: based on the Harris corner detector, it looks for small patches of the image that produce a large variation in intensity when shifted in any direction, and so ultimately finds small corners in a frame.
The Lucas-Kanade method: a technique that estimates the movement of selected features across successive images of a scene. The algorithm makes a "best guess" of the displacement of a neighbourhood by looking at changes in pixel intensity, which are obtained from the intensity gradients of the image in that neighbourhood.
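OpenCV exposes both steps directly, so a minimal sketch of the detect-then-track loop could look like the following. The function name and parameter values (corner count, quality level, minimum spacing) are illustrative assumptions, not the project's actual settings.

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/video/tracking.hpp>
#include <vector>

// Detect Shi-Tomasi corners in the previous frame and track them into the
// current frame with pyramidal Lucas-Kanade. Leaves matched point pairs in
// prevPts/currPts.
void detectAndTrack(const cv::Mat& prevGray, const cv::Mat& currGray,
                    std::vector<cv::Point2f>& prevPts,
                    std::vector<cv::Point2f>& currPts) {
    // Shi-Tomasi ("good features to track"): up to 200 corners whose quality is
    // at least 1% of the strongest corner, spaced at least 10 px apart.
    cv::goodFeaturesToTrack(prevGray, prevPts, 200, 0.01, 10);
    if (prevPts.empty()) return;

    // Pyramidal Lucas-Kanade: estimate where each corner moved in currGray.
    std::vector<unsigned char> status;
    std::vector<float> err;
    cv::calcOpticalFlowPyrLK(prevGray, currGray, prevPts, currPts, status, err);

    // Keep only the points that were tracked successfully.
    std::vector<cv::Point2f> keptPrev, keptCurr;
    for (size_t i = 0; i < status.size(); ++i) {
        if (status[i]) {
            keptPrev.push_back(prevPts[i]);
            keptCurr.push_back(currPts[i]);
        }
    }
    prevPts = keptPrev;
    currPts = keptCurr;
}
```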

User Interface using camera frames
Moving the device without tilting it moves the interface.
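One way to turn the tracked features into interface motion is to average their frame-to-frame displacement and pan the view against it. The helper below is only a sketch of that idea; the function name is hypothetical, and averaging is an assumed choice (a median or a robust fit could equally be used).

```cpp
#include <opencv2/core.hpp>
#include <vector>

// Sketch: reduce tracked feature motion to a single pan offset for the
// interface. The returned offset is in image pixels; how it is scaled onto the
// virtual screen is a separate, implementation-specific choice.
cv::Point2f panOffsetFromFlow(const std::vector<cv::Point2f>& prevPts,
                              const std::vector<cv::Point2f>& currPts) {
    if (prevPts.empty() || prevPts.size() != currPts.size())
        return cv::Point2f(0.0f, 0.0f);

    cv::Point2f mean(0.0f, 0.0f);
    for (size_t i = 0; i < prevPts.size(); ++i)
        mean += currPts[i] - prevPts[i];
    mean *= (1.0f / static_cast<float>(prevPts.size()));

    // The scene appears to move opposite to the device, so pan against the flow.
    return cv::Point2f(-mean.x, -mean.y);
}
```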

User Interface using the gyroscope
Tilting the phone to the left; tilting the phone to the right.

User Interface using the accelerometer
Tilting the phone upwards; tilting the phone downwards.
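A simple way to combine the two sensors, consistent with the two slides above, is to map gyroscope rotation to horizontal panning and accelerometer-derived pitch to vertical panning. The sketch below shows one such mapping; the axis choices, gain constants, and function name are assumptions for illustration, with the raw readings assumed to arrive from the Android sensor callbacks.

```cpp
#include <cmath>

// Illustrative sensor-to-pan mapping (all constants and axis choices assumed).
// gyroY:      angular velocity around the device's y axis in rad/s (gyroscope)
// ax, ay, az: accelerometer readings in m/s^2 (gravity included)
// dt:         time since the previous reading, in seconds
// panX, panY: resulting pan of the virtual interface, in virtual-screen pixels
void sensorsToPan(float gyroY, float ax, float ay, float az, float dt,
                  float& panX, float& panY) {
    const float kGyroGain  = 600.0f;  // pixels per radian of left/right tilt (assumed)
    const float kPitchGain = 800.0f;  // pixels per second per radian of up/down tilt (assumed)

    // Gyroscope: integrate the angular velocity over this interval and map the
    // resulting change in left/right tilt directly to a horizontal pan.
    panX = gyroY * dt * kGyroGain;

    // Accelerometer: estimate pitch from gravity and treat the tilt angle as a
    // pan velocity, so the further the phone is tilted, the faster it scrolls.
    float pitch = std::atan2(-az, std::sqrt(ax * ax + ay * ay));
    panY = pitch * kPitchGain * dt;
}
```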

Tools
Platform: Windows 8.1 x64
Applications: Android Studio
SDKs: Android SDK, OpenCV Android SDK, Android NDK
Libraries: OpenCV, jPCT
Languages: Java, C++

Project Plan
Term 1: Learn OpenCV; learn Android.
Term 2: Learn the NDK; build an Android prototype; capture footage.
Term 3: Process frames; process accelerometer and gyroscope readings; create the 3D interface; integrate all components.
Term 4: Implementation, tuning and testing.

References
Ali, S. I., Jain, S., Lal, B., & Sharma, N. (2012). A framework for modelling and designing of intelligent and adaptive interfaces for human computer interaction. International Journal of Applied Information Systems (IJAIS).
Anuar, A., Saipullah, K. M., Ismail, N. A., & Soo, Y. (2011). OpenCV based real-time video processing using android smartphone. International Journal of Computer Technology and Electronics Engineering (IJCTEE), 1(3).
Shi, J., & Tomasi, C. (1994, June). Good features to track. In Proceedings of the 1994 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '94). IEEE.
Tomasi, C., & Kanade, T. (1991). Detection and tracking of point features. Pittsburgh: School of Computer Science, Carnegie Mellon University.

Questions?