Electronic Conducting System Kenzo Abrahams Supervisor: Mehrdad Ghaziasgar Co-supervisor: James Connan Mentored by: Diego Mushfieldt

Overview Introduction User Interface Specification High Level Design Low Level Design Demo

Introduction An interactive conducting system that tracks the hands using a webcam and makes real-time alterations to the music depending on hand gestures ◦ Change volume ◦ Change tempo

User Interface Specification Graphical User Interface (GUI) The user interacts using a webcam and a mouse

User Interface Specification

High Level Design The solution can be broken up into 3 parts ◦ Input ◦ Image processing ◦ Adjust music
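To show how these three parts could connect, here is a minimal sketch of a single processing loop, assuming the OpenCV C API quoted in the low level design slides; processFrame and adjustMusic are hypothetical stubs, not the project's actual functions.

    #include <opencv/cv.h>
    #include <opencv/highgui.h>

    // Hypothetical stubs for the last two parts; the real system fills these in.
    CvRect processFrame(IplImage *frame)      // image processing: locate the hand
    {
        return cvRect(0, 0, frame->width, frame->height);
    }

    void adjustMusic(CvRect hand)             // adjust music: map hand movement to volume/tempo
    {
        (void)hand;                           // placeholder
    }

    int main()
    {
        CvCapture *capture = cvCaptureFromCAM(0);     // input: open the default webcam
        while (capture && cvWaitKey(10) != 27)        // Esc quits
        {
            IplImage *frame = cvQueryFrame(capture);  // input: grab the next frame
            if (!frame) break;
            adjustMusic(processFrame(frame));         // image processing, then adjust the music
        }
        cvReleaseCapture(&capture);
        return 0;
    }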

Low Level Design

Input ◦ Frames are acquired from the webcam ◦ cvQueryFrame(capture)
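For context, a minimal capture loop around cvQueryFrame might look like the sketch below; the camera index 0 and the Esc key are assumptions, and the returned frame is owned by the capture structure, so it is not released here.

    #include <opencv/cv.h>
    #include <opencv/highgui.h>

    int main()
    {
        CvCapture *capture = cvCaptureFromCAM(0);     // open the default webcam
        if (!capture) return 1;
        cvNamedWindow("webcam", CV_WINDOW_AUTOSIZE);
        while (cvWaitKey(33) != 27)                   // roughly 30 fps; Esc quits
        {
            IplImage *frame = cvQueryFrame(capture);  // pointer is owned by the capture; do not release it
            if (!frame) break;
            cvShowImage("webcam", frame);
        }
        cvDestroyWindow("webcam");
        cvReleaseCapture(&capture);
        return 0;
    }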

Low Level Design Image Processing ◦ Convert copies of frames to HSV colour space ◦ cvCvtColor(frame, img_hsv, CV_BGR2HSV)
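A short sketch of this step, assuming 8-bit BGR frames from the webcam; the names frame and img_hsv mirror the call on the slide.

    #include <opencv/cv.h>

    // Convert a webcam frame (8-bit BGR) to the HSV colour space.
    // The caller releases the returned image with cvReleaseImage.
    IplImage *toHSV(const IplImage *frame)
    {
        IplImage *img_hsv = cvCreateImage(cvGetSize(frame), IPL_DEPTH_8U, 3);
        cvCvtColor(frame, img_hsv, CV_BGR2HSV);   // OpenCV delivers webcam frames as BGR
        return img_hsv;
    }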

Low Level Design Image Processing ◦ Skin segmentation ◦ A predefined method is used ◦ Detect whether pixels in the rectangle fall within a certain range
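The slides do not give the exact thresholds, so the HSV bounds in this sketch are illustrative assumptions only; the point is simply that pixels are kept when their hue, saturation and value fall inside a skin-like range.

    #include <opencv/cv.h>

    // Build a binary skin mask from an HSV image by in-range thresholding.
    // The bounds below are assumed values; real thresholds depend on lighting and skin tone.
    IplImage *skinMask(const IplImage *img_hsv)
    {
        IplImage *mask = cvCreateImage(cvGetSize(img_hsv), IPL_DEPTH_8U, 1);
        cvInRangeS(img_hsv,
                   cvScalar(0, 30, 60, 0),      // lower H, S, V bound (assumption)
                   cvScalar(20, 150, 255, 0),   // upper H, S, V bound (assumption)
                   mask);                       // 255 where the pixel looks like skin, 0 elsewhere
        return mask;
    }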

Low Level Design Image Processing ◦ Set the hand as the region of interest ◦ cvSetImageROI(pHueImg, pHandRect)
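A small sketch of the ROI step, with pHueImg assumed to be the hue plane and pHandRect the rectangle around the detected hand, as named on the slide.

    #include <opencv/cv.h>

    // Restrict further processing to the hand region only.
    void focusOnHand(IplImage *pHueImg, CvRect pHandRect)
    {
        cvSetImageROI(pHueImg, pHandRect);   // subsequent cv* calls see only this rectangle
        // ... build the hue histogram / back-projection of the hand here ...
        cvResetImageROI(pHueImg);            // restore the full image afterwards
    }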

Low Level Design Image Processing ◦ Perform CamShift ◦ cvCamShift(pProbImg, prevHandRect, cvTermCriteria(CV_TERMCRIT_EPS | CV_TERMCRIT_ITER, 10, 1), &components, &HandBox)
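To put the slide's call in context, a hedged sketch follows, where pProbImg is assumed to be a probability image (for example a back-projection or the skin mask) and prevHandRect is the hand window found in the previous frame.

    #include <opencv/cv.h>

    // Track the hand by running CamShift on a probability image.
    // Returns the updated search window to use for the next frame.
    CvRect trackHand(const IplImage *pProbImg, CvRect prevHandRect)
    {
        CvConnectedComp components;
        CvBox2D HandBox;
        cvCamShift(pProbImg, prevHandRect,
                   cvTermCriteria(CV_TERMCRIT_EPS | CV_TERMCRIT_ITER, 10, 1),  // at most 10 iterations or shift < 1
                   &components, &HandBox);    // HandBox also carries the hand's orientation
        return components.rect;
    }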

Low Level Design Adjust Music ◦ The RTcmix library is used to produce the music ◦ It can be embedded into C++ code ◦ load("WAVETABLE") ◦ wave = maketable("wave", 1000, "tri") ◦ WAVETABLE(start time, duration, amp, frequency, pan, wave)
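An illustrative RTcmix score assembled only from the commands quoted above, written in RTcmix's Minc score language rather than C++; the duration, amplitude, frequency and pan values are made-up examples.

    rtsetparams(44100, 2)                       // sample rate and stereo output
    load("WAVETABLE")                           // load the WAVETABLE instrument
    wave = maketable("wave", 1000, "tri")       // 1000-point triangle wavetable
    // WAVETABLE(start time, duration, amp, frequency, pan, wave)
    WAVETABLE(0, 2.0, 20000, 440.0, 0.5, wave)  // play A4 for two seconds, centred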

Demo Overview of the demonstration ◦ Track the right hand ◦ Perform the gestures that will be used ◦ Track the left hand ◦ Perform simple gestures such as raising and lowering the hand ◦ Move each hand separately

Project Plan
◦ Learn how to use OpenCV and its tools; elicit the requirements and define a designer's interpretation of the problem (Completed)
◦ From the user's requirements, design a prototype for the system (Completed)
◦ Construct the system with all its functionality present (Term 3)
◦ Test and deploy the system (Term 4)


Questions and Answers