Motion in Sound: Designing Sound for Interactive Dance Performance
Dr. Dan Hosken, Associate Professor of Music, California State University, Northridge
Motion in Sound: Designing Sound for Interactive Dance Performance Dr. Dan Hosken Associate Professor of Music California State University, Northridge Presented at: ATMI 2006 San Antonio, TX September 16, 2006

Purpose
- Present a somewhat simplified and useful approach to creating for the interactive dance medium
- Facilitate collaboration between students of dance and students of music

Objectives:
- Give an overview of the hardware and software components of a camera-based interactive dance/music system
- Present a loose taxonomy of motion parameters and mapping types
- Suggest some useful mappings between motion parameters and sound element parameters
- Illustrate these mappings using examples of my recent work with the Palindrome IMPG

General System Overview
- A camera trained on the dancer(s) is connected to a computer
- Video analysis software abstracts motion data in realtime
- Motion data are passed to sound software
- Sound software maps incoming motion data to sound element parameters in realtime

Overview w/ bad clipart [diagram: camera → video computer → Ethernet → audio computer]

Sound Generation Software
- Max/MSP (Cycling '74)
- PD (Miller Puckette), free!
- SuperCollider (J. McCartney), free!
- Reaktor (Native Instruments)
- …and any software that can receive data and produce sound in realtime

Video Analysis Software
- EyeCon (Frieder Weiss)
- EyesWeb (eyesweb.org), free!
- Jitter (Cycling '74)
- SoftVNS (David Rokeby)
- Cyclops (Eric Singer/Cycling '74)
- TapTools (Electrotap)
- cv.jit (Jean-Marc Pelletier)
- Eyes (Rob Lovel), free!

Specific System Overview
- A B/W camera (w/ IR filter) sends analog data to the video computer (w/ framegrabber board)
- EyeCon (w/ custom drivers) abstracts motion data in realtime
- Motion data is sent via Ethernet using Open Sound Control (OSC) to the music computer
- Max/MSP maps incoming motion data to suitable sound element parameters
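To make the OSC step concrete, here is a minimal sketch (not Palindrome's actual code) of how a single motion value can be packed into an OSC message for transport over Ethernet: a null-padded address, a ",f" type tag, and a big-endian 32-bit float. The address "/dancer/height" is a hypothetical example.

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC strings require."""
    b += b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def encode_osc_float(address: str, value: float) -> bytes:
    """Encode a single-float OSC message: padded address string,
    padded ',f' type-tag string, then the value as a big-endian float32."""
    return _pad(address.encode("ascii")) + _pad(b",f") + struct.pack(">f", value)

# A hypothetical motion-parameter message, e.g. normalized dancer height:
packet = encode_osc_float("/dancer/height", 0.62)
```

The resulting bytes could be sent with an ordinary UDP socket; real setups would typically use an OSC library instead of hand-packing.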

Objectives (redux):
- Give an overview of the hardware and software components of a camera-based interactive dance/music system
- Present a loose taxonomy of motion parameters and mapping types
- Suggest some useful mappings between motion parameters and sound element parameters
- Illustrate these mappings using examples of my recent work with the Palindrome IMPG

Definitions (1)
- Motion Parameter: made up of specified data abstracted from part or all of the video, e.g.,
  - Height
  - Width
  - Dynamic
- Sound Element: a distinct, coherent sonic behavior created by one or more synthesis or processing techniques, e.g.,
  - A low drone created by FM synthesis
  - Time-stretched text created by granulation
  - Percussive patterns created by sample playback

Definitions (2)
- Sound Element Parameter: a parameter of a synthesis/processing technique, e.g.,
  - Modulation frequency of a simple FM pair
  - Grain size of a granulated sound file
  - Ir/regularity of tempo in a rhythmic pattern
- Mapping: the connection between a motion parameter and a sound element parameter, e.g.,
  - Height → modulation frequency of FM
  - Width → grain size of granulated sound file
  - Dynamic → irregularity of tempo
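At their core, mappings like "Height → modulation frequency" are range-scalings: a normalized motion value is clamped and rescaled into a musically useful range. A minimal sketch (function name and ranges are my own, not from the talk):

```python
def map_range(x: float, in_lo: float, in_hi: float,
              out_lo: float, out_hi: float, curve: float = 1.0) -> float:
    """Clamp x to [in_lo, in_hi], then scale into [out_lo, out_hi].
    curve != 1.0 bends the response (useful for perceptual ranges like pitch)."""
    t = (x - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + (t ** curve) * (out_hi - out_lo)

# e.g. dancer height (normalized 0..1) driving FM modulation frequency in Hz:
mod_freq = map_range(0.5, 0.0, 1.0, 100.0, 300.0)  # -> 200.0
```

Clamping matters in practice: camera noise and tracking glitches routinely push motion values outside their nominal range.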

Definitions (3)
- Scene: a group of motion parameters, sound elements, and mappings between them

EyeCon Interface (1)
- Field: can measure height or width or dynamic or…
- Touchlines: detect crossing and position on line

EyeCon Interface (2)
- Fields and lines are mapped to MIDI data (or OSC)
- A sequencer steps through "scenes"

Taxonomy of Motion Parameters
- Body Parameters: position independent, "attached" to the body
  - Height
  - Width
  - Dynamic
- Stage Parameters: position dependent, "attached" to the stage
  - Left-right position
  - Touchlines
  - Extremely narrow fields
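The "dynamic" body parameter is commonly derived by frame differencing: counting how many pixels changed noticeably between consecutive video frames. A hypothetical sketch over grayscale frames stored as flat lists of 0-255 values (not EyeCon's actual algorithm):

```python
def dynamic(prev_frame, cur_frame, threshold=16):
    """Fraction of pixels whose grayscale value changed by more than
    `threshold` between two frames: a rough 'amount of motion' measure.
    Returns 0.0 for a still dancer, approaching 1.0 for large fast movement."""
    changed = sum(1 for a, b in zip(prev_frame, cur_frame)
                  if abs(a - b) > threshold)
    return changed / len(cur_frame)
```

The threshold suppresses sensor noise; restricting the pixel range to a field's bounding box would give a per-field dynamic as in EyeCon.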

Parameter Type Examples
- Stage Parameter (position): Scene 3 from Brother-Sister Solo
  - Julia Eisele, dancer/choreographer
  - Stuttgart, June 2005
- Body Parameter (dynamic): Conversation
  - Robert Wechsler, dancer/choreographer
  - Julia Eisele, dancer
  - Stuttgart, June 2005

Primary/Secondary Mappings
- Primary Mapping: controls the dominant sonic feature
- Secondary Mapping: controls a subsidiary sonic feature
- Example: Scene 3 from Brother-Sister Solo
  - Primary mapping: position → position in sound "landscape"
  - Secondary mapping: dynamic → disturbance of drone
  - Secondary mapping: width → loop size/speed of segment within sound file
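A scene, then, is just a bundle of such mappings applied to each incoming frame of motion data. A hypothetical sketch of that dispatch step (parameter names and transforms invented for illustration):

```python
def apply_scene(mappings, motion):
    """Apply every mapping in a scene to the current motion data.
    mappings: {motion_param: (sound_param, transform_fn)}
    motion:   {motion_param: current value}
    Returns {sound_param: value} ready to send to the synthesis engine."""
    return {sound: fn(motion[mp]) for mp, (sound, fn) in mappings.items()}

# Hypothetical scene: position drives the primary mapping, dynamic a secondary one.
scene3 = {
    "position": ("landscape_pos", lambda x: x),
    "dynamic":  ("drone_disturb", lambda x: x * x),  # squared for a gentler onset
}
params = apply_scene(scene3, {"position": 0.25, "dynamic": 0.5})
```

Keeping mappings as data like this makes it easy for a sequencer to step through scenes by swapping dictionaries.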

Some Useful Sound Elements
- Soundfile (trigger playback)
- Low FM Drone (modulation frequency)
- Mid-Range Additive Cluster (frequency deviation of components from norm)
- Soundfile Granulation (position in file, pitch change)
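The "Low FM Drone" element is built on the classic simple FM pair: a carrier sinusoid whose phase is modulated by a second sinusoid, so that raising the modulation frequency or index thickens the spectrum. A stdlib-only rendering sketch (a Max/MSP patch would do this with `cycle~` objects; the specific frequencies below are illustrative):

```python
import math

def fm_pair(carrier_hz, mod_hz, index, dur=0.1, sr=44100):
    """Render a simple FM pair: y[i] = sin(2*pi*fc*t + index*sin(2*pi*fm*t)).
    `index` (in radians) controls sideband strength; a motion parameter
    mapped to mod_hz or index animates the drone's timbre."""
    n = int(dur * sr)
    return [math.sin(2 * math.pi * carrier_hz * i / sr
                     + index * math.sin(2 * math.pi * mod_hz * i / sr))
            for i in range(n)]

# A low drone: 55 Hz carrier, 2:1 modulator ratio, moderate index.
drone = fm_pair(55.0, 110.0, index=2.0)
```

In performance the samples would stream to the audio driver continuously rather than being rendered as a fixed list.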

Sound Element Mappings (1)
A Human Conversation (in progress)
- Scenes 7-8:
  - Dynamic → Granulated Text (playback rate)
- Scene 9:
  - Dynamic (left) → Granulated Text (playback rate)
  - Dynamic (right) → Granulated Text (playback rate)

A Human Conversation
- Robert Wechsler (Palindrome), choreographer/dancer
- J'aime Morrison (CSUN), choreographer/dancer
- Dan Hosken, composer and sound programmer
- Work session, CSUN, June 23, 2006

Sound Element Mappings (2)
Perceivable Bodies (Emily Fernandez)
- Scene 3a:
  - Position → Granulated Text (position in file) [Primary]
  - Width → Granulated Text (grain duration)
  - Dynamic → Low FM Drone (mod frequency)
- Scene 3b:
  - Position → Phase Voc File (position in file) [Primary]
  - Width → Phase Voc File (loop length/rate)
  - Dynamic → Low FM Drone (mod frequency)
- Scene 4:
  - Dynamic → Granulated Noise (density) [Primary]
  - Dynamic → Granulated Noise (position in file)
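The granulation mappings above drive two knobs of the same underlying technique: where in the source file grains are read from (position) and how long each grain lasts (grain duration). A simplified overlap-add sketch, with jitter and pitch change omitted for brevity (this is a generic granulator, not the Max/MSP patch used in the piece):

```python
import math

def granulate(src, position, grain_len, n_grains):
    """Overlap-add Hann-windowed grains read at `position` (0..1) in src.
    Sweeping `position` time-stretches or freezes the source; changing
    `grain_len` alters the texture from smooth to grainy."""
    hop = grain_len // 2                      # 50% overlap between grains
    out = [0.0] * ((n_grains - 1) * hop + grain_len)
    start = int(position * (len(src) - grain_len))
    for g in range(n_grains):
        for i in range(grain_len):
            win = 0.5 - 0.5 * math.cos(2 * math.pi * i / grain_len)  # Hann
            out[g * hop + i] += src[start + i] * win
    return out
```

A real granulator would add random jitter to each grain's start point and resample grains for pitch change, both of which make natural targets for further motion mappings.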

Perceivable Bodies
- Emily Fernandez, choreographer/dancer
- Frieder Weiss, projections and interactive programming
- Dan Hosken, composer and sound programmer
- World premiere at Connecticut College, April 1, 2006

Examples shown can be found:
Full Pieces can be found:
Other Examples of Palindrome's work:

Max/MSP Screenshot

PD Screenshot

Reaktor Screenshots

Eyecon Screenshot

EyesWeb Screenshot

Jitter Screenshot

Cyclops Screenshot