Introduction To XAudio2 © Allan C. Milne Abertay University v14.1.15.


3 Agenda.
– The XAudio2 pipeline.
– Playing a sound.
– XAUDIO2_BUFFER.
– Sound elements.

4 General Audio Pipeline. Original sound → microphone → ADC → .wav file → pre-production → audio program code → soundcard → DAC → amp → speakers → sound.

5 XAudio2 Pipeline. .wav file → XAudio2 buffer / wave format → source voice → submix voice → mastering voice → soundcard.

6 Source Voices. Operate on audio data provided by the client program. Send output to
– 1 or more submix voices; and/or
– the mastering voice.

7 Submix Voices. Mix the audio from all voices feeding them. Operate on the result of this mix. Send output to
– 1 or more other submix voices; and/or
– the mastering voice.

8 Mastering Voice. Mixes the audio from all voices feeding it. Operates on the result of this mix. Writes the audio data to an audio device. There will normally be only one mastering voice.

9 Audio Processing Graph. The voice objects with their connections form an audio processing graph. Source voice objects are the entry points into this graph. The mastering voice is the exit from the graph to the audio device. The XAudio2 engine processes and manages this graph.
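As a hedged sketch of how a small graph might be built in code (assuming the engine pointer gEngine and wave format gWFmt created in the following slides; the submix channel count, sample rate and the variable names gSubmix / routedSource are illustrative, not from the slides):
IXAudio2SubmixVoice *gSubmix;
// A submix voice with 2 channels at 44100 Hz (illustrative values).
gEngine->CreateSubmixVoice (&gSubmix, 2, 44100);
// Route a source voice into the submix rather than the default mastering voice.
XAUDIO2_SEND_DESCRIPTOR send = { 0, gSubmix };
XAUDIO2_VOICE_SENDS sends = { 1, &send };
IXAudio2SourceVoice *routedSource;
gEngine->CreateSourceVoice (&routedSource, &gWFmt, 0, XAUDIO2_DEFAULT_FREQ_RATIO, NULL, &sends);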

10 Playing A Sound. The following slides go through the coding steps for playing a sound from a .wav file. In summary we have to
– create an XAudio2 engine;
– create a mastering voice;
– create a source voice;
– submit the sound sample.

11 Setting Up.
#include <xaudio2.h>
#include "PCMWave.hpp"
using AllanMilne::Audio::PCMWave;
using AllanMilne::Audio::WaveFmt;
Also requires the relevant libraries in the project set-up. PCMWave is my encapsulation.

12 Create The XAudio2 Engine.
IXAudio2 *gEngine;
… … …
CoInitializeEx( NULL, COINIT_MULTITHREADED );
XAudio2Create( &gEngine );
The managing processor for the audio graph. All XAudio2 function calls return an HRESULT value that should be checked for success. The CoInitializeEx call allows XAudio2 to run in a separate thread. The engine is the only COM object in XAudio2.
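A minimal sketch of that HRESULT checking (what to do on failure is up to the application):
HRESULT hr = XAudio2Create( &gEngine );
if (FAILED (hr)) {
    // Report the failure and abandon audio initialisation.
}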

13 Create The Mastering Voice.
IXAudio2MasteringVoice *gMaster;
… … …
gEngine->CreateMasteringVoice( &gMaster );
The final rendering component, connected to the audio device. Created by the XAudio2 engine.
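For reference, CreateMasteringVoice also takes optional parameters; the single-argument call above relies on their defaults, roughly equivalent to this sketch:
gEngine->CreateMasteringVoice( &gMaster,
    XAUDIO2_DEFAULT_CHANNELS,      // use the audio device's channel count.
    XAUDIO2_DEFAULT_SAMPLERATE );  // use the audio device's sample rate.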

14 Creating A Source Voice. To create a source voice we need a wave format struct that defines the attributes of the wave audio data. To create this wave format struct we need to load the .wav file and extract the attributes from its fmt chunk. Therefore we need to
– load the .wav file;
– create a WAVEFORMATEX struct;
– create the source voice object.

15 Load A .wav File.
PCMWave *gWave;
… … …
gWave = new PCMWave ("MySound.wav");
if (gWave->GetStatus() != PCMWave::OK) { … … … }
My own class to wrap file loading functionality. Creating a PCMWave object reads the fmt and data chunks. Check explicitly for success; since this is my own encapsulation, no HRESULT value is returned.

16 Define Wave Format.
WAVEFORMATEX gWFmt;
… … …
memset ((void*)&gWFmt, 0, sizeof (WAVEFORMATEX));
memcpy_s ((void*)&gWFmt, sizeof (WaveFmt), (void*)&(gWave->GetWaveFormat()), sizeof (WaveFmt));
WAVEFORMATEX is a Windows struct. Defines the attributes of the audio data. Is copied from the PCMWave WaveFmt struct.

17 Create A Source Voice.
IXAudio2SourceVoice *gSource;
… … …
gEngine->CreateSourceVoice( &gSource, &gWFmt);
gSource->Start();
Managed by the XAudio2 engine. The wave format defines the format of all sound samples submitted to it. Calling it with only 2 arguments routes the source voice directly to the mastering voice. Note that Start() activates the source voice, but since we have not submitted any audio data to it nothing will be played yet.
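For reference, the two-argument call relies on the remaining default parameters of CreateSourceVoice; written out in full it is roughly equivalent to this sketch:
gEngine->CreateSourceVoice( &gSource, &gWFmt,
    0,                            // no flags.
    XAUDIO2_DEFAULT_FREQ_RATIO,   // maximum frequency (pitch) ratio.
    NULL,                         // no voice callback.
    NULL,                         // no send list, so output goes to the mastering voice.
    NULL );                       // no effect chain.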

18 Submit The Sound Sample. To play a sound sample we must submit it to a source voice. We need an XAUDIO2_BUFFER that defines the audio data and how it is to be handled. Therefore we need to
– create an XAUDIO2_BUFFER;
– submit the buffer to the source voice.

19 Define An XAudio2 Buffer.
XAUDIO2_BUFFER gXABuffer;
… … …
memset ((void*)&gXABuffer, 0, sizeof (XAUDIO2_BUFFER));
gXABuffer.AudioBytes = gWave->GetDataSize ();
gXABuffer.pAudioData = (BYTE*)(gWave->GetWaveData ());
Used to define the audio data buffer and its characteristics. Here we only set the audio data size and the audio data pointer; all other fields are set to 0. Note that pAudioData points to the buffer held in the PCMWave object.

20 Play The Sound.
gSource->SubmitSourceBuffer (&gXABuffer);
Submits a sound sample to the audio graph. The sound sample will be played only once. Multiple calls to SubmitSourceBuffer will queue the requests. This is an asynchronous operation.
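Because the operation is asynchronous, a program that tears everything down immediately may never hear the sound. A minimal sketch of polling the voice until its buffer queue has emptied (an IXAudio2VoiceCallback is the non-blocking alternative):
XAUDIO2_VOICE_STATE state;
gSource->GetState (&state);
while (state.BuffersQueued > 0) {
    Sleep (10);                 // crude busy-wait, for illustration only.
    gSource->GetState (&state);
}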

21 Tidying Up.
gSource->Stop ();
gEngine->Release();
CoUninitialize();
delete gWave;
Stop() de-activates the source voice in the audio graph. Only the engine is released since it is the only COM object. The PCMWave object is deleted as it is an object of my own class.
– Note this will free the audio data buffer.
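The slides rely on releasing the engine to clean up the voices; voices are not COM objects, so if you want to remove them explicitly first, a hedged sketch using DestroyVoice() would be:
gSource->DestroyVoice ();   // destroy sending voices before the voices they send to.
gMaster->DestroyVoice ();
gEngine->Release();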

22 XAUDIO2_BUFFER. XAUDIO2_BUFFER tells the source voice what the audio data is and how it should be handled. It defines
– the audio data samples;
– where to begin and stop playing within the audio data;
– whether to loop, what to loop, and how many times.

23 XAUDIO2_BUFFER fields.
.AudioBytes // number of bytes of audio data.
.pAudioData // pointer to the audio data samples.
.Flags // almost always 0.
.PlayBegin // first sample in the buffer that should be played.
.PlayLength // number of samples to play; 0 = entire buffer (PlayBegin must also be 0).
.LoopBegin // first sample to be looped; must be less than PlayBegin + PlayLength; may be less than PlayBegin.
.LoopLength // number of samples in the loop; 0 = the entire sample; LoopBegin + LoopLength must be greater than PlayBegin and no greater than PlayBegin + PlayLength.
.LoopCount // number of times to loop; XAUDIO2_LOOP_INFINITE loops forever; if 0 then LoopBegin and LoopLength must be 0.
.pContext // pointer to a context value passed back to the client in callbacks.
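As an illustrative sketch of the loop fields, extending the buffer set up on slide 19 to loop the whole sound three times (the values here are assumptions, not from the slides):
gXABuffer.LoopBegin  = 0;   // loop from the first sample...
gXABuffer.LoopLength = 0;   // ...over the whole buffer (0 = entire buffer when LoopCount > 0)...
gXABuffer.LoopCount  = 3;   // ...looping it three times.
// Or loop until ExitLoop()/Stop() is called:
// gXABuffer.LoopCount = XAUDIO2_LOOP_INFINITE;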

24 Sound Elements. We will now summarise the elements that define a sound. These elements include bare data, XAudio2 components and my framework components. The main objects are of type:
– PCMWave.
– XAUDIO2_BUFFER.
– WAVEFORMATEX.
– IXAudio2SourceVoice.

25 Sound Elements (summary).
– PCMWave: string filename; Status status; (enum) char *sample_data; WaveFmt format; …
– Sample data: 001010100101… (just raw data).
– XAUDIO2_BUFFER: … info on looping and the sample data.
– WAVEFORMATEX: … (look it up!).
– IXAudio2SourceVoice: … Submit( );
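Pulling these elements together, a consolidated sketch of the whole sequence from the earlier slides (using the author's PCMWave wrapper and the example file name "MySound.wav" from slide 15; all HRESULT checks omitted for brevity):
#include <windows.h>
#include <string.h>
#include <xaudio2.h>
#include "PCMWave.hpp"
using AllanMilne::Audio::PCMWave;
using AllanMilne::Audio::WaveFmt;

int main ()
{
    IXAudio2 *engine;
    IXAudio2MasteringVoice *master;
    IXAudio2SourceVoice *source;

    CoInitializeEx( NULL, COINIT_MULTITHREADED );
    XAudio2Create( &engine );
    engine->CreateMasteringVoice( &master );

    PCMWave *wave = new PCMWave ("MySound.wav");
    if (wave->GetStatus() != PCMWave::OK) return 1;

    WAVEFORMATEX wfmt;
    memset ((void*)&wfmt, 0, sizeof (WAVEFORMATEX));
    memcpy_s ((void*)&wfmt, sizeof (WaveFmt), (void*)&(wave->GetWaveFormat()), sizeof (WaveFmt));

    engine->CreateSourceVoice( &source, &wfmt );
    source->Start();

    XAUDIO2_BUFFER buffer;
    memset ((void*)&buffer, 0, sizeof (XAUDIO2_BUFFER));
    buffer.AudioBytes = wave->GetDataSize ();
    buffer.pAudioData = (BYTE*)(wave->GetWaveData ());
    source->SubmitSourceBuffer (&buffer);

    // ... wait until playback has finished (see slide 20) ...

    source->Stop ();
    engine->Release();
    CoUninitialize();
    delete wave;
    return 0;
}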

