Multimedia Framework and Libraries
Heejune AHN, Embedded Communications Laboratory, Seoul National Univ. of Technology, Fall 2011
Agenda: Intro to Multimedia Frameworks, the FFMPEG Library, Khronos's OpenMAX, Microsoft's DirectX (DirectShow)
1. Multimedia Framework
Why a framework? A framework provides the base structure for component-based development, and it is how many people can work together on a big system. Multimedia systems share common patterns, e.g. while(!stop){ read; decode; render; }
Famous multimedia frameworks: ffmpeg, GStreamer (GNOME), OpenMAX (Khronos Group), and Microsoft's DirectX framework
2. FFMPEG
FFmpeg (http://ffmpeg.org): a free (GNU LGPL) multimedia library by Fabrice Bellard, started in 2000; "ff" stands for "fast forward".
Main parts: libavcodec (video/audio encoders and decoders), libavformat (muxing/demuxing and file formats), and the command-line tools ffmpeg (transcoder), ffplay, and ffserver.
2. FFMPEG : libavcodec structure
AVCodecContext lifecycle: avcodec_register_all, avcodec_find_encoder, avcodec_alloc_context (equivalent to new), avcodec_open (attaches the codec and initializes it), avcodec_close (finishes using the codec).
Encoding: samples / AVFrame -> avcodec_encode_video -> buf (compressed video); avcodec_encode_audio -> buf (compressed audio).
Decoding: buf (compressed video) -> avcodec_decode_video -> AVFrame; buf (compressed audio) -> avcodec_decode_audio -> samples.
AVCodec (essentially a bundle of functions): name, type (audio, video, etc.), id (the standard), init(), encode(), decode(), close().
2. FFMPEG : encoding outline
Encoding (libavcodec/apiexample.c)
avcodec_init();                              // initialize libavcodec
avcodec_register_all();                      // register all codecs
codec = avcodec_find_encoder(CODEC_ID_XXXX); // find the codec first (no use building a context if the codec does not exist)
ctx = avcodec_alloc_context();               // create the codec context (like new)
ctx->...                                     // set coding parameters: frame_rate, bit_rate, tolerance, qscale, picture format, picture resolution, etc.
avcodec_open(ctx, codec);                    // attach the codec to ctx and initialize it
unsigned char comp_buf[BUF_SIZE];
AVFrame *picture;                            // set data[4] and linesize[4] for the YUV420P format
avcodec_encode_video(ctx, comp_buf, size, picture);
avcodec_close(ctx);                          // finish using the codec
av_free(ctx);                                // free the context and its memory
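A minimal end-to-end version of this encoding flow, sketched against the legacy (pre-0.11) libavcodec API used above; error handling is omitted, and the codec ID, resolution, and buffer size are only illustrative values.

#include <libavcodec/avcodec.h>

static void encode_one_frame(void)
{
    avcodec_init();
    avcodec_register_all();

    AVCodec *codec = avcodec_find_encoder(CODEC_ID_MPEG1VIDEO);
    AVCodecContext *ctx = avcodec_alloc_context();
    ctx->bit_rate  = 400000;
    ctx->width     = 352;
    ctx->height    = 288;
    ctx->time_base = (AVRational){1, 25};
    ctx->gop_size  = 10;
    ctx->pix_fmt   = PIX_FMT_YUV420P;
    avcodec_open(ctx, codec);

    /* attach a raw YUV420P buffer to the frame's data[]/linesize[] */
    AVFrame *picture = avcodec_alloc_frame();
    static uint8_t yuv[352 * 288 * 3 / 2];
    avpicture_fill((AVPicture *)picture, yuv, PIX_FMT_YUV420P, ctx->width, ctx->height);

    uint8_t outbuf[100000];
    int out_size = avcodec_encode_video(ctx, outbuf, sizeof(outbuf), picture);
    /* write out_size bytes of outbuf to a file or packetizer here */

    avcodec_close(ctx);
    av_free(ctx);
    av_free(picture);
}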
2. FFMPEG : decoding outline
Decoding (libavcodec/apiexample.c)
avcodec_init();                              // initialize libavcodec
avcodec_register_all();                      // register all codecs
codec = avcodec_find_decoder(CODEC_ID_XXXX); // find the codec first (no use building a context if the codec does not exist)
ctx = avcodec_alloc_context();               // create the codec context
ctx->...                                     // set coding parameters: frame_rate, bit_rate, tolerance, qscale, picture format, picture resolution, etc.
avcodec_open(ctx, codec);                    // attach the codec to ctx
AVPacket pkt;                                // then read compressed data into it (e.g. with av_read_frame())
AVFrame *frame;                              // the decoder fills data[4] and linesize[4] for the YUV420P format
avcodec_decode_video(ctx, frame, &gotFrame, pkt.data, pkt.size);
if (gotFrame) display or process the frame
avcodec_close(ctx);                          // finish using the codec
av_free(ctx);                                // free the context and its memory
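A corresponding decoding sketch with the same legacy API; it assumes one complete compressed frame has already been read into buf (e.g. by av_read_frame()), and the codec ID is again illustrative.

#include <libavcodec/avcodec.h>

static void decode_one_frame(uint8_t *buf, int buf_size)
{
    avcodec_init();
    avcodec_register_all();

    AVCodec *codec = avcodec_find_decoder(CODEC_ID_MPEG1VIDEO);
    AVCodecContext *ctx = avcodec_alloc_context();
    avcodec_open(ctx, codec);

    AVFrame *frame = avcodec_alloc_frame();
    int got_frame = 0;
    avcodec_decode_video(ctx, frame, &got_frame, buf, buf_size);
    if (got_frame) {
        /* frame->data[0..2] / frame->linesize[0..2] now hold the YUV420P planes;
           hand them to the display or further processing here */
    }

    avcodec_close(ctx);
    av_free(ctx);
    av_free(frame);
}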
2. FFMPEG: libavformat structure
AVFormatContext creation: av_open_input_file (new, with an input file, optionally taking AVFormatParameters), av_alloc_format_context (new, without an input file), av_guess_format (find the appropriate format by name).
AVFormatContext members: file, iformat (AVInputFormat), oformat (AVOutputFormat), streams[] (each AVStream holds an AVCodecContext *codec).
AVInputFormat: name, read_probe, read_header, read_packet, read_seek2, read_timestamp, read_play, read_pause, read_close; reading produces AVPackets via av_read_packet.
AVOutputFormat: name, mime_type, write_header, write_packet, write_trailer; writing consumes AVPackets via av_write_header, av_write_frame / av_interleaved_write_frame; av_dump_format prints the format.
AVPacket: data, pts/dts, stream_id.
2. FFMPEG : demuxing outline
Demuxing and decoding (cf. ffplay.c)
1. register muxers/demuxers
2. create instances
3. set up codecs for each stream
4. read stream packets and decode
5. destroy the instances
av_register_all();
av_open_input_file(&ctx, filename, NULL, 0, NULL);
st = ctx->streams[sidx];
codecctx = st->codec;
codec = avcodec_find_decoder(CODEC_ID_XXX);
avcodec_open(codecctx, codec);
av_read_frame(ctx, &pkt);
avcodec_decode_video(codecctx, &frame, &gotframe, pkt.data, pkt.size);
(use the decoded frame)
av_close_input_file(ctx);
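A hedged sketch of this demux-and-decode loop against the same legacy libavformat/libavcodec API; picking stream 0 as the video stream is a simplification (a real player searches streams[] by codec_type), and error handling is omitted.

#include <libavformat/avformat.h>

static void demux_and_decode(const char *filename)
{
    av_register_all();

    AVFormatContext *ctx = NULL;
    av_open_input_file(&ctx, filename, NULL, 0, NULL);
    av_find_stream_info(ctx);                      /* fill in stream parameters */

    int sidx = 0;                                  /* assume stream 0 is the video stream */
    AVCodecContext *codecctx = ctx->streams[sidx]->codec;
    AVCodec *codec = avcodec_find_decoder(codecctx->codec_id);
    avcodec_open(codecctx, codec);

    AVFrame *frame = avcodec_alloc_frame();
    AVPacket pkt;
    while (av_read_frame(ctx, &pkt) >= 0) {        /* one demuxed packet per iteration */
        int got = 0;
        if (pkt.stream_index == sidx)
            avcodec_decode_video(codecctx, frame, &got, pkt.data, pkt.size);
        if (got) {
            /* render or otherwise process the decoded frame here */
        }
        av_free_packet(&pkt);
    }

    avcodec_close(codecctx);
    av_free(frame);
    av_close_input_file(ctx);
}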
2. FFMPEG : muxing outline
Muxing (libavformat/output-example.c)
1. register muxers/demuxers
2. create instances
3. create and configure streams
4. write stream packets
5. destroy the instances
av_register_all();
fmt = av_guess_format(fmtname, filename, mimetype);
ctx = avformat_alloc_context();
ctx->oformat = fmt;
st = av_new_stream(ctx, 0);
st->codec->xxx = xxx;          // codec_id, codec_type, bit_rate, width, height, time_base, ...
av_set_parameters(ctx, NULL);
(prepare codecs if required)
av_write_header(ctx);
// get pkt (e.g. by encoding a picture)
av_interleaved_write_frame(ctx, &pkt);
av_write_trailer(ctx);
av_free(ctx);
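A hedged sketch of the muxing path against roughly the FFmpeg 0.6-era API the outline uses; the output format, codec parameters, and packet contents are placeholders, and a real program would fill each AVPacket from an encoder before writing it.

#include <libavformat/avformat.h>

static void mux_packets(const char *filename)
{
    av_register_all();

    AVOutputFormat *fmt = av_guess_format(NULL, filename, NULL); /* pick format by file extension */
    AVFormatContext *ctx = avformat_alloc_context();
    ctx->oformat = fmt;

    AVStream *st = av_new_stream(ctx, 0);
    st->codec->codec_id   = fmt->video_codec;
    st->codec->codec_type = AVMEDIA_TYPE_VIDEO;
    st->codec->bit_rate   = 400000;
    st->codec->width      = 352;
    st->codec->height     = 288;
    st->codec->time_base  = (AVRational){1, 25};

    av_set_parameters(ctx, NULL);
    url_fopen(&ctx->pb, filename, URL_WRONLY);     /* open the output file */

    av_write_header(ctx);
    AVPacket pkt;
    av_init_packet(&pkt);
    /* in a real muxing loop, fill pkt (data, size, pts, stream_index) from an encoder, then: */
    if (pkt.size > 0)
        av_interleaved_write_frame(ctx, &pkt);
    av_write_trailer(ctx);

    url_fclose(ctx->pb);
    av_free(ctx);
}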
3. OpenMAX: History, Abstraction Layer and Interface API
History: developed by the Khronos Group Inc. (famous for OpenGL, OpenCL, etc.); releases: DL 1.02 (2007), IL (2008), AL 1.1 (2010)
Abstraction layer and interface API: portability across different HW and application environments; C language, but with OOP concepts
OpenMAX AL, IL, DL
AL (API for application programmers): video/audio playback and recording (play, stop, pause, record); camera control, image capture and display; metadata extraction and insertion, etc.
IL (API for framework developers): core framework components (source, codec, sink, filter, splitter, mixer, etc.); implemented by codec vendors, e.g. PacketVideo; Bellagio reference implementation (STMicroelectronics) on Linux
DL (API for component developers): filter, transform, scaling, FFT, etc.; domains (AC: audio codecs, IC: image codecs, IP: image processing, SP: signal processing, VC: video codecs); implemented by HW vendors: ARM, Intel, maybe NVIDIA
OpenMAX IL: 3 role players and the APIs
IL Client: the IL API user (issues commands, receives event callbacks)
IL Core: the IL engine (loads, configures, and connects components)
IL Components: the function blocks (HW or SW)
APIs: the Core API and the Component API
Media graph, Core and Core API, Control & Data API
Core and Core API: component loading/unloading and building a graph
Control & Data API: commands and data flow (not routed through the Core, for speed)
Example of a media graph (figure not shown)
OpenMAX IL : Action Flow
Component life cycle: the IL client controls the state of each component and its dynamic/static resources.
Typical flow: (1) register the component, (2) OMX_Init, (3) load the component (Load_Cmp), (4) loading, (5) connect.
(State-diagram annotations: has static resources, has context, erroneous.)
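As a rough illustration of the client-side calls this life cycle implies (a sketch only: the component name is a hypothetical Bellagio-style name, error checks are dropped, and the waits for state-transition callbacks that a real IL client needs are omitted).

#include <OMX_Core.h>
#include <OMX_Component.h>

/* Callbacks through which the component reports events and returns buffers. */
static OMX_ERRORTYPE on_event(OMX_HANDLETYPE h, OMX_PTR app, OMX_EVENTTYPE ev,
                              OMX_U32 d1, OMX_U32 d2, OMX_PTR extra)
{ return OMX_ErrorNone; }
static OMX_ERRORTYPE on_empty_done(OMX_HANDLETYPE h, OMX_PTR app, OMX_BUFFERHEADERTYPE *b)
{ return OMX_ErrorNone; }
static OMX_ERRORTYPE on_fill_done(OMX_HANDLETYPE h, OMX_PTR app, OMX_BUFFERHEADERTYPE *b)
{ return OMX_ErrorNone; }

static OMX_CALLBACKTYPE cb = { on_event, on_empty_done, on_fill_done };

static void load_and_start(void)
{
    OMX_HANDLETYPE dec;
    OMX_Init();                                               /* initialize the IL core */
    OMX_GetHandle(&dec, (OMX_STRING)"OMX.st.video_decoder.avc", /* load a component (name is illustrative) */
                  NULL, &cb);
    /* ports could be connected to other components here with OMX_SetupTunnel() */
    OMX_SendCommand(dec, OMX_CommandStateSet, OMX_StateIdle, NULL);      /* acquire resources */
    OMX_SendCommand(dec, OMX_CommandStateSet, OMX_StateExecuting, NULL); /* start processing */
    /* ... exchange buffers, then tear down ... */
    OMX_SendCommand(dec, OMX_CommandStateSet, OMX_StateIdle, NULL);
    OMX_SendCommand(dec, OMX_CommandStateSet, OMX_StateLoaded, NULL);
    OMX_FreeHandle(dec);
    OMX_Deinit();
}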
OpenMAX IL : Communication
Ports: format (video/audio/image/other), direction (input, output)
Communication mechanisms: non-tunneling, tunneling, proprietary
Tunneling: UseBuffer call; supplier / non-supplier ports; allocator / shared ports
Flow control: Out <= In: OMX_FillThisBuffer; Out => In: OMX_EmptyThisBuffer
Component profiles: Base (easy to implement), Interoperability (adds tunneling support)
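In the non-tunneling case the IL client itself moves buffers between ports. A sketch of that exchange, assuming the component handle comes from OMX_GetHandle() as above and that port 0 is input and port 1 is output; in real code the port indices and buffer sizes are queried with OMX_GetParameter, and buffers are allocated while the component moves from Loaded to Idle.

#include <OMX_Core.h>
#include <OMX_Component.h>

/* Push one compressed buffer into the input port and request one decoded
   buffer from the output port; completion arrives via the EmptyBufferDone /
   FillBufferDone callbacks registered at OMX_GetHandle() time. */
static void exchange_buffers(OMX_HANDLETYPE comp)
{
    OMX_BUFFERHEADERTYPE *in_buf = NULL, *out_buf = NULL;

    OMX_AllocateBuffer(comp, &in_buf, 0, NULL, 64 * 1024);   /* input port buffer  */
    OMX_AllocateBuffer(comp, &out_buf, 1, NULL, 64 * 1024);  /* output port buffer */

    /* fill in_buf->pBuffer with compressed data and set nFilledLen, then: */
    in_buf->nFilledLen = 0;                  /* placeholder: no real data here */
    OMX_EmptyThisBuffer(comp, in_buf);       /* component consumes the input buffer   */
    OMX_FillThisBuffer(comp, out_buf);       /* component produces into the output buffer */
}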
OpenMAX IL : Thread Control
In-context vs. out-of-context calls (i.e., synchronous vs. asynchronous): a callback may run before the call returns to the client (in-context) or afterwards, from another thread (out-of-context). Multi-threading/multi-processing is used for multi-core CPUs and HW accelerators.
OpenMAX IL : Integration
Integration with existing media frameworks: as a GStreamer plugin; in Android's PV (PacketVideo) OpenCore
4. DirectShow
History: Video for Windows (VfW): 1991, Windows 3.1; ActiveMovie in 1995 (code name Quartz), Windows 95, COM-based; DirectShow in 1996, simply renamed, part of the DirectX family; DirectX 8.0 in 2000, DirectX 9.0 in 2002, DirectX 11.0 in 2009; now included in the MS Platform SDK (available for download)
DShow: a COM (C++) library and runtime for multimedia processing; a filter graph manager and filters; a streaming (data-flow) process
DirectShow Architecture
Filters: COM objects with interfaces (e.g. IBaseFilter)
Pins and connections: input and output pins, with connections between output and input pins
Filter types: source, transform (splitter, mixer), and renderer filters
Example graph: SourceFilter -> TransformFilter (splitter) -> TransformFilter -> RendererFilter(s)
Graph Editor (GraphEdit): the DShow built-in GUI filter-graph IDE
It can build graphs and inspect the filters of a running process. Filters are registered in the system registry.
Filter Graph Building: IGraphBuilder, Connection, Threads
IGraphBuilder: create a filter instance (using its CLSID), connect, and control it; IMediaControl, IMediaEventEx, IMediaPosition interfaces; in fact, these are different interfaces on one instance.
Connection requirements: pin type (input and output) => media type (audio, video, samples) => transport mechanism (i.e., pull or push, and buffer allocation); intelligent or manual: the Filter Graph Manager can insert the most suitable filters between two filters.
Threads: one thread per filter graph; CAMThread: a filter can have its own thread by inheriting from it, e.g. a stream is an auto-threaded pin (run, stop, pause, etc.)
Comments for DShow Developers
Must have a strong understanding of COM: all the detailed code is COM instance and interface handling.
Most Simple Code Snippet
Initialize the COM library:
HRESULT hr = CoInitializeEx(NULL, COINIT_APARTMENTTHREADED);
Get a GraphBuilder:
IGraphBuilder *pGraph = NULL;
hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER, IID_IGraphBuilder, (void **)&pGraph);
Get the control interfaces:
hr = pGraph->QueryInterface(IID_IMediaControl, (void **)&pControl);
hr = pGraph->QueryInterface(IID_IMediaEvent, (void **)&pEvent);
Render a file:
hr = pGraph->RenderFile(wfilename, NULL);
Or add filters manually:
hr = pGraph->AddSourceFilter(wFileName, wFileName, &pInputFileFilter);
hr = CoCreateInstance(CLSID_DSoundRender, NULL, CLSCTX_INPROC_SERVER, IID_IBaseFilter, (void **)&pDSoundRenderer);
hr = pGraph->AddFilter(pDSoundRenderer, L"Audio Renderer");
Most Simple Code Snippet (Cont'd)
Connect the pins:
pFileOut = GetPin(pInputFileFilter, PINDIR_OUTPUT);   // GetPin is a helper that enumerates a filter's pins by direction
pWAVIn = GetPin(pDSoundRenderer, PINDIR_INPUT);
hr = pGraph->Connect(pFileOut, pWAVIn);
Control:
hr = pControl->Run();
Release the COM instances:
pFileOut->Release(); pWAVIn->Release(); pInputFileFilter->Release(); pDSoundRenderer->Release(); pControl->Release(); pEvent->Release(); pGraph->Release();
Release COM:
CoUninitialize();