MP Design and Implementation. CS414: Multimedia Systems. Instructor: Klara Nahrstedt. April 13, 2012.


Covered Aspects of Multimedia in CS414 MPs
[Overview diagram: image/video and audio capture, information representation, compression/processing, media server storage, transmission, and audio/video playback/perception on the client side.]

MP1: AV Capture and Playback
[Diagram: image/video capture, audio capture, information representation, compression/processing, and audio/video perception/playback.]

MP1: Video Capture and Store in a File

Pipeline: Video Source (v4l2src) -> Video Filter (capsfilter) -> Media Encoder -> Video Muxer -> Video Sink (filesink)

Snippet:
    final Pipeline pipe = new Pipeline("pipeline");
    final Element videosrc = ElementFactory.make("v4l2src", "source");
    final Element videofilter = ElementFactory.make("capsfilter", "flt");
    videofilter.setCaps(Caps.fromString("video/x-raw-yuv, width=640, height=480, framerate=10/1"));
    final Element formatConverter = ElementFactory.make("ffmpegcolorspace", "formatConverter");
    final Element encoder = ElementFactory.make("ffenc_mpeg4", "encoder");
    final Element muxer = ElementFactory.make("ffmux_mpeg", "muxer");
    final FileSink filesink = (FileSink) ElementFactory.make("filesink", "filesink");
    filesink.setLocation("capture01.avi");

    // The muxer sits between the encoder and the file sink, as in the diagram above.
    pipe.addMany(videosrc, formatConverter, videofilter, encoder, muxer, filesink);
    Element.linkMany(videosrc, formatConverter, videofilter, encoder, muxer, filesink);
    pipe.setState(State.PLAYING);
    Gst.main();
    pipe.setState(State.NULL);

Video Capture and Store in a Queue

Pipeline: Video Source (v4l2src) -> Video Filter (capsfilter) -> Video Rate (videorate) -> Rate Filter (capsfilter) -> Media Encoder -> Video Muxer -> App Sink

Snippet:
    final Element videosrc = ElementFactory.make("v4l2src", "source");
    final Element videofilter = ElementFactory.make("capsfilter", "flt");
    videofilter.setCaps(Caps.fromString(String.format(
        "video/x-raw-yuv, width=%s, height=%s, bpp=24, depth=16, framerate=%s/1",
        WIDTH, HEIGHT, CAPTURERATE)));
    final Element videorate = ElementFactory.make("videorate", "videorate");
    final Element ratefilter = ElementFactory.make("capsfilter", "rateFilter");
    ratefilter.setCaps(Caps.fromString(String.format("video/x-raw-yuv, framerate=%s/1", FRAMERATE)));
    final Element encoder = ElementFactory.make("theoraenc", "encoder");
    final Element muxer = ElementFactory.make("oggmux", "muxer");
    final AppSink appsink = (AppSink) ElementFactory.make("appsink", "appsink");

Gstreamer-java AppSink

A callback on the AppSink element pulls encoded buffers out of the pipeline and places them, as ApplicationPacket objects, into an ArrayBlockingQueue.

Snippet:
    appsink.set("emit-signals", true);
    appsink.setSync(false);
    appsink.connect(new AppSink.NEW_BUFFER() {
        public void newBuffer(AppSink as) {
            Buffer buffer = as.getLastBuffer();
            ByteBuffer buf = buffer.getByteBuffer();
            byte[] b = new byte[buf.remaining()];
            buf.get(b);
            int timestamp = (int) buffer.getTimestamp().toMillis();
            ApplicationPacket packet = new ApplicationPacket(b, timestamp);
            Queue.enQueue(packet);
        }
    });
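The ApplicationPacket type and the shared queue are not shown on the slides; a minimal sketch, assuming a simple byte-array/timestamp pair and a bounded java.util.concurrent.ArrayBlockingQueue as the hand-off between the capture callback and the sender thread (all names are illustrative):

Sketch:
    import java.util.concurrent.ArrayBlockingQueue;

    // Hypothetical frame container handed from the GStreamer callback to the sender thread.
    public class ApplicationPacket {
        private final byte[] data;      // encoded frame bytes pulled from the AppSink buffer
        private final int timestamp;    // capture timestamp in milliseconds

        public ApplicationPacket(byte[] data, int timestamp) {
            this.data = data;
            this.timestamp = timestamp;
        }

        public byte[] getData() { return data; }
        public int getTimestamp() { return timestamp; }
    }

    // Hypothetical bounded queue shared by the producer (AppSink callback) and the consumer.
    public class Queue {
        private static final ArrayBlockingQueue<ApplicationPacket> FRAMES =
                new ArrayBlockingQueue<ApplicationPacket>(100);   // capacity is an arbitrary choice

        public static void enQueue(ApplicationPacket p) {
            FRAMES.offer(p);      // drops the frame if the queue is full
        }

        public static ApplicationPacket dequeue() {
            return FRAMES.poll(); // returns null when empty, as the AppSrc callback below expects
        }
    }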

MP1: Video Playback from a File

Pipeline: Media Source (filesrc) -> Queue -> DecodeBin (demuxer + decoder) -> Media Sink (video component)

Snippet:
    Pipeline pipe = new Pipeline("Decode pipeline");
    Element videosrc = ElementFactory.make("filesrc", "filesrc");
    videosrc.set("location", "Avatar.avi");
    DecodeBin2 decodeBin = new DecodeBin2("Decode Bin");
    Element decodeQueue = ElementFactory.make("queue", "Decode Queue");
    final VideoComponent videoComponent = new VideoComponent();

    pipe.addMany(videosrc, decodeQueue, decodeBin);
    Element.linkMany(videosrc, decodeQueue, decodeBin);

    // DecodeBin creates its source pads dynamically; link the video pad in a callback.
    decodeBin.connect(new DecodeBin2.NEW_DECODE_PAD() {
        public void newDecodePad(DecodeBin2 elem, Pad pad, boolean last) {
            Caps caps = pad.getCaps();
            Structure struct = caps.getStructure(0);
            if (struct.getName().startsWith("video/")) {
                pad.link(videoComponent.getElement().getStaticPad("sink"));
            }
        }
    });

Video Playback from a Queue

Pipeline: App Source (appsrc) -> Decode Bin (decodebin2) -> Video Sink (video component)

Snippet:
    final Pipeline pipe = new Pipeline("Playback pipeline");
    final AppSrc appsrc = (AppSrc) ElementFactory.make("appsrc", "appsrc");
    appsrc.setCaps(Caps.fromString(String.format(
        "video/x-raw-yuv, width=%s, height=%s, bpp=24, depth=16, framerate=%s/1",
        WIDTH, HEIGHT, CAPTURERATE)));
    DecodeBin2 decodeBin = new DecodeBin2("decode");
    final VideoComponent videoComponent = new VideoComponent();

    pipe.addMany(appsrc, decodeBin);
    Element.linkMany(appsrc, decodeBin);

    decodeBin.connect(new DecodeBin2.NEW_DECODE_PAD() {
        public void newDecodePad(DecodeBin2 elem, Pad pad, boolean last) {
            Caps caps = pad.getCaps();
            Structure struct = caps.getStructure(0);
            if (struct.getName().startsWith("video/")) {
                pad.link(videoComponent.getElement().getStaticPad("sink"));
            }
        }
    });

Gstreamer-java AppSrc

A callback on the AppSrc element takes ApplicationPacket objects out of the ArrayBlockingQueue and pushes their data into the pipeline whenever the source needs more data.

Snippet:
    appsrc.set("emit-signals", true);
    appsrc.setSync(false);
    appsrc.connect(new AppSrc.NEED_DATA() {
        public void needData(AppSrc as, int size) {
            ApplicationPacket packet = Queue.dequeue();
            while (packet == null) {
                System.out.println("Null data, trying again");
                packet = Queue.dequeue();
            }
            byte[] data = packet.getData();
            Buffer buffer = new Buffer(data.length);
            buffer.getByteBuffer().put(data);
            as.pushBuffer(buffer);
        }
    });

MP2 Design Space
[Overview diagram: the same capture, compression/processing, storage, transmission, and playback stages as in the MP1 figure; MP2 adds transmission between server and client.]

MP2: Audio/Video Streaming

Server:
- Captures audio and video at a fixed rate
- Video: 30 fps, Audio: 8000 Hz

Client:
- Requests video and/or audio over a control channel; media arrives on a data channel
- Works in two modes: Active Mode and Passive Mode

Active Mode: media types audio and video; video rate 15 to 25 fps; audio rate 8000 Hz; video resolution 640x480.
Passive Mode: media type video only; video rate 10 fps; video resolution 320x240.

A sketch of a possible control-channel request is shown below.
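The slides do not fix a wire format for the control channel; a minimal sketch, assuming a plain-text request sent over a TCP socket with hypothetical field names and port:

Sketch:
    import java.io.PrintWriter;
    import java.net.Socket;

    // Hypothetical control-channel request: mode, requested video rate/resolution, audio on/off.
    public class ControlRequest {
        public static void sendRequest(String serverHost, int controlPort, String mode,
                                       int fps, String resolution, boolean audio) throws Exception {
            Socket control = new Socket(serverHost, controlPort);
            PrintWriter out = new PrintWriter(control.getOutputStream(), true);
            // e.g. "MODE=ACTIVE;VIDEO_FPS=20;RESOLUTION=640x480;AUDIO=true"
            out.println("MODE=" + mode + ";VIDEO_FPS=" + fps
                    + ";RESOLUTION=" + resolution + ";AUDIO=" + audio);
            control.close();
        }
    }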

Video Data Flow Architecture: Server
[Diagram: Video Source -> Video Filter -> Media Encoder -> Video Muxer -> App Sink -> Rate Control -> Network. Each frame is wrapped in an application-level RTP-style packet (header with ID, media type, timestamp, plus payload), which is then segmented into UDP packets for transmission.]
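The slides only name the fields of the RTP-style packet; a minimal serialization sketch, assuming fixed-width binary fields (packet ID, media type, timestamp) in front of the encoded frame, could be:

Sketch:
    import java.nio.ByteBuffer;

    // Hypothetical application-level RTP-style packet: small header followed by the encoded frame.
    public class RtpStylePacket {
        public static final byte MEDIA_VIDEO = 0;
        public static final byte MEDIA_AUDIO = 1;

        public static byte[] serialize(int packetId, byte mediaType, int timestampMs, byte[] payload) {
            ByteBuffer buf = ByteBuffer.allocate(4 + 1 + 4 + payload.length);
            buf.putInt(packetId);        // ID
            buf.put(mediaType);          // media type
            buf.putInt(timestampMs);     // capture timestamp in milliseconds
            buf.put(payload);            // encoded frame taken from the AppSink queue
            return buf.array();          // this byte array is what gets segmented into UDP packets
        }
    }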

UDP Communication: Server

Each RTP-style packet (ID, media type, timestamp, payload) is segmented into UDP packets. Every UDP payload carries a 16-byte header: PID (packet ID), SEQ (segment sequence number), SIZE (total frame size), and PAYLOAD SIZE (bytes of payload in this segment), followed by the payload bytes. The datagrams themselves are sent with DatagramSocket and DatagramPacket (see the sketch below).

Snippet:
    System.arraycopy(pid_byte, 0, senddata, 0, pid_byte.length);
    System.arraycopy(size_byte, 0, senddata, 8, size_byte.length);
    while (true) {
        int remaining = databyte.length - from;
        if (remaining <= 0) break;
        byte[] seq_byte = toBytes(seq);                      // SEQ must be refreshed for every segment
        System.arraycopy(seq_byte, 0, senddata, 4, seq_byte.length);
        if (remaining > UDP_PAYLOAD_SIZE) {
            byte[] payload_size_byte = toBytes(UDP_PAYLOAD_SIZE);
            System.arraycopy(payload_size_byte, 0, senddata, 12, payload_size_byte.length);
            System.arraycopy(databyte, from, senddata, 16, UDP_PAYLOAD_SIZE);
        } else {
            byte[] payload_size_byte = toBytes(remaining);
            System.arraycopy(payload_size_byte, 0, senddata, 12, payload_size_byte.length);
            System.arraycopy(databyte, from, senddata, 16, remaining);
        }
        sendData(7777, "IP_Address", senddata);
        seq++;
        from += UDP_PAYLOAD_SIZE;                            // advance by the payload bytes just copied
    }
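The sendData helper referenced above is not shown on the slides; a minimal sketch, assuming it wraps a single reusable DatagramSocket (class name and socket handling are illustrative):

Sketch:
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    // Hypothetical helper that sends one UDP segment to the client.
    public class UdpSender {
        private static DatagramSocket socket;

        public static void sendData(int port, String ipAddress, byte[] senddata) throws Exception {
            if (socket == null) {
                socket = new DatagramSocket();   // one socket reused for all segments
            }
            DatagramPacket packet = new DatagramPacket(
                    senddata, senddata.length, InetAddress.getByName(ipAddress), port);
            socket.send(packet);
        }
    }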

Video Data Flow Architecture: Client
[Diagram: Network -> incoming UDP packets merged back into RTP-style packets (header with ID, media type, timestamp, plus payload) -> App Source -> Decode Bin2 -> Video Sink, with synchronization between the audio and video paths before playback.]

UDP Communication: Client

Incoming UDP payloads carry the 16-byte header (PID, SEQ, SIZE, PAYLOAD SIZE). Segments are reassembled into frames in a hash table (received_frame) indexed by PID; when a frame's remaining byte count reaches zero, the frame is complete and is handed on.

Snippet:
    System.arraycopy(recv_data, 0, pid_byte, 0, 4);
    System.arraycopy(recv_data, 4, seq_byte, 0, 4);
    System.arraycopy(recv_data, 8, size_byte, 0, 4);
    System.arraycopy(recv_data, 12, payload_size_byte, 0, 4);

    ApplicationFrame app_frame;
    if (!received_frame.containsKey(pid)) {
        app_frame = new ApplicationFrame();
        app_frame.frame = new byte[size];
        app_frame.array_size = size;
        received_frame.put(pid, app_frame);
    } else {
        app_frame = (ApplicationFrame) received_frame.get(pid);
    }
    // Place this segment's payload at its position inside the frame buffer.
    System.arraycopy(recv_data, 16, app_frame.frame, seq * UDP_PAYLOAD_SIZE, payload_size);
    app_frame.array_size -= payload_size;
    if (app_frame.array_size == 0) finish_frame(pid);
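The surrounding receive loop is not shown; a minimal sketch, assuming a dedicated thread reads datagrams from a DatagramSocket and feeds each one to the reassembly code above (port and buffer size are illustrative):

Sketch:
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;

    // Hypothetical client-side receive loop: one UDP segment per iteration.
    public class UdpReceiver implements Runnable {
        private static final int UDP_PACKET_SIZE = 1400;   // header + payload, an assumed value

        public void run() {
            try {
                DatagramSocket socket = new DatagramSocket(7777);   // same port the server sends to
                byte[] recv_data = new byte[UDP_PACKET_SIZE];
                while (true) {
                    DatagramPacket packet = new DatagramPacket(recv_data, recv_data.length);
                    socket.receive(packet);      // blocks until a segment arrives
                    handleSegment(recv_data);    // runs the reassembly snippet above
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        private void handleSegment(byte[] recv_data) {
            // parse PID, SEQ, SIZE, PAYLOAD SIZE and update received_frame as in the snippet above
        }
    }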

Video Data Flow Architecture (End to End)
[Diagram: server pipeline (Video Source -> Video Filter -> Media Encoder -> Video Muxer -> App Sink -> Rate Control), RTP-style packets segmented into UDP packets, the network, then the client pipeline (UDP packets merged back into RTP-style packets -> App Source -> Decode Bin2 -> Video Sink, with synchronization).]

Rate Control: QoS Enforcement

Several options for enforcing the sending rate:
- You can build a leaky bucket
- You can implement a token bucket
- You can use GStreamer's own rate control (e.g. videorate with a caps filter, as in the capture pipeline)
- You can implement your own method, e.g. simply send in a loop with a fixed sleep; Thread.sleep(100) between packets gives roughly 10 packets per second:

    while (true) {
        SendPacket();
        Thread.sleep(100);   // 100 ms between packets => about 10 per second
    }

A token-bucket sketch follows this list.
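A minimal token-bucket sketch (the rate, bucket depth, and class name are illustrative; the slides leave the choice of mechanism open). It is intended for a single sender thread:

Sketch:
    // Hypothetical token bucket: tokens accumulate at `rate` per second up to `depth`;
    // one token is consumed per packet sent.
    public class TokenBucket {
        private final double rate;     // tokens added per second (e.g. the target frame rate)
        private final double depth;    // maximum burst size in tokens
        private double tokens;
        private long lastRefillNanos;

        public TokenBucket(double rate, double depth) {
            this.rate = rate;
            this.depth = depth;
            this.tokens = depth;
            this.lastRefillNanos = System.nanoTime();
        }

        // Blocks until a token is available, then consumes it.
        public void acquire() throws InterruptedException {
            while (true) {
                refill();
                if (tokens >= 1.0) {
                    tokens -= 1.0;
                    return;
                }
                Thread.sleep(5);   // wait a little and re-check
            }
        }

        private void refill() {
            long now = System.nanoTime();
            tokens = Math.min(depth, tokens + rate * (now - lastRefillNanos) / 1e9);
            lastRefillNanos = now;
        }
    }

Usage in the sender loop (SendPacket() as on the slide):
    TokenBucket bucket = new TokenBucket(10.0, 5.0);   // ~10 packets/s, bursts of up to 5
    while (true) { bucket.acquire(); SendPacket(); }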

Resource Admission: Client

Client:
- Available application bandwidth AB_N (read from resource.txt, optimistic allocation)
- Application frame size M_N = ?
- Application frame rate R_N = ?
- Audio bandwidth = 8000 * 16 bits = 128 kbit/s (8000 Hz audio signal, 16 bits per sample)
- Requested bandwidth:
  Active Mode:  B_N = (M_N * R_N) + Audio Bandwidth
  Passive Mode: B_N = (M_N * R_N)

How do you define R_N for active mode? Show your computation on the screen (e.g. display "fps: 10").

Resource Admission: Client (Active Mode)

Client:
- Available application bandwidth AB_N (read from resource.txt, optimistic allocation)
- Application frame size M_N = ?
- Application frame rate R_N = ?
- Requested bandwidth in Active Mode: B_N = (M_N * R_N) + Audio Bandwidth

Solve AB_N = (M_N * R_N) + Audio Bandwidth for R_N, then:
- if R_N > 25, set R_N = 25
- if R_N < 15, REJECT
- else use R_N

Show your computation on the screen. A sketch of this admission check follows.
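A minimal sketch of the client-side admission check, assuming AB_N and M_N are given in bits per second and bits per frame respectively (the units and method names are illustrative):

Sketch:
    // Hypothetical active-mode admission: compute the highest feasible frame rate and clamp it.
    public class ClientAdmission {
        private static final double AUDIO_BANDWIDTH = 8000 * 16;   // 8000 Hz, 16 bits/sample => 128 kbit/s

        // Returns the admitted frame rate, or -1 if the request must be rejected.
        public static int admitActiveMode(double availableBandwidth, double frameSizeBits) {
            double rate = (availableBandwidth - AUDIO_BANDWIDTH) / frameSizeBits;  // solve AB_N = M_N*R_N + audio
            if (rate > 25) return 25;       // cap at the maximum allowed rate
            if (rate < 15) return -1;       // below the minimum => REJECT
            return (int) Math.floor(rate);
        }
    }

For example, with AB_N = 2 Mbit/s and M_N = 80 kbit per frame, (2,000,000 - 128,000) / 80,000 is about 23.4, so the client would request 23 fps.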

Resource Admission: Client (Multiple Server Streams)

The client keeps a resource table (resource.txt) of the bandwidth used per server:
- Available application bandwidth AB_N
- Bandwidth already used for the stream from Server 1: B_1
- Remaining available bandwidth: AB_N = AB_N - B_1
- Admitting a second stream with bandwidth B_2 succeeds if B_2 <= AB_N

Example resource table: Server 1: 60 Kbps; Server 2: 35 Kbps. A sketch of this bookkeeping follows.
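A minimal sketch of the bookkeeping, assuming bandwidths in Kbps and a simple map as the resource table (class and method names are illustrative):

Sketch:
    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical resource table: remaining bandwidth plus per-server usage.
    public class ResourceTable {
        private double availableKbps;                       // AB_N read from resource.txt
        private final Map<String, Double> usedPerServer = new HashMap<String, Double>();

        public ResourceTable(double availableKbps) {
            this.availableKbps = availableKbps;
        }

        // Admit a new stream of `requestedKbps` from `serverId`; returns false if it does not fit.
        public synchronized boolean admit(String serverId, double requestedKbps) {
            if (requestedKbps > availableKbps) {
                return false;                               // B_2 > AB_N => reject
            }
            availableKbps -= requestedKbps;                 // AB_N = AB_N - B
            usedPerServer.put(serverId, requestedKbps);
            return true;
        }

        public synchronized void release(String serverId) {
            Double used = usedPerServer.remove(serverId);
            if (used != null) {
                availableKbps += used;                      // return the bandwidth when a stream stops
            }
        }
    }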

Using RtpBin: Server
[Diagram: Video Source -> Video Filter -> Tee -> Video Scale -> Scale Filter -> Video Rate -> Rate Filter -> Video Encoder -> RTP Payloader -> RTP Bin -> UDP Sink for RTP data, with a queue on one branch and a separate UDP Sink / UDP Source pair for RTCP.]

RtpBin combines RtpSession, RtpSsrcDemux, RtpPtDemux, and RtpJitterBuffer. A sketch of wiring the payloader into rtpbin is shown below.
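The slides do not show the server-side wiring code; a minimal sketch under the assumption that GStreamer 0.10's gstrtpbin pad naming (send_rtp_sink_0 / send_rtp_src_0) and a Theora payloader are used. Treat the element names, pad names, host, and port as assumptions, not the reference solution:

Sketch:
    // Hypothetical server-side wiring: payloader -> rtpbin (session 0) -> udpsink.
    Pipeline pipe = new Pipeline("rtp-server");
    Element rtpBin = ElementFactory.make("gstrtpbin", "rtpbin");
    Element rtpPay = ElementFactory.make("rtptheorapay", "pay0");   // payloader matching theoraenc
    Element rtpSink = ElementFactory.make("udpsink", "rtpsink");
    rtpSink.set("host", "127.0.0.1");   // client address (placeholder)
    rtpSink.set("port", 5000);

    pipe.addMany(rtpBin, rtpPay, rtpSink);
    // The capture/encode chain from the diagram would be added and linked to rtpPay here.

    // Request a send-RTP sink pad on session 0 and link the payloader to it.
    Pad sendRtpSink = rtpBin.getRequestPad("send_rtp_sink_0");
    rtpPay.getStaticPad("src").link(sendRtpSink);

    // gstrtpbin creates the matching send_rtp_src_0 pad when the sink pad is requested;
    // link it to the UDP sink that carries the RTP data.
    rtpBin.getStaticPad("send_rtp_src_0").link(rtpSink.getStaticPad("sink"));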

Using RtpBin: Client
[Diagram: UDP Source -> RTP Bin -> RTP DePayloader -> Video Decoder -> Video Sink, with a UDP Sink / UDP Source pair for RTCP.]

Because rtpbin creates its receive pads dynamically, the client links them in a PAD_ADDED callback:

Snippet:
    rtpBin.connect(new Element.PAD_ADDED() {
        public void padAdded(Element element, Pad pad) {
            if (pad.getCaps().toString().contains("video")) {
                // LINK TO VIDEO SINK
            } else if (pad.getCaps().toString().contains("audio")) {
                // LINK TO AUDIO SINK
            }
        }
    });
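The // LINK TO ... placeholders are left open on the slide; one possible completion, assuming rtpBin is the rtpbin element from the snippet above and that depayloader/decoder branches have already been created, added to the pipeline, and linked to their sinks (the depayloader element names are assumptions):

Sketch:
    // Hypothetical depayloaders for the two branches (Theora video and PCMU audio assumed).
    final Element videoDepay = ElementFactory.make("rtptheoradepay", "videodepay");
    final Element audioDepay = ElementFactory.make("rtppcmudepay", "audiodepay");

    rtpBin.connect(new Element.PAD_ADDED() {
        public void padAdded(Element element, Pad pad) {
            if (pad.getCaps().toString().contains("video")) {
                pad.link(videoDepay.getStaticPad("sink"));   // new rtpbin src pad -> video branch
            } else if (pad.getCaps().toString().contains("audio")) {
                pad.link(audioDepay.getStaticPad("sink"));   // new rtpbin src pad -> audio branch
            }
        }
    });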

Internals of RtpBin
[Diagram of the internal rtpbin structure: per-session RtpSession, RtpSsrcDemux, RtpPtDemux, and RtpJitterBuffer elements.]

MP3: Multi-view Surveillance System

The client now shows one window per server (Window for Server 1, Window for Server 2). The server and client specification is otherwise the same as in MP2:

Server:
- Captures audio and video at a fixed rate
- Video: 30 fps, Audio: 8000 Hz

Client:
- Requests video and/or audio from each server
- Works in two modes: Active Mode and Passive Mode

Active Mode: media types audio and video; video rate 15 to 25 fps; audio rate 8000 Hz; video resolution 640x480.
Passive Mode: media type video only; video rate 10 fps; video resolution 320x240.

A sketch of displaying two server streams side by side follows.
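A minimal sketch of the multi-view display, assuming one gstreamer-java VideoComponent per server placed in a Swing frame (layout, window size, and names are illustrative):

Sketch:
    import java.awt.GridLayout;
    import javax.swing.JFrame;
    import org.gstreamer.Gst;
    import org.gstreamer.swing.VideoComponent;

    // Hypothetical client window: one VideoComponent per server, shown side by side.
    public class MultiViewWindow {
        public static void main(String[] args) {
            args = Gst.init("MultiView", args);   // GStreamer must be initialized first

            VideoComponent server1View = new VideoComponent();   // sink for server 1's pipeline
            VideoComponent server2View = new VideoComponent();   // sink for server 2's pipeline

            JFrame frame = new JFrame("MP3: Multi-view Surveillance");
            frame.setLayout(new GridLayout(1, 2));
            frame.add(server1View);
            frame.add(server2View);
            frame.setSize(1300, 500);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);

            // Each server's receive pipeline uses its own component as the video sink, e.g.
            // pad.link(server1View.getElement().getStaticPad("sink")) in that pipeline's callback.
        }
    }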

Additional Features for MP3?

It is highly recommended that you build an additional feature into MP3 to surprise us! Bonus points: up to 20, awarded based on the amount of surprise.