Video and Streaming Media Andy Dozier

Approach
Video Standards
– Analog Video
– Digital Video
Video Quality Parameters
– Frame Rate
– Color Depth
– Resolution
Encoding/Decoding Standards

Video Standard Summary
Analog Video
– Composite
– Component
Digital Video

Composite Video Overview
Optimized for wireless broadcast operation
– Frequency allocations are controlled by the FCC
– 54 MHz to 806 MHz (68 channels)
– 6 MHz allocated per channel
Utilizes a single communication channel
– Coaxial cable transmission
– Terrestrial broadcast
Lowest resolution

Composite Video Overview (cont’d)
Defined by the National Television System Committee (NTSC)
– Interface standard (System M-NTSC) documented in ANSI T
M-NTSC features
– Color or monochrome
– 30 frames/second
– 525 horizontal scan lines (483 usable)

Interlacing
A refresh rate of 30 frames/second exhibits flicker
– One frame is a complete image at a point in time
The solution is to divide each frame into two “fields”
– One field consists of either all odd, or all even scan lines
– Odd and even scan lines are “interlaced”
– 262.5 horizontal scan lines/field (half of 525)
– Each field is refreshed at a rate of 30/second, for 60 fields/second total
Phosphor persistence allows the eye to perceive both fields at the same time
– Eliminates the flicker problem
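As a rough illustration of how a frame splits into two interlaced fields, the sketch below separates the odd and even scan lines of a frame; the frame contents and 700-sample line width are made up for the example.

```python
# Hypothetical 525-line frame, each scan line represented here as a list of samples
frame = [[0] * 700 for _ in range(525)]

# Field 1 holds the odd-numbered scan lines, field 2 the even-numbered ones
field_1 = frame[0::2]   # scan lines 1, 3, 5, ... -> 263 lines
field_2 = frame[1::2]   # scan lines 2, 4, 6, ... -> 262 lines

# Displaying the fields alternately at 60 fields/second still delivers
# 30 complete frames/second while reducing perceived flicker.
print(len(field_1), len(field_2))   # 263 262
```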

Composite Video Resolution
Horizontal/vertical dimension ratio is 4/3
Usable horizontal scan lines = 483
In order to render a horizontal line consistently in an image, the image line must cover more than one scan line
– Number of horizontal image lines = 70% of the number of horizontal scan lines
– Vertical resolution is 0.7 x 483, or 338 horizontal line/space pairs
Requiring the same horizontal resolution
– 4/3 x 338, or 450 vertical line/space pairs
Composite video resolution is equivalent to 450 x 338 pixels
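The arithmetic behind the 450 x 338 figure can be reproduced directly; a minimal sketch, where the 0.7 factor is the image-line-to-scan-line ratio assumed on the slide.

```python
usable_scan_lines = 483
image_line_factor = 0.7        # image lines per scan line, per the slide
aspect_ratio = 4 / 3           # horizontal / vertical

vertical = int(usable_scan_lines * image_line_factor)   # 0.7 * 483 -> 338 line/space pairs
horizontal = int(vertical * aspect_ratio)               # 4/3 * 338 -> 450 line/space pairs
print(horizontal, "x", vertical)                        # 450 x 338
```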

Composite Video
Features:
– Single wire or channel
– NTSC standard
– Suitable for broadcasting
– Lowest resolution: equivalent to 450 x 338 pixels

Color Theory
Color theory is based on the psychophysical properties of human color vision
– First stated by Hermann Grassmann of Germany in 1854
Any color can be matched by an additive combination of different amounts of three additive primary colors
– Additive primary colors are different from subtractive primary colors
– Red/Green/Blue (RGB)
– In video, phosphors emit light, therefore we use additive primaries

Definitions
The intrinsic nature of a color is called Hue, or “U”
The intensity of a color is called Saturation, or “V”
Hue and saturation taken together define color, or Chrominance, C
– Hue + Saturation = Chrominance = C
Brightness is described as luminous flux
– Luminance = Y
C and Y together completely describe the color sensation

Color Spatial Resolution
For most images, the fine detail picked up by the human eye is conveyed by changes in luminance
– The eye cannot resolve color for very small objects
This implies that for very small areas in a scene, the human eye is much more sensitive to changes in luminance, or brightness
For large areas, the eye responds mostly to color

Analog Component Video
The NTSC committee wanted a color TV signal system that was compatible with the black-and-white (monochrome) system
Split the signal into components
– Luminance (Y)
– Chrominance (C)
This signal system accounts for the variation in the eye's sensitivity to different colors
Y = 0.30 R + 0.59 G + 0.11 B
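A minimal sketch of the luma weighting above; the function name and sample values are illustrative, and the 0.30/0.59/0.11 weights are the standard NTSC luma coefficients.

```python
def luma(r, g, b):
    """Approximate NTSC luminance from gamma-corrected RGB values in [0, 255]."""
    return 0.30 * r + 0.59 * g + 0.11 * b

# Pure green contributes far more to perceived brightness than pure blue
print(luma(0, 255, 0))   # 150.45
print(luma(0, 0, 255))   # 28.05
```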

Analog Component Video (cont’d)
A variety of signal systems are used to provide color displays
Composite signal systems embed the chrominance information in the transmitted signal
Systems which separate the Y, C, U, and V information are referred to as component video systems
– Digital and analog versions exist
– Component video provides higher fidelity

Analog Component Video: YUV
Features:
– Separates Y, U, and V
– Current color TV system
– Y, U, and V are combined for transmission
– Used for color TV receivers

Analog Component Video: Y/C
Features:
– Separates Y and C
– Intermediate quality
– 2-wire system, called “S-Video”
– Used for hand-held cameras (Hi-8, Super VHS)

Analog Component Video: RGB
Features:
– Separates R, G, and B signals
– Easily transformed into other signal systems (Y/C, YUV)
– Used for color monitors

Digital Video
Major disadvantages of analog techniques:
– Susceptibility to electromagnetic noise
– Quality degrades with multiple generations of copies
Digital video techniques represent component signals as streams of 1s and 0s
– Eliminates degradation over multiple copy generations
– Excellent noise immunity
– Can be stored on hard disk drives, DVD, and CD-ROM
– Can be transported via data networks

Digital Video Features
Generated by digitizing analog video signals
– Composite digital: D2 standard
– Component digital: D1 standard
Image quality is defined by three parameters
– Frame resolution and scaling
– Color depth
– Frame rate

Frame Resolution and Scaling
Each frame (image) is represented by an array of pixels
If the pixel array equals the monitor resolution, the image fills the monitor screen
– Example: 640 x 480 pixels
Partial-screen images may be displayed (scaled)
Using a full-screen resolution of 640 x 480 pixels:
– 320 x 240 pixels would fill 1/4 of the screen
– 160 x 120 pixels would fill 1/16 of the screen
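A quick check of those screen-area fractions; a minimal sketch using the resolutions named on the slide.

```python
def screen_fraction(width, height, full=(640, 480)):
    """Fraction of a 640 x 480 screen covered by a width x height image."""
    return (width * height) / (full[0] * full[1])

print(screen_fraction(320, 240))   # 0.25   -> 1/4 of the screen
print(screen_fraction(160, 120))   # 0.0625 -> 1/16 of the screen
```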

Scaling of Image Size (figure: full screen, 1/4 screen, 1/16 screen)

Color Depth
Color depth is defined by the number of bits used to represent the color of each pixel
This determines the maximum number of colors that can be represented, and therefore the “realism” of the image
As an example:
– Red = 8 bits/pixel
– Green = 8 bits/pixel
– Blue = 8 bits/pixel
Using 24 bits/pixel allows representation of 16.7 million colors
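The 16.7 million figure is simply 2 raised to the color depth; a one-line check:

```python
bits_per_pixel = 8 + 8 + 8          # red + green + blue
print(2 ** bits_per_pixel)          # 16777216 distinct colors (~16.7 million)
```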

Frame Rate
The number of times per second an image is refreshed controls image quality
– Flicker
– Jerkiness of motion
Some encoding systems allow adjustment of the frame rate to stay within the bandwidth allocated by the network
– Basic Rate ISDN allows a maximum of 128 kbps
– Most high-quality video conferencing systems use at least 384 kbps

Digital Video Bandwidth Requirements
Consider the following:
– Frame rate = 60 frames/second
– Color depth = 24 bits/pixel
– Frame size = 640 x 480 pixels
This example would require about 442 Mbps to transmit uncompressed video in real time
We have to consider compression techniques to make video transmission affordable
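The raw bit rate follows directly from the three parameters; a minimal sketch of the arithmetic:

```python
width, height = 640, 480
bits_per_pixel = 24
frames_per_second = 60

bits_per_second = width * height * bits_per_pixel * frames_per_second
print(bits_per_second / 1e6, "Mbps")   # 442.368 Mbps, uncompressed
```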

Digital Video Bandwidth Requirements
Uncompressed D-1 video requires 270 Mbps
This implies that it is still impractical to transport an uncompressed D-1 signal over the wide area
– Bandwidth is too expensive
It is also difficult to transport over the local area
– Requires Gigabit Ethernet

Video Stream Bandwidth

Intraframe Compression
The eye is not as sensitive to small-scale changes in color as it is to changes in intensity
This implies that a video imaging system can “throw away” some of the color information in each frame and still appear realistic to the human eye
– Color sampling can be easily reduced (sub-sampling)
If this is done consistently within each frame, the technique is referred to as “intraframe” compression

Intraframe Compression: Color Subsampling
With 4:1:1 chroma subsampling (an average of 12 bits/pixel instead of 24), the previous example would require about 221 Mbps
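A minimal sketch of where the factor of two comes from, assuming 8 bits per sample and the 4:1:1 pattern named on the slide:

```python
width, height, fps = 640, 480, 60

# 4:1:1 -> for every 4 luma (Y) samples there is 1 Cb and 1 Cr sample
bits_per_pixel_444 = 8 + 8 + 8             # full-resolution chroma
bits_per_pixel_411 = 8 + (8 + 8) / 4       # subsampled chroma -> 12 bits/pixel on average

for label, bpp in [("4:4:4", bits_per_pixel_444), ("4:1:1", bits_per_pixel_411)]:
    mbps = width * height * bpp * fps / 1e6
    print(label, round(mbps), "Mbps")      # 4:4:4 442 Mbps, 4:1:1 221 Mbps
```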

Alternative Intraframe Compression Techniques
The key to successful intraframe techniques is that each frame be preserved at the highest resolution possible
– Allows editing on a “frame by frame” basis
The approach is to “throw away” information that cannot be perceived by the human eye by adjusting parameters

Alternative Intraframe Compression (cont’d)

JPEG
The Joint Photographic Experts Group (JPEG) developed a compression standard for 24-bit “true color” photographic images
– Single-frame encoding technology
This technique utilizes intraframe compression
– Subsampling of chroma information
– The algorithm quantizes 8 x 8 blocks of pixels
Achieves image compression ratios of 2:1 to 30:1 over uncompressed images
– One image equals one video frame

Motion JPEG
Utilizes JPEG encoding for each frame
– 30 frames/sec
– Variable compression ratios (2:1 to 30:1)
This allows editing on a “frame by frame” basis
– Industry standard for high-definition storage and retrieval
One drawback is that the MJPEG standard does not encode audio
– A proprietary solution is required
One hour of broadcast video at a 6:1 compression ratio requires about 13 GBytes
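A rough sketch of that storage estimate. The slide does not state the uncompressed source rate, so the 170 Mbps figure below is an assumption chosen to be consistent with the ~13 GB result.

```python
uncompressed_mbps = 170     # assumed broadcast-quality source rate (not stated on the slide)
compression_ratio = 6
seconds_per_hour = 3600

compressed_mbps = uncompressed_mbps / compression_ratio
gigabytes_per_hour = compressed_mbps * seconds_per_hour / 8 / 1000
print(round(gigabytes_per_hour, 1), "GB/hour")   # ~12.8 GB/hour, in line with the ~13 GB figure
```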

Interframe Compression
Significant compression must be achieved to transport and handle video streams over wide area networks (WANs)
– Achieved by “interframe” compression
– Adjustment of image parameters
– Data compression achieved by dropping information between frames
Common interframe compression techniques available today:
– MPEG

MPEG Compression
In order to achieve significant compression ratios, predictive techniques are required
These techniques encode one complete frame periodically, and “predict” the changes between these “key frames”
– MPEG encodes a complete key frame every 16th frame
Example: a talking head, where only the lips and head of the speaker move between frames
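A minimal sketch of the key-frame idea under the slide's assumption of one complete frame every 16 frames; real MPEG group-of-pictures structures also use bidirectionally predicted frames, which are omitted here.

```python
KEY_FRAME_INTERVAL = 16   # per the slide: one complete frame every 16th frame

def frame_type(frame_index):
    """Label a frame as a complete key frame (I) or a predicted frame (P)."""
    return "I" if frame_index % KEY_FRAME_INTERVAL == 0 else "P"

# First 32 frames: an "I" at frames 0 and 16, a "P" everywhere else,
# i.e. a key frame roughly every half second at 30 frames/sec.
print("".join(frame_type(i) for i in range(32)))
```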

MPEG Encoding Scheme

MPEG Disadvantages
Since complete frame information is available only every sixteen frames (roughly every ½ second), video editing is more difficult
Sound may need to be correlated to the frame of choice

Encoding Techniques
Encoders are now available at reasonable prices that bring the compressed bit rate into an affordable range (< 1.5 Mbits/sec)
Two types of encoders are available
– Symmetric
– Asymmetric
Symmetric encoders can encode in real time
– Used for video streaming applications
Asymmetric encoders cannot encode in real time
– Used for CD and DVD applications

Encoder/Decoder (Codec) Types

Streaming Video
Originally, video was played via the “download and play” method
For long video clips, it is more desirable to start playing before the entire file has downloaded
– Streaming video
– Requires isochronous playback
– This is achieved by buffering
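A minimal sketch of why buffering enables isochronous playback: the player delays its start until the buffer holds a few seconds of video, so brief dips in network rate do not stall the picture. The rates and buffer target below are illustrative assumptions.

```python
PLAYBACK_RATE = 1.5      # Mbits consumed per second of playback (assumed)
START_THRESHOLD = 3.0    # seconds of video buffered before playback starts (assumed)

buffered_seconds = 0.0
playing = False

# Simulated network delivery rate in Mbits/sec for each second of wall-clock time
for t, network_rate in enumerate([2.0, 2.0, 2.0, 1.0, 1.0, 2.0, 2.0]):
    buffered_seconds += network_rate / PLAYBACK_RATE   # seconds of video received
    if not playing and buffered_seconds >= START_THRESHOLD:
        playing = True
    if playing:
        buffered_seconds -= 1.0                        # one second of video played out
    print(f"t={t}s playing={playing} buffer={buffered_seconds:.2f}s")
# The buffer absorbs the dip to 1.0 Mbits/sec, so playback never stalls.
```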

Download and Play

Isochronous Playback

Video Streaming

Video Editing and Authoring
In order to create useful applications, it is necessary to capture multiple streams and combine them into one
Multiple bit rates may also be required for different users
After the streams are captured, an “editing and authoring” process is required

Video Editing Process