Fundamental concepts in video

Outline Types of Video Signals Analog Video Digital Video

Introduction This chapter introduces the principal notions needed to understand video. Digital video compression will be explored later.

Introduction Since video is created from a variety of sources, we begin with the signals themselves. Analog video is represented as a continuous (time-varying) signal. Digital video is represented as a sequence of digital images.

Video is the technology of electronically capturing, recording, processing, storing, transmitting, and reconstructing a sequence of still images representing scenes in motion.

Basic Concepts (Video Representation) The human eye views video; inherent properties of the eye determine essential conditions for video systems. Video signal representation consists of three aspects: visual representation, whose objective is to offer the viewer a sense of presence in the scene and of participation in the events portrayed; transmission, in which video signals are transmitted to the receiver through a single television channel; and digitization, the analog-to-digital conversion, involving sampling of gray (color) levels and quantization.

Aspect Ratio Aspect ratio describes the dimensions of video screens and video picture elements. All popular video formats are rectilinear, and so can be described by a ratio between width and height. The screen aspect ratio of a traditional television screen is 4:3. High-definition televisions use an aspect ratio of 16:9.
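An aspect ratio is just a width:height pair reduced to lowest terms, so it can be recovered from any pixel resolution with a greatest-common-divisor step. A minimal sketch (the function name is our own):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a pixel resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return (width // d, height // d)

print(aspect_ratio(640, 480))    # traditional TV resolution -> (4, 3)
print(aspect_ratio(1920, 1080))  # HD resolution -> (16, 9)
```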

Chrominance Chrominance (chroma for short) is the signal used in video systems to convey the color information of the picture, separately from the accompanying luma signal. Chrominance is usually represented as two color-difference components: U = B'–Y' (blue – luma) and V = R'–Y' (red – luma). Each of these difference components may have scale factors and offsets applied to them, as specified by the applicable video standard. Luma represents the brightness in an image.
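The color-difference definitions above can be sketched directly in code. This is a simplified illustration: luma is computed with the well-known BT.601 weights, and U and V are the unscaled differences B'–Y' and R'–Y' (real standards apply the scale factors and offsets mentioned above):

```python
def rgb_to_yuv(r, g, b):
    """Convert gamma-corrected R'G'B' values (0..1) to Y'UV.

    Luma Y' uses the BT.601 weights; U and V are the raw
    colour-difference signals, without standard-specific scaling.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma: perceived brightness
    u = b - y                               # blue colour difference
    v = r - y                               # red colour difference
    return y, u, v

# A pure grey pixel carries no chrominance: U and V are both ~0.
print(rgb_to_yuv(0.5, 0.5, 0.5))
```

Note that the three luma weights sum to 1, so any grey input (r = g = b) yields zero color difference, which is why grey-scale pictures need no chroma signal at all.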

Types of Video Signals Video signals can be organized in three different ways: component video, composite video, and S-video. Component video In popular use, the term refers to a type of analog video information that is transmitted or stored as three separate signals for the red, green, and blue image planes. Each color channel is sent as a separate video signal, so this kind of system uses three wires (and connectors) connecting the camera or other devices to a TV or monitor. Most computer systems use component video, with separate signals for R, G, and B. For any color separation scheme, component video gives the best color reproduction, since there is no “crosstalk” between the three channels. This is not the case for S-video or composite video, discussed next. Component video, however, requires more bandwidth and good synchronization of the three components.


Composite Video — 1 Signal Composite video: color (“chrominance”) and intensity (“luminance”) signals are mixed into a single carrier wave. This type of signal is used by broadcast color TVs; it is downward compatible with black-and-white TV. When connecting to TVs, composite video uses only one wire, and the color signals are mixed, not sent separately. The audio and sync signals are additions to this one signal. Since color and intensity are wrapped into the same signal, some interference between the luminance and chrominance signals is inevitable.

S-Video (Separate Video) — 2 Signals S-video, as a compromise, uses two wires: one for luminance and another for a composite chrominance signal. As a result, there is less crosstalk between the color information and the crucial gray-scale information. The reason for placing luminance into its own part of the signal is that black-and-white information is most crucial for visual perception: humans are able to differentiate spatial resolution in grayscale images with much higher acuity than for the color part of color images. As a result, we can send less accurate color information than must be sent for intensity information; we can only see fairly large blobs of color, so it makes sense to send less color detail.

Analog Video Most TV is still sent and received as an analog signal. An analog signal f(t) samples a time-varying image. So-called “progressive” scanning traces through a complete picture (a frame) row-wise for each time interval. A high-resolution computer monitor typically uses a time interval of 1/72 second. In TV, and in some monitors and multimedia standards as well, another system, called “interlaced” scanning, is used: the odd-numbered lines are traced first, and then the even-numbered lines are traced. This results in “odd” and “even” fields; two fields make up one frame. In fact, the odd lines (starting from 1) end at the middle of a line at the end of the odd field, and the even scan starts at a half-way point.

Figure 5.1 shows the scheme used. First the solid (odd) lines are traced, P to Q, then R to S, etc., ending at T; then the even field starts at U and ends at V. The jump from Q to R, etc., in Figure 5.1 is called the horizontal retrace, during which the electron beam in the CRT is blanked. The jump from T to U or V to P is called the vertical retrace. The scan lines are not horizontal because a small voltage is applied, moving the electron beam down over time.

Interlacing was invented because, when standards were being defined, it was difficult to transmit the amount of information in a full frame quickly enough to avoid flicker; doubling the number of fields presented to the eye reduces perceived flicker. Because of interlacing, the odd and even lines are displaced in time from each other. This is generally not noticeable, except when very fast action is taking place on screen, when blurring may occur. Since it is sometimes necessary to change the frame rate, resize, or even produce stills from an interlaced source video, various schemes are used to “de-interlace” it.

The simplest de-interlacing method consists of discarding one field and duplicating the scan lines of the other field. The information in the discarded field is lost completely with this simple technique. Other, more complicated methods that retain information from both fields are also possible.
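The simple discard-and-duplicate method above can be sketched in a few lines, treating a frame as a list of scan lines (the function name and row labels are our own):

```python
def deinterlace_line_doubling(frame):
    """Simplest de-interlacing: keep one field, duplicate its lines.

    `frame` is a list of scan lines (index 0 = top).  We keep the
    lines of one field (every other row) and repeat each one to fill
    the discarded field's place; the other field's content is lost.
    """
    out = []
    for row in frame[0::2]:   # one field: every other scan line
        out.append(row)
        out.append(row)       # duplicate it into the gap
    return out[:len(frame)]   # preserve the original frame height

frame = ["line1", "line2", "line3", "line4"]
print(deinterlace_line_doubling(frame))  # ['line1', 'line1', 'line3', 'line3']
```

Note how `line2` and `line4` (the discarded field) do not appear in the output at all, which is exactly the information loss the text describes.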

NTSC Video The NTSC (National Television System Committee) TV standard is mostly used in North America and Japan. It uses the familiar 4:3 aspect ratio (i.e., the ratio of picture width to height) and 525 scan lines per frame at 30 frames per second (fps). The problem is that NTSC is an analog system. In computer video, colors and brightness are represented by numbers (digital); with analog television, everything is just voltages, and voltages are affected by wire length, connectors, heat, cold, video tape, and so on. NTSC follows the interlaced scanning system, and each frame is divided into two fields, with 262.5 lines/field. Thus the horizontal sweep frequency is 525 × 29.97 ≈ 15,734 lines/sec.
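The sweep-frequency figure follows directly from the line count and the frame rate, as this small calculation shows:

```python
# NTSC timing arithmetic.
lines_per_frame = 525
frames_per_second = 29.97        # NTSC's actual rate is 30/1.001 fps
lines_per_field = lines_per_frame / 2          # interlacing: 2 fields/frame

horizontal_sweep = lines_per_frame * frames_per_second
print(lines_per_field)               # 262.5 lines per field
print(round(horizontal_sweep, 2))    # 15734.25 lines per second
```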

PAL Video PAL (Phase Alternating Line) is a TV standard widely used in Western Europe, China, India, and many other parts of the world. PAL uses 625 scan lines per frame, at 25 frames/second, with a 4:3 aspect ratio and interlaced fields.

Digital Video The advantages of digital representation for video are many. For example: Video can be stored on digital devices or in memory, ready to be processed (noise removal, cut and paste, etc.) and integrated into various multimedia applications. Direct access is possible, which makes nonlinear video editing achievable as a simple, rather than a complex, task. Repeated recording does not degrade image quality. Digital video also offers ease of encryption and better tolerance to channel noise.

Chroma Subsampling Chroma subsampling is the practice of encoding images with less resolution for chroma information than for luma information. It is used in many video encoding schemes, both analog and digital. Because of storage and transmission limitations, there is always a desire to reduce (or compress) the signal. Since the human visual system is much more sensitive to variations in brightness than in color, a video system can be optimized by devoting more bandwidth to the luma component (usually denoted Y') than to the color-difference components Cb and Cr. The signal is divided into a luma (Y') component and two color-difference (chroma) components.
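As a concrete illustration, the common 4:2:0 scheme keeps one chroma sample per 2×2 block of pixels while luma stays at full resolution. A minimal sketch (real encoders filter or average the block rather than just taking one sample, as we do here for simplicity):

```python
def subsample_420(chroma):
    """4:2:0 chroma subsampling sketch: keep one chroma sample per
    2x2 pixel block (here, simply the block's top-left sample)."""
    return [row[0::2] for row in chroma[0::2]]

# A 4x4 plane of chroma samples shrinks to 2x2 under 4:2:0:
plane = [[(r, c) for c in range(4)] for r in range(4)]
print(subsample_420(plane))  # [[(0, 0), (0, 2)], [(2, 0), (2, 2)]]
```

The chroma planes end up with a quarter of the samples of the luma plane, which is where most of the bandwidth saving comes from.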


CCIR Standards for Digital Video CCIR is the Consultative Committee for International Radio; one of the most important standards it has produced is CCIR-601, for component digital video. Table 5.3 shows some of the digital video specifications, all with an aspect ratio of 4:3. The CCIR 601 standard uses an interlaced scan, so each field has only half as much vertical resolution.


CIF stands for Common Intermediate Format, specified by the CCITT (International Telegraph and Telephone Consultative Committee). The idea of CIF is to specify a format for lower bitrates. QCIF stands for “Quarter-CIF”.
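The "quarter" in Quarter-CIF refers to picture area: each dimension is halved. Taking CIF's standard luma resolution of 352 × 288, the relationship works out as:

```python
# QCIF halves each CIF dimension, giving a quarter of the picture
# area (hence "Quarter-CIF").
CIF = (352, 288)                      # CIF luma resolution
QCIF = (CIF[0] // 2, CIF[1] // 2)
print(QCIF)                                     # (176, 144)
print((QCIF[0] * QCIF[1]) / (CIF[0] * CIF[1]))  # 0.25
```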

High-definition TV (HDTV) refers to video having resolution substantially higher than that of traditional television systems. HD has one to two million pixels per frame. The first generation of HDTV was based on an analog technology developed by Sony in Japan in the late 1970s.