ANALOG TELEVISION
Persistence of vision:
the eye (or rather the brain) can retain the sensation of an image for a short time after the actual image is removed.

1. Frame merging
This allows a video to be displayed as a sequence of frames: as long as the frame interval is shorter than the persistence period, the eye sees a continuously varying image in time.
When the frame interval is too long, the eye observes frame flicker. The minimum frame rate (frames/second, fps, or Hz) required to prevent flicker depends on the display brightness and the viewing distance: a higher frame rate is required for closer viewing and a brighter display.
For TV viewing: 50-60 fps
For movie viewing: 24 fps
For computer monitors: > 70 fps
2. Line merging
As with frame merging, the eye can fuse separate scan lines into one complete frame, as long as the spacing between lines is sufficiently small. The maximum vertical spacing between lines depends on the viewing distance, the screen size, and the display brightness. For common viewing distances and TV screen sizes, on the order of 500 lines per frame is acceptable.
3. Pixel merging
Similarly, the eye can fuse separate pixels in a line into one continuously varying line, as long as the spacing between pixels is sufficiently small.
4. Interlacing
The brighter the still image presented to the viewer, the shorter the persistence of vision. If the interval between pictures is longer than the persistence period, the image flickers. Interlacing therefore arranges for two "flashes" per frame: a single frame is scanned twice, with the first scan covering only the odd lines and the second scan covering only the even lines.
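To make the odd/even field idea concrete, here is a minimal sketch, not part of the original slides, that splits a hypothetical progressive frame into its two fields and weaves them back together; the NumPy arrays and function names are illustrative assumptions.

```python
import numpy as np

def split_fields(frame):
    """Split a progressive frame (rows x cols) into odd and even fields.

    Lines are counted from 1, as in TV practice: the first field carries
    lines 1, 3, 5, ... and the second field carries lines 2, 4, 6, ...
    """
    odd_field = frame[0::2, :]   # lines 1, 3, 5, ... (0-based rows 0, 2, 4, ...)
    even_field = frame[1::2, :]  # lines 2, 4, 6, ...
    return odd_field, even_field

def weave_fields(odd_field, even_field):
    """Interleave the two fields back into a full frame."""
    rows = odd_field.shape[0] + even_field.shape[0]
    frame = np.empty((rows, odd_field.shape[1]), dtype=odd_field.dtype)
    frame[0::2, :] = odd_field
    frame[1::2, :] = even_field
    return frame

if __name__ == "__main__":
    frame = np.arange(6 * 4).reshape(6, 4)   # toy 6-line, 4-pixel frame
    odd, even = split_fields(frame)
    assert np.array_equal(weave_fields(odd, even), frame)
```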
Basic black and white television
In a basic black and white TV, a single electron beam scans a phosphor screen. The scan is interlaced, that is, the screen is scanned twice per photographed frame. Information is always displayed from left to right. After each line is written, the signal is blanked while the beam returns to the left. When the beam reaches the bottom, the signal is blanked until the beam returns to the top to write the next field.
Trace and Retrace
NTSC has 525 scan lines per frame. However, lines 248 to 263 and 511 to 525 are typically blanked to provide time for the beam to return to the upper left-hand corner for the next scan. Notice that the beam does not return directly to the top, but zig-zags a bit.
Vertical Scanning signal
The vertical scanning signal for conventional black and white NTSC is quite straightforward. It is simply a positive ramp until it is time for the beam to return to the upper left-hand corner. Then it is a negative ramp during the blanked scan lines.
Horizontal Scan signal
The horizontal scan signal is very much the same. The horizontal scan rate is 525 × 29.97, or about 15,734 Hz, so 63.6 µs are allocated per line. Typically about 10 µs of this is devoted to horizontal blanking. With 427 pixels per horizontal scan line, each pixel is scanned for approximately 125 ns.
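The timing figures above follow directly from the line count and frame rate. The short sketch below reproduces the arithmetic, using the 427-pixel count and the approximate 10 µs blanking time quoted in the text.

```python
# NTSC horizontal timing, using the numbers quoted in the text.
LINES_PER_FRAME = 525
FRAME_RATE_HZ = 29.97
PIXELS_PER_LINE = 427          # active pixels per line, as stated above
H_BLANKING_US = 10.0           # approximate horizontal blanking time

line_rate_hz = LINES_PER_FRAME * FRAME_RATE_HZ          # ~15,734 Hz
line_period_us = 1e6 / line_rate_hz                     # ~63.6 us per line
active_line_us = line_period_us - H_BLANKING_US         # ~53.6 us of picture
pixel_time_ns = 1e3 * active_line_us / PIXELS_PER_LINE  # ~125 ns per pixel

print(f"line rate      : {line_rate_hz:.0f} Hz")
print(f"line period    : {line_period_us:.1f} us")
print(f"time per pixel : {pixel_time_ns:.0f} ns")
```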
The electron beam is analog modulated across the horizontal line. The modulation translates into intensity changes in the electron beam and thus into gray-scale levels on the picture screen.
The horizontal blanking signal and synchronization pulse are quite well defined. For black and white TV, the "front porch" is 0.02 times the interval between sync pulses and the "back porch" is 0.06 times that interval.
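With the 63.6 µs line period quoted earlier, those fractions work out roughly as follows (a quick sketch of the arithmetic, not part of the original slides):

```python
LINE_PERIOD_US = 63.6                     # time between horizontal sync pulses
front_porch_us = 0.02 * LINE_PERIOD_US    # ~1.3 us before the sync pulse
back_porch_us = 0.06 * LINE_PERIOD_US     # ~3.8 us after the sync pulse
print(front_porch_us, back_porch_us)      # 1.272 3.816
```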
The vertical blanking interval also contains a number of synchronization pulses, including equalizing pulses and the vertical sync pulse.
The television bandwidth is 6 MHz.
The color sub-carrier is 3.58 MHz above the picture (monochrome) carrier, and the sound carrier is 4.5 MHz above it. The picture carrier sits 1.25 MHz above the low end of the channel, and the sound carrier sits 0.25 MHz below the high end, to avoid crosstalk with adjacent channels.
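As a concrete example, the carrier positions inside one 6 MHz channel can be computed from these offsets. The sketch below assumes the standard US channel 2 boundaries (54-60 MHz), which are not stated in the slides.

```python
def ntsc_channel_layout(channel_low_mhz):
    """Carrier positions within a 6 MHz NTSC channel, given its lower edge.

    Offsets follow the text: picture carrier 1.25 MHz above the low end,
    color sub-carrier 3.58 MHz above the picture carrier, sound carrier
    4.5 MHz above the picture carrier (0.25 MHz below the high end).
    """
    picture = channel_low_mhz + 1.25
    color = picture + 3.58
    sound = picture + 4.5
    high_end = channel_low_mhz + 6.0
    return {"picture": picture, "color": color, "sound": sound, "high_end": high_end}

# Example: US channel 2 occupies 54-60 MHz.
print(ntsc_channel_layout(54.0))
# {'picture': 55.25, 'color': 58.83, 'sound': 59.75, 'high_end': 60.0}
```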
TV Transmitter (B&W)
TV Receiver (B&W)
COLOR TELEVISION
One of the great electrical engineering triumphs was the development of color television in a way that remained compatible with black and white television. A major driving force behind most current color TV standards was to allow black-and-white TVs to continue receiving a valid TV signal after color service was in place.
Trireceptor theory of vision
Why we use RGB monitors: if you ask someone why red, green, and blue are used in computer monitors, the immediate answer is "Because these are the primary colors." If you then ask, "But why are these the primary colors?", the answer you get is, "Because if you mix light of these colors together you can make any color."
Color information transmission in TV
In the most basic form, color television could simply be implemented by having cameras with three filters (red, green and blue) and then transmitting the three color signals over wires to a receiver with three electron guns and three drive circuits. Unfortunately, this idealized view is not compatible with the previously allocated 6 MHz bandwidth of a TV channel. It is also not compatible with previously existing monochrome receivers.
Therefore, modern color TV is carefully structured to preserve all the original monochrome information and simply add the color information on top. To do this, one signal, called luminance (Y), has been chosen to occupy the major portion (0-4 MHz) of the channel. Y contains the brightness information and the detail; Y is the monochrome TV signal. Consider the model of a scene being filmed with three cameras: one camera has a red filter, one a green filter, and one a blue filter.
Assume that the cameras are all adjusted so that, when pointed at "white," they each give equal voltages. To create the Y signal, the red, green, and blue inputs must be weighted to compensate for the unequal color sensitivity of the eye. The governing equation is Y = 0.30 R + 0.59 G + 0.11 B. For example, to produce "white" light for the human observer there needs to be 30% red, 59% green, and 11% blue (totaling 100%).
This is the "monochrome" part of the TV signal. It officially takes up the first 4 MHz of the 6 MHz bandwidth of the TV signal, although in practice it is usually band-limited to 3.2 MHz. Two signals are then created to carry the chrominance (C) information. One of these signals is called "Q" and the other is called "I". They are related to the R, G, and B signals by the conversion equations given below.
The positive polarity of Q is purple, the negative is green; the positive polarity of I is orange, the negative is cyan. Thus, Q is often called the "purple-green" (or "green-purple") axis and I the "orange-cyan" (or "cyan-orange") axis. It turns out that the human eye is more sensitive to spatial variations along the orange-cyan axis than along the purple-green axis, so the orange-cyan (I) signal is given a maximum bandwidth of 1.5 MHz and the purple-green (Q) signal only 0.5 MHz.
Now, the Q and I signals are both modulated onto a 3.58 MHz carrier wave, but 90 degrees out of phase with each other (quadrature amplitude modulation, QAM). The two modulated signals are then summed to form the C, or chrominance, signal. The nomenclature helps in remembering what is going on: the I signal is In-phase with the 3.58 MHz carrier wave, while the Q signal is in Quadrature (i.e., 1/4 of the way around the circle, 90 degrees out of phase, or orthogonal) with it.
The new chrominance signal (formed from Q and I) has the interesting property that the magnitude of the signal represents the color saturation and the phase of the signal represents the hue:
hue = phase = arctan(Q / I)
saturation = magnitude = sqrt(I^2 + Q^2)
Since the I and Q signals are clearly phase sensitive, some sort of phase reference must be supplied. This reference is supplied after each horizontal scan and is included on the "back porch" of the horizontal sync pulse. The phase reference consists of 8-10 cycles of the 3.58 MHz signal and is called the "color burst".
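A one-function sketch of that polar relationship; the sample I and Q values are illustrative only and not from the slides.

```python
import math

def chroma_polar(i, q):
    """Return (saturation, hue_degrees) for given I and Q components."""
    saturation = math.sqrt(i**2 + q**2)     # magnitude of the chrominance vector
    hue = math.degrees(math.atan2(q, i))    # phase relative to the I axis
    return saturation, hue

# Example: a color with more I (orange-cyan) content than Q (purple-green).
print(chroma_polar(0.4, 0.1))   # saturation ~0.41, hue ~14 degrees
```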
Conversion between RGB and YIQ
Y = 0.299 R + 0.587 G + 0.114 B
I = 0.596 R - 0.274 G - 0.322 B
Q = 0.211 R - 0.523 G + 0.312 B

R = 1.0 Y + 0.956 I + 0.621 Q
G = 1.0 Y - 0.272 I - 0.647 Q
B = 1.0 Y - 1.106 I + 1.703 Q
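A minimal sketch of the conversion using the coefficient values written out above; the NumPy helper functions and the numerical inversion are illustrative choices, not part of the slides.

```python
import numpy as np

# RGB -> YIQ matrix (rows give the weights for Y, I, Q).
RGB_TO_YIQ = np.array([
    [0.299,  0.587,  0.114],   # Y
    [0.596, -0.274, -0.322],   # I
    [0.211, -0.523,  0.312],   # Q
])

def rgb_to_yiq(rgb):
    return RGB_TO_YIQ @ np.asarray(rgb, dtype=float)

def yiq_to_rgb(yiq):
    # Invert the matrix numerically rather than hard-coding rounded values.
    return np.linalg.inv(RGB_TO_YIQ) @ np.asarray(yiq, dtype=float)

# White (equal R, G, B) has full luminance and no chrominance.
print(rgb_to_yiq([1.0, 1.0, 1.0]))   # approximately [1.0, 0.0, 0.0]
```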
Bandwidth of Chrominance Signals
With real video signals, the chrominance components typically change much more slowly than the luminance. Furthermore, the human eye is less sensitive to changes in chrominance than to changes in luminance, and it is more sensitive to the orange-cyan range (I, the color of faces!) than to the green-purple range (Q). These factors lead to I being band-limited to 1.5 MHz and Q to 0.5 MHz.
Multiplexing of Luminance and Chrominance
The band-limited chrominance is positioned at the high end of the luminance spectrum, where the luminance is weak but still sufficiently below the audio carrier (at 4.5 MHz). The two chrominance components (I and Q) are multiplexed onto the same sub-carrier using QAM. The resulting signal, consisting of the baseband luminance plus the chrominance components modulated onto the sub-carrier frequency f_c, is called the composite video signal.
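A simplified sketch of what one instant of composite video looks like under this scheme, ignoring sync, blanking, and filtering; the function and variable names are assumptions for illustration.

```python
import math

F_SC_HZ = 3.579545e6   # NTSC color sub-carrier (the "3.58 MHz" in the text)

def composite_sample(y, i, q, t):
    """Composite video at time t: baseband luminance plus QAM chrominance.

    I modulates the in-phase carrier (cosine) and Q the quadrature
    carrier (sine), 90 degrees apart, on the same sub-carrier.
    """
    w = 2.0 * math.pi * F_SC_HZ
    chroma = i * math.cos(w * t) + q * math.sin(w * t)
    return y + chroma

# One sample, 125 ns into a line, for an arbitrary (Y, I, Q) pixel value.
print(composite_sample(0.6, 0.2, -0.1, 125e-9))
```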
In NTSC, the luminance is transmitted as vestigial-sideband AM (VSB), the chroma as QAM of I and Q, and the aural (sound) signal as FM.
Transmitter Block Diagram
Color Decoder
Block diagrams of TV receivers
PAL, SECAM and NTSC
There are three major TV standards used in the world today:
1. The American NTSC (National Television Systems Committee) color television system
2. The European PAL (Phase Alternation Line rate) system
3. The French/former Soviet Union SECAM (Sequential Couleur avec Memoire) system
The largest difference between the three systems is the number of scan lines. NTSC uses 525 lines (interlaced), while both PAL and SECAM use 625 lines. NTSC frame rates are slightly less than 1/2 the 60 Hz power-line frequency, while PAL and SECAM frame rates are exactly 1/2 the 50 Hz power-line frequency.
NTSC: 525 lines, 29.97 frames/s
PAL: 625 lines, 25 frames/s
SECAM: 625 lines, 25 frames/s
Color Encoding Principles for the PAL
All three systems use the same definition for luminance (Y = 0.30 R + 0.59 G + 0.11 B). The color encoding principles for the PAL system are the same as those of the NTSC system, with one minor difference: in the PAL system, the phase of the R-Y signal is reversed by 180 degrees from line to line. This reduces color errors that arise from amplitude and phase distortion of the color modulation sidebands during transmission.
Saying this more mathematically, the chrominance signal for NTSC transmission can be represented in terms of the (suitably scaled) R-Y and B-Y components as
C(t) = (B-Y) sin(ω_sc t) + (R-Y) cos(ω_sc t),
where ω_sc is the angular frequency of the color sub-carrier. The PAL signal terms its scaled B-Y component U and its scaled R-Y component V and phase-flips the V component from line to line:
C(t) = U sin(ω_sc t) ± V cos(ω_sc t).
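A tiny sketch of the PAL line-by-line V switch described above; the sub-carrier frequency used is the standard PAL value (about 4.43 MHz), which the slides do not state, and the scaling of U and V is omitted.

```python
import math

F_SC_PAL_HZ = 4.43361875e6   # standard PAL color sub-carrier frequency

def pal_chroma_sample(u, v, t, line_number):
    """PAL chrominance: the V (R-Y) component flips sign on alternate lines."""
    w = 2.0 * math.pi * F_SC_PAL_HZ
    v_sign = 1.0 if line_number % 2 == 0 else -1.0
    return u * math.sin(w * t) + v_sign * v * math.cos(w * t)

# The same (U, V) pair produces opposite V phase on consecutive lines.
print(pal_chroma_sample(0.3, 0.2, 50e-9, line_number=10))
print(pal_chroma_sample(0.3, 0.2, 50e-9, line_number=11))
```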
Color Encoding Principles for the SECAM
The SECAM system differs very strongly from PAL and NTSC. In SECAM, the R-Y and B-Y signals are transmitted alternately, one per line (the Y signal is present on every line). Since there is an odd number of lines in any given scan, any given line carries R-Y information on the first frame and B-Y on the second.
Furthermore, the R-Y and B-Y information is transmitted on different sub-carriers: the B-Y sub-carrier runs at 4.25 MHz and the R-Y sub-carrier at 4.4 MHz. In order to synchronize the line switching, alternate R-Y and B-Y sync signals are provided for nine lines during the vertical blanking interval, following the equalizing pulses after the vertical sync.
Summary
Television is the radio transmission of sound and pictures in the VHF and UHF ranges. The sound signal from a microphone is frequency-modulated. A camera converts a picture or scene into an electrical signal called the video or luminance (Y) signal, which is amplitude-modulated; vestigial-sideband AM is used to conserve spectrum space. The picture and sound carrier frequencies are spaced 4.5 MHz apart, with the sound frequency being the higher.
TV cameras use either a vacuum-tube imaging device such as a vidicon or a solid-state imaging device such as the charge-coupled device (CCD) to convert a scene into a video signal.
A scene is scanned by the imaging device to break it up into segments that can be transmitted serially. The National Television Standards Committee (NTSC) standards call for scanning the scene in two 262½-line fields, which are interlaced to form a single 525-line picture called a frame. Interlaced scanning reduces flicker. The field rate is 60 Hz, and the frame or picture rate is 30 Hz. The horizontal line scan rate is 15,734 Hz, or 63.6 µs per line.
The color in a scene is captured by three imaging devices, which break a picture down into its three basic colors of red, green, and blue using color light filters. Three color signals are developed (R, G, B). These are combined in a resistive matrix to form the Y signal and are combined in other ways to form the I and Q signals. The I and Q signals amplitude-modulate 3.58-MHz subcarriers shifted 90° from one another in balanced modulators, producing quadrature DSB suppressed-carrier signals that are added to form the composite chrominance signal. This color signal is then used to modulate the AM picture transmitter along with the Y signal.
A TV receiver is a standard superheterodyne receiver with separate sections for processing and recovering the sound and picture. The tuner section consists of RF amplifiers, mixers, and a frequency-synthesized local oscillator for channel selection. Digital infrared remote control is used to change channels in the synthesizer via a control microprocessor.
The tuner converts the TV signals to intermediate frequencies of 41.25 MHz for the sound and 45.75 MHz for the picture. These signals are amplified in IF amplifiers. The sound and picture IF signals are applied to a sound detector to form a 4.5-MHz sound IF signal. This is demodulated by a quadrature detector or other FM demodulator to recover the sound. Frequency-multiplexing techniques similar to those used in FM radio are used for stereo TV sound. The picture IF is demodulated by a diode detector or other AM demodulator to recover the Y signal.
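The 4.5 MHz sound IF is simply the difference between the picture and sound intermediate frequencies (assuming the standard values given above); a one-line check:

```python
picture_if_mhz = 45.75
sound_if_mhz = 41.25
print(picture_if_mhz - sound_if_mhz)   # 4.5 MHz intercarrier sound IF
```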
The color signals are demodulated by two balanced modulators fed with 3.58-MHz subcarriers in quadrature. The subcarrier is frequency- and phase-locked to the subcarrier in the transmitter by phase-locking to the color subcarrier burst transmitted on the horizontal blanking pulse.
To keep the receiver in step with the scanning process at the transmitter, sync pulses are transmitted along with the scanned lines of video. These sync pulses are stripped off the video detector and used to synchronize horizontal and vertical oscillators in the receiver. These oscillators generate deflection currents that sweep the electron beam in the picture tube to reproduce the picture.
The color picture tube contains three electron guns that generate narrow electron beams aimed at the phosphor coating on the inside of the face of the picture tube. The phosphor is arranged in millions of tiny red, green, and blue color dot triads or stripes. The beams excite these phosphors in proportion to their intensity, generating light of any color depending upon the amplitudes of the red, green, and blue signals. The electron beams are scanned or deflected horizontally and vertically in step with the transmitted video signals. Deflection signals from the internal sweep circuits drive coils in a deflection yoke around the neck of the picture tube, creating magnetic fields that sweep the three electron beams.
The horizontal output stage, which provides horizontal sweep, is also used to operate a flyback transformer that steps up the horizontal sync pulses to a very high voltage. These are rectified and filtered into a 30- to 35-kV voltage to operate the picture tube. The flyback also steps down the horizontal pulses and rectifies and filters them into low-voltage dc supplies that are used to operate most of the circuits in the receiver.