EECS 150 - Components and Design Techniques for Digital Systems Lec 13 – Project Overview. David Culler, Electrical Engineering and Computer Sciences, University of California, Berkeley.


EECS Components and Design Techniques for Digital Systems Lec 13 – Project Overview David Culler Electrical Engineering and Computer Sciences University of California, Berkeley

10/12/2004 — EECS 150, Fa04, Lec 13: Project. Slide 2: Traversing Digital Design. [Course-map diagram: EE 40 → CS61C → EECS150 wks 1-6 → EECS150 wks …, "You Are Here"]

Slide 3: Caveats. Today's lecture provides an overview of the project. Lab will cover it in MUCH more detail. Where there are differences, the lab information is correct! Names of many components differ from what is used in lab, so you won't be confused…

Slide 4: Basic Pong Game. [Diagram: composite-video display, player-0 and player-1 inputs]
–Court = set of obstacles: fixed position
–Paddle = moving obstacle: position & vertical velocity, a function of the joystick: P' = f(P, j)
–Ball: 2D position & velocity [spin, acc]; bounces off walls and paddles: B' = f(B, j, C)
–Score: ball hitting sides
–Effects: display, audio, …

Slide 5: Calinx Board. [Board photo, labeled:]
–Flash Card & Micro-drive Port
–Video Encoder & Decoder
–AC'97 Codec & Power Amp
–Video & Audio Ports
–Four 100 Mb Ethernet Ports
–8 Meg x 32 SDRAM
–Quad Ethernet Transceiver
–Xilinx Virtex 2000E
–Seven Segment LED Displays
–Prototype Area

Slide 6: Add-on card. [Photo of the add-on card]

Slide 7: Project Design Problem. Map this application (the Pong game) to this technology (the Calinx board).

Slide 8: Input-Output Constraints. [Diagram: FPGA with ADV7194 composite-video output (ITU 601/656), N64 controller interfaces for the player-0 and player-1 inputs, plus switches, LEDs, LCD]
–Ball moves within a court
–Players control movement of the paddles with joysticks
–Observe the game as rendered on the video display
–Ball bounces off walls and paddles till a point is scored
–I/O devices provide design constraints

Slide 9: Input/Output Support. [Diagram: joystick interface → game physics → render engine → video stream → ADV7194 composite video, all but the encoder inside the FPGA]
–Digitize and abstract the messy analog input
–Rendering pipeline to translate display objects into a byte stream
–Off-chip device to translate the digital byte stream into composite video

Slide 10: "Physics" of the Game. [Same datapath diagram as slide 9]
–Court = set of obstacles: fixed position
–Paddle = moving obstacle: position & vertical velocity, a function of the joystick: P' = f(P, j)
–Ball: 2D position & velocity [spin, acc]; bounces off walls and paddles: B' = f(B, j, C)
–Score: ball hitting sides
–Effects: display, audio, …

Slide 11: Representing State.
–State of the game: court obstacles, paddles, ball, score
–Additional data: display blocks (paddle & ball images), numerals, frame buffer
–SDRAM holds the frame buffer: objects are rendered into it, then spooled to the video encoder
–SDRAM has a sophisticated interface: grok the data sheet, design the bus controller
–FPGA block RAM holds the board state; also registers, counters, timing sequences, controller state, FIFOs, packet buffers
[Diagram: game physics and render engine share an SDRAM controller over a 32-bit data path]

Slide 12: N64 Interface (cp 1). Continually poll the N64 and report the state of the buttons and analog joystick.
–Issue an 8-bit command; receive a 32-bit response
–Each response is a 32-bit value containing the button state and 8-bit signed horizontal and vertical velocities
–Serial interface protocol: multiple cycles to perform each transaction; bits obtained serially
–Framing (packet start/stop); bit encoding: start | data | data | stop
[Diagram: joystick interface block with 27 MHz clock, reset, start, pause, and 8-bit velocity outputs]
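Once the 32 response bits have been shifted in, the interface splits them into fields. A minimal sketch of that unpacking in Python — the exact bit layout here (buttons in the high half-word, then signed X and Y stick bytes) is an assumption; the lab spec is authoritative:

```python
def decode_n64_response(word):
    """Split a 32-bit N64 controller response into fields.

    Assumed layout: bits 31..16 are button flags, bits 15..8 and
    7..0 are the signed 8-bit horizontal/vertical stick velocities.
    """
    buttons = (word >> 16) & 0xFFFF

    def to_signed8(b):
        # Interpret an 8-bit value as two's complement.
        return b - 256 if b & 0x80 else b

    x = to_signed8((word >> 8) & 0xFF)
    y = to_signed8(word & 0xFF)
    return buttons, x, y
```

In hardware the same split is just wiring: the shift register's bit ranges feed the button and velocity outputs directly.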

Slide 13: Video Encoder (cp 2).
–Rendering engine processes display objects into the frame buffer: renders rectangles, image blocks, …
–Drive the ADV7194 video encoder device so that it outputs the correct NTSC signal
–Gain experience reading data sheets
–Dictates the 27 MHz operation rate, used throughout the graphics subsystem

Slide 14: Announcements.
–Midterm will be returned in section; solutions available on-line
–Reading: Video In a Nutshell (by Tom Oberheim) on the class web page; lab project documents (as assigned)

Slide 15: Digital Video Basics — a little detour.
–Pixel array: a digital image is represented by a matrix of values, where each value is a function of the information surrounding the corresponding point in the image. A single element in an image matrix is a picture element, or pixel. A pixel includes info for all color components.
–The array size varies for different applications and costs (some common sizes shown to the right).
–Frames: the illusion of motion is created by successively flashing still pictures called frames.

Slide 16: Refresh Rates & Scanning.
–The human perceptual system can be fooled into seeing continuous motion by flashing frames at a rate of around 20 frames/sec or higher. Much lower and the movement looks jerky and flickers.
–TV in the US uses 30 frames/second (originally derived from the 60 Hz line-current frequency).
–Images are generated on the screen of the display device by "drawing" or scanning each line of the image one after another, usually from top to bottom.
–Early display devices (CRTs) required time to get from the end of a scan line to the beginning of the next, so each line of video consists of an active video portion and a horizontal blanking interval. A vertical blanking interval corresponds to the time to return from the bottom to the top.
–In addition to the active (visible) lines of video, each frame includes a number of non-visible lines in the vertical blanking interval. The vertical blanking interval is used these days to send additional information such as closed captions and stock reports.

Slide 17: Interlaced Scanning.
–Early inventors of TV discovered that they could reduce the flicker effect by increasing the flash rate without increasing the frame rate.
–Interlaced scanning forms a complete picture, the frame, from two fields, each comprising half the scan lines. The second field is delayed half the frame time from the first.
–The first field, the odd field, displays the odd scan lines; the second, the even field, displays the even scan lines.
–Non-interlaced displays are called progressive scan.
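The field split above is easy to picture as indexing. A small sketch, treating a frame as a list of scan lines numbered from 1 (so list index 0 is scan line 1, an odd line):

```python
def split_fields(frame):
    """Split a frame (a list of scan lines) into its two fields.

    Scan lines are numbered from 1, so frame[0] is line 1 and
    belongs to the odd field; frame[1] is line 2, the even field.
    """
    odd = frame[0::2]    # lines 1, 3, 5, ...
    even = frame[1::2]   # lines 2, 4, 6, ...
    return odd, even
```

Displaying `odd` then `even` half a frame time apart doubles the flash rate without doubling the pixel rate.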

Slide 18: Pixel Components.
–A natural way to represent the information at each pixel is with the brightness of each of the primary color components: red, green, and blue (RGB). In the digital domain we could transmit one number for each of red, green, and blue intensity.
–Engineers had to deal with this issue when transitioning from black-and-white TV to color. The signal for B/W TV contains the overall pixel brightness (a combination of all color components). Rather than adding three new signals for color TV, they decided to encode the color information in two extra signals, used in conjunction with the B/W signal by color receivers and ignored by the older B/W sets.
–The color signals (components) are color differences, defined as B−Y and R−Y, where Y is the brightness signal (component). In the digital domain the three components are called: Y (luma, overall brightness), CB (chroma, B−Y), CR (chroma, R−Y).
–Note that it is possible to reconstruct the RGB representation if needed.
–One reason this representation survives today is that the human visual perceptual system is less sensitive to spatial information in chrominance than in luminance. Therefore the chroma components are usually subsampled with respect to the luma component.
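The conversion the slide describes can be written out numerically. A sketch using the BT.601 luma weights; the (B−Y) and (R−Y) scale factors shown are the common analog-form constants, and studio-range (16–235) scaling is deliberately left out to keep the example simple:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert 8-bit RGB to Y, CB, CR using the BT.601 luma weights.

    CB and CR are scaled (B-Y) and (R-Y) differences, offset by 128
    so they fit in unsigned bytes.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b   # overall brightness
    cb = 0.564 * (b - y) + 128              # scaled B-Y difference
    cr = 0.713 * (r - y) + 128              # scaled R-Y difference
    return round(y), round(cb), round(cr)
```

Note that neutral colors (R = G = B) give CB = CR = 128, which is exactly why a B/W set could ignore the two chroma signals.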

Slide 19: Chroma Subsampling.
–Variations include subsampling horizontally only, or both vertically and horizontally.
–Chroma samples are either coincident with alternate luma samples or sited halfway between alternate luma samples.
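As a concrete instance of subsampling in both directions (the 4:2:0 case used by CIF on the next slide), here is a sketch that averages each 2x2 block of a chroma plane; averaging is one common choice, not the only one:

```python
def subsample_420(chroma):
    """Downsample a chroma plane 2:1 horizontally and vertically
    (4:2:0) by averaging each 2x2 block of samples.

    `chroma` is a list of rows with even dimensions.
    """
    h, w = len(chroma), len(chroma[0])
    out = []
    for i in range(0, h, 2):
        row = []
        for j in range(0, w, 2):
            total = (chroma[i][j] + chroma[i][j + 1]
                     + chroma[i + 1][j] + chroma[i + 1][j + 1])
            row.append(total // 4)
        out.append(row)
    return out
```

The luma plane is left at full resolution; only CB and CR shrink, which is what makes the savings nearly free perceptually.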

Slide 20: Common Interchange Format (CIF). Developed for low- to medium-quality applications: teleconferencing, etc. Variations: QCIF, 4CIF, 16CIF.
Examples of component streaming:
line i: Y CR Y Y CR Y Y …
line i+1: Y CB Y Y CB Y Y …
Alternate (different packet types):
line i: Y CR Y CB Y CR Y CB Y …
line i+1: Y Y Y Y Y …
Bits/pixel: 6 components / 4 pixels → 48/4 = 12 bits/pixel. Commonly used as the output of MPEG-1 decoders.
Frame size: 352 x 288
Frame rate: 30/sec
Scan: progressive
Chroma subsampling: 4:2:0 (2:1 in both X & Y)
Chroma alignment: interstitial
Bits per component: 8
Effective bits/pixel: 12

Slide 21: ITU-R BT.601 Format. Formerly CCIR-601. Designed for digitizing broadcast NTSC (National Television System Committee) signals. Variations: 4:2:0; PAL (European) version.
Component streaming:
line i: Y CB Y CR Y CB Y CR Y …
line i+1: Y CB Y CR Y CB Y CR Y …
Bits/pixel: 4 components / 2 pixels → 40/2 = 20 bits/pixel. The Calinx board video encoder supports this format.
Frame size: 720 x 487
Frame rate: 29.97/sec
Scan: interlaced
Chroma subsampling: 4:2:2 (2:1 in X only)
Chroma alignment: coincident
Bits per component: 10
Effective bits/pixel: 20
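The per-line streaming pattern above is a simple multiplex. A sketch of building one 4:2:2 scan line's word stream, using the CB-first ordering of the BT.656 interface described two slides later; each chroma pair covers two luma samples:

```python
def interleave_422(y, cb, cr):
    """Multiplex one 4:2:2 scan line into a CB,Y,CR,Y word stream.

    `y` holds one luma sample per pixel; `cb` and `cr` hold one
    chroma sample per *pair* of pixels (2:1 subsampling in X).
    """
    stream = []
    for i in range(0, len(y), 2):
        stream += [cb[i // 2], y[i], cr[i // 2], y[i + 1]]
    return stream
```

Counting words confirms the slide's arithmetic: 4 words for every 2 pixels, i.e. 2 words/pixel, or 20 bits/pixel at 10 bits per component.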

Slide 22: Calinx Video Encoder. Analog Devices ADV7194.
–Supports multiple input formats and outputs, and slave/master operational modes
–VidFX project will use the default mode: ITU-601 as slave, s-video output
–Digital input side connected to Virtex pins; analog output side wired to on-board connectors or headers
–I²C interface for initialization, wired to the Virtex

Slide 23: ITU-R BT.656 Details. Interfacing details for ITU-601.
Pixels per line: 858
Lines per frame: 525
Frames/sec: 29.97
Pixels/sec: 13.5 M
Viewable pixels/line: 720
Viewable lines/frame: 487
With 4:2:2 chroma subsampling we need to send 2 words/pixel (1 Y and 1 C), so words/sec = 27 M; therefore the encoder runs off a 27 MHz clock. Control information (horizontal and vertical sync) is multiplexed on the data lines. Encoder data stream shown to the right.
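The 27 MHz figure falls straight out of the table. Checking the arithmetic, writing the 29.97 frame rate in its exact form 30000/1001:

```python
# Words/sec on the BT.656 interface: total (not just viewable)
# pixels per line x lines per frame x frame rate x 2 words per
# pixel (one Y word and one C word under 4:2:2).
pixels_per_line = 858
lines_per_frame = 525
frame_rate = 30000 / 1001          # the exact NTSC 29.97... Hz
words_per_sec = pixels_per_line * lines_per_frame * frame_rate * 2
```

The product is exactly 27,000,000, which is why 27 MHz shows up as the clock for the encoder and the whole graphics subsystem.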

Slide 24: ITU-R BT.656 Details. Control is provided through "End of Active Video" (EAV) and "Start of Active Video" (SAV) timing references. Each reference is a block of four words: FF, 00, 00, XY. The XY word encodes the following bits:
–F = field select (even or odd)
–V = indicates vertical blanking
–H = 1 if EAV, else 0 for SAV
The horizontal blanking section consists of a repeating pattern …
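A receiver that has just matched the FF,00,00 prefix only needs to decode the fourth word. A sketch of that decode, assuming the standard BT.656 bit placement (bit 7 fixed at 1; F, V, H in bits 6..4; error-protection bits in the low nibble, ignored here — verify against the spec/data sheet):

```python
def parse_timing_ref(xy):
    """Decode the fourth word of a BT.656 FF,00,00,XY timing
    reference into its F, V, H flags."""
    assert xy & 0x80, "bit 7 of a timing reference word must be 1"
    f = (xy >> 6) & 1   # field select
    v = (xy >> 5) & 1   # 1 during vertical blanking
    h = (xy >> 4) & 1   # 1 for EAV, 0 for SAV
    return f, v, h
```

In the FPGA this is just a comparator on the three preceding words plus direct wiring of bits 6..4.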

Slide 25: Calinx Video Decoder (not this term). Analog Devices ADV7185. Takes an NTSC (or PAL) video signal on the analog side and outputs ITU601/ITU656 on the digital side.
–Many modes and features not used by us
–VidFX project will use the default mode: no initialization needed
–Generates a 27 MHz clock synchronized to the output data
–Digital output side connected to Virtex pins; analog input side wired to on-board connectors or headers
–Camera connection through "composite video"

Slide 26: SDRAM Interface (cp 3).
–Memory protocols: bus arbitration, address phase, data phase
–DRAM is large, but has few address lines and is slow: row & column addresses, wait states
–Synchronous DRAM provides fast synchronous access to the current block: a little like a cache inside the DRAM; fast bursts of data
–Arbitration for a shared resource
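The "few address lines" point means the controller must present one flat frame-buffer address as separate bank/row/column fields across the address phases. A sketch of that split — the field widths here are illustrative, not the Calinx part's actual geometry, which comes from its data sheet:

```python
def split_sdram_address(addr, col_bits=8, row_bits=12):
    """Break a flat word address into (bank, row, column) fields.

    The column goes out with the READ/WRITE command, the row with
    the earlier ACTIVATE command; field widths are assumptions.
    """
    col = addr & ((1 << col_bits) - 1)
    row = (addr >> col_bits) & ((1 << row_bits) - 1)
    bank = addr >> (col_bits + row_bits)
    return bank, row, col
```

Burst access is fast precisely when successive addresses share a row: only the column changes, so no new ACTIVATE (and its wait states) is needed.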

Slide 27: SDRAM READ burst timing (for later). [Timing diagram]

Slide 28: Rendering Engine.
–Fed a series of display objects: obstacles, paddles, ball; each defined by a bounding box (top, bottom, left, right)
–Renders each object into the frame buffer within that box: bitblt a color for rectangles, copy a pixel image otherwise
–Must arbitrate for the SDRAM and carry out the bus protocol
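The rectangle case reduces to nested loops over the bounding box. A behavioral sketch, modeling the frame buffer as a flat array of scan lines (the hardware version generates the same address sequence as SDRAM burst writes):

```python
def fill_rect(framebuf, width, top, bottom, left, right, color):
    """Paint a solid rectangle, given by its bounding box, into a
    flat framebuffer whose scan lines are `width` pixels wide.

    Bounds follow the half-open convention: `bottom` and `right`
    are excluded.
    """
    for yline in range(top, bottom):
        base = yline * width            # start of this scan line
        for x in range(left, right):
            framebuf[base + x] = color
```

Image-block copy is the same loop with `color` replaced by a read from the source bitmap, which is why one engine handles both object kinds.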

Slide 29: Game Physics.
–Divide time into two major phases: render, and compute the new board
–The compute phase is divided into 255 ticks
–Each tick is small enough that paddles and ball can only move a small amount, which keeps the fixed-point arithmetic simple
–New paddle pos/vel is based on old pos/vel and joystick velocity
–New ball is based on old ball pos/vel and all collisions: stream all obstacles and paddles past the ball next-state logic to determine bounces
More when we look at arithmetic.
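One tick of ball motion along a single axis can be sketched in a few lines. This models positions and velocities as 8.8 fixed-point integers and handles only the wall bounce; it is a simplification of the real design, where collisions come from streaming all obstacles past the next-state logic:

```python
def ball_tick(pos, vel, court_min, court_max):
    """One physics tick for one axis in 8.8 fixed point: advance
    the ball, reflecting position and velocity off the court walls.
    """
    pos += vel
    if pos < court_min:                 # bounced off low wall
        pos = 2 * court_min - pos
        vel = -vel
    elif pos > court_max:               # bounced off high wall
        pos = 2 * court_max - pos
        vel = -vel
    return pos, vel
```

Because each tick moves the ball only a small amount, a single reflection test per tick suffices — the ball can never tunnel through a wall in one step.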

Slide 30: Network Multiplayer Game. [Diagram: two FPGA boards, each with its own SDRAM (32-bit data path), board state, and ADV7194 composite-video output, connected by a network]

Slide 31: Rendezvous & Mode of Operation.
–Player with the host device publishes channel and ID: write it on the white board
–Set DIP switches to select the channel
–Start game as host: wait for guest attach
–Start game as guest: send out an attach request
–Host computes all the game physics, with the local joystick and the remote network joystick as inputs: it receives joystick-movement packets and translates them to the equivalent of local input, determines new ball and paddle positions, and transmits court-update packets; the remote device must get fair service from the network
–Both devices render the display locally

Slide 32: Host Device (player 0). [Block diagram: N64 joystick interface (player-0 input) and a joystick decoder fed by the CC2420 network interface controller over SPI (player-1 input) feed the game physics; board state goes through a board encoder to the network and through the render engine/SDRAM control/video stream to the ADV7194]

Slide 33: Guest Device (player 1). [Block diagram: N64 joystick interface sends player-1 input through a joystick encoder and the CC2420 network interface controller over SPI; a board decoder receives board state from the network and drives the render engine/SDRAM control/video stream to the ADV7194]

Slide 34: Protocol Stacks.
–The usual case is that the MAC protocol encapsulates IP (Internet Protocol), which in turn encapsulates TCP (Transmission Control Protocol), which in turn encapsulates the application layer. Each layer adds its own headers.
–Other protocols exist for other network services (ex: printers).
–When the reliability features (retransmission) of TCP are not needed, UDP/IP is used: gaming, and other applications where reliability is provided at the application layer.
[Diagram, layers 5 down to 2: application layer (ex: http) / TCP / IP / MAC, alongside streaming (ex: MPEG-4) / UDP / IP / MAC]
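The "each layer adds its own headers" point is easy to make concrete at the application layer. A sketch of packing one joystick-update payload — the field layout (sequence number, 16-bit button mask, signed stick deltas) is hypothetical, not the project's defined packet format:

```python
import struct

def pack_joystick_packet(seq, buttons, dx, dy):
    """Pack a (hypothetical) joystick-update payload: a 16-bit
    sequence number, 16-bit button mask, and signed 8-bit stick
    deltas, all big-endian.  The lower layers then prepend their
    own UDP, IP, and MAC headers around this payload.
    """
    return struct.pack(">HHbb", seq, buttons, dx, dy)
```

UDP fits this traffic because a newer joystick sample simply supersedes a lost one — retransmitting stale input would be worse than dropping it.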

Slide 35: Standard Hardware Network Interface. Usually divided into three hardware blocks (application-level processing could be either hardware or software):
–MAG: "magnetics" chip, a transformer providing electrical isolation
–PHY: provides serial/parallel and parallel/serial conversion and encodes the bit stream for the Ethernet signaling convention; drives/receives analog signals to/from the MAG; recovers the clock signal from the data input
–MAC: media-access-layer processing; processes Ethernet frames (preambles, headers), computes the CRC to detect errors on receive and to complete packets for transmission; buffers (stores) data for/from the application level
–Application-level interface: could be a standard bus (ex: PCI) or designed specifically for the application-level hardware
–MII is an industry standard for connecting the PHY to the MAC
[Diagram: Ethernet connection → MAG (transformer) → PHY (Ethernet signal) → MII → MAC (MAC-layer processing) → application-level interface]
Calinx has no MAC chip, so the MAC must be handled in the FPGA. You have met Ethernet; IEEE 802.15.4 will look similar… yet different.
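Of the MAC's jobs, the CRC is the most algorithmic. A behavioral sketch of the CRC-32 the Ethernet frame check sequence uses, in its reflected (bit-reversed) form with polynomial constant 0xEDB88320; a hardware MAC computes the same function with an LFSR, one or more bits per clock:

```python
def crc32_ethernet(data):
    """Bitwise CRC-32 over `data` (bytes), Ethernet/IEEE 802.3
    parameters: reflected polynomial 0xEDB88320, initial value and
    final XOR both all-ones."""
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # Shift one bit; XOR in the polynomial on a 1 carry-out.
            crc = (crc >> 1) ^ (0xEDB88320 if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF
```

On receive, the MAC runs the same computation over the incoming frame and flags a mismatch with the appended FCS as an error.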

Slide 36: Frames. [Frame-format figure]

Slide 37: Packet Protocols.
–Framing definitions: IEEE 802.15.4
–Packet formats: request game, joystick packet, board packet

Slide 38: Schedule of Checkpoints.
–CP1: N64 interface (this week)
–CP2: digital video encoder (week 8)
–CP3: SDRAM controller (two parts, weeks 9-10)
–CP4: IEEE 802.15.4 (CC2420) interface (weeks 11-12), unless we bail out to Ethernet; overlaps with midterm II
–Project CP: game engine (weeks 13-14)
–Endgame: 11/29 early checkoff; 12/6 final checkoff; 12/10 project report due; 12/15 midterm III