
1 EECS 150 - Components and Design Techniques for Digital Systems
Lec 13 – Project Overview
David Culler
Electrical Engineering and Computer Sciences, University of California, Berkeley
http://www.eecs.berkeley.edu/~culler
http://www-inst.eecs.berkeley.edu/~cs150

2 Traversing Digital Design (10/12/2004, EECS 150, Fa04, Lec 13 - Project)
[Figure: course roadmap — EE 40 → CS61C → EECS150 wks 1-6 → EECS150 wks 6-15, with a "You Are Here" marker]

3 Caveats
Today's lecture provides an overview of the project. Lab will cover it in MUCH more detail.
Where there are differences, the lab information is correct!
Names of many components are different from what is used in lab, so you won't be confused…

4 Basic Pong Game
[Figure: game display with score 17-9; composite video output; player-0 and player-1 inputs]
Court = set of obstacles
– fixed position
Paddle = moving obstacle
– Position & vertical velocity
– Function of joystick
– P' = f(P, j)
Ball
– 2D position & velocity
– [spin, acc]
– Bounces off walls and paddles
– B' = f(B, j, C)
Score
– Ball hitting sides
Effects
– Display, audio, …

5 Calinx Board
[Figure: board photo with labeled components]
Flash Card & Micro-drive Port
Video Encoder & Decoder
AC '97 Codec & Power Amp
Video & Audio Ports
Four 100 Mb Ethernet Ports
8 Meg x 32 SDRAM
Quad Ethernet Transceiver
Xilinx Virtex 2000E
Seven Segment LED Displays
Prototype Area

6 Add-on card
[Figure: photo of the add-on card]

7 Project Design Problem
Map this application (the Pong game) onto this technology (the Calinx board).

8 Input-Output Constraints
[Figure: FPGA with ADV7194 video encoder (composite video out via ITU 601/656), 8-bit N64 controller interfaces for the player-0 and player-1 inputs, plus switches, LEDs, and an LCD]
The ball moves within a court
Players control the movement of the paddles with joysticks
Players observe the game as rendered on a video display
The ball bounces off walls and paddles until a point is scored
The I/O devices provide the design constraints

9 Input/Output Support
[Figure: block diagram — inside the FPGA, a Joystick Interface block, Game Physics block, Render Engine, and Video Stream block; the N64 controller interface carries the player-0 and player-1 inputs in; the ADV7194 converts the ITU 601/656 byte stream to composite video out]
Digitize and abstract the messy analog input
Rendering pipeline translates display objects into a byte stream
Off-chip device translates the digital byte stream into composite video

10 "Physics" of the Game
Court = set of obstacles
– fixed position
Paddle = moving obstacle
– Position & vertical velocity
– Function of joystick
– P' = f(P, j)
Ball
– 2D position & velocity
– [spin, acc]
– Bounces off walls and paddles
– B' = f(B, j, C)
Score
– Ball hitting sides
Effects
– Display, audio, …

11 Representing State
State of the game
– Court obstacles
– Paddles
– Ball
– Score
Additional data
– Display blocks
» Paddle & ball images
– Numerals
– Frame buffer
SDRAM holds the frame buffer
– Rendered to the frame buffer
– Spooled to the video encoder
SDRAM has a sophisticated interface
– Grok the data sheet, design a bus controller
FPGA block RAM holds the board state
– also registers, counters, …
– Timing sequence, controller state
– FIFOs, packet buffers

12 N64 Interface (checkpoint 1)
Continually poll the N64 and report the state of the buttons and analog joystick
– Issue an 8-bit command
– Receive a 32-bit response
Each response is a 32-bit value containing the button state and 8-bit signed horizontal and vertical velocities
Serial interface protocol
– Multiple cycles to perform each transaction
– Bits obtained serially
– Framing (packet start/stop)
– Bit encoding
» start | data | data | stop
[Figure: joystick interface block with clock (27 MHz), reset, start, pause, and 8-bit velocity signals, and a DQ line to the controller]
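Unpacking the 32-bit response can be sketched in software. The field positions below (buttons in the high 16 bits, then the two signed 8-bit axes) are an assumption for illustration; the lab spec defines the actual layout.

```python
def decode_n64_response(resp: int) -> dict:
    """Unpack a 32-bit N64 controller response.

    Assumed layout (check the lab spec): high 16 bits = button state,
    next 8 bits = signed horizontal velocity, low 8 bits = signed
    vertical velocity.
    """
    def to_signed8(b: int) -> int:
        # Reinterpret an unsigned byte as two's-complement signed.
        return b - 256 if b >= 128 else b

    return {
        "buttons": (resp >> 16) & 0xFFFF,
        "x_velocity": to_signed8((resp >> 8) & 0xFF),
        "y_velocity": to_signed8(resp & 0xFF),
    }

# Example: buttons 0x1234, x velocity -1 (0xFF), y velocity +16 (0x10)
state = decode_n64_response(0x1234FF10)
```

In the actual checkpoint this unpacking happens in a shift register clocked by the serial protocol; the dictionary here just names the fields.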

13 Video Encoder (checkpoint 2)
The rendering engine processes display objects into the frame buffer
– Renders rectangles, image blocks, …
Drive the ADV7194 video encoder device so that it outputs correct NTSC video
Gain experience reading data sheets
The encoder dictates the 27 MHz operation rate
– Used throughout the graphics subsystem

14 Announcements
Midterm will be returned in section
Solutions available on-line
Reading:
– Video In a Nutshell (by Tom Oberheim), on the class web page
– Lab project documents (as assigned)

15 Digital Video Basics – a little detour
Pixel Array:
– A digital image is represented by a matrix of values, where each value is a function of the information surrounding the corresponding point in the image. A single element in an image matrix is a picture element, or pixel. A pixel includes info for all color components.
– The array size varies for different applications and costs; some common sizes were shown on the slide.
Frames:
– The illusion of motion is created by successively flashing still pictures called frames.

16 Refresh Rates & Scanning
The human perceptual system can be fooled into seeing continuous motion by flashing frames at a rate of around 20 frames/sec or higher.
– Much lower and the movement looks jerky and flickers.
TV in the US uses 30 frames/second (originally derived from the 60 Hz line-current frequency).
Images are generated on the screen of the display device by "drawing" or scanning each line of the image one after another, usually from top to bottom.
Early display devices (CRTs) required time to get from the end of a scan line to the beginning of the next. Therefore each line of video consists of an active video portion and a horizontal blanking interval. A vertical blanking interval corresponds to the time to return from the bottom to the top.
– In addition to the active (visible) lines of video, each frame includes a number of non-visible lines in the vertical blanking interval.
– The vertical blanking interval is used these days to send additional information such as closed captions and stock reports.

17 Interlaced Scanning
Early inventors of TV discovered that they could reduce the flicker effect by increasing the flash rate without increasing the frame rate.
Interlaced scanning forms a complete picture, the frame, from two fields, each comprising half the scan lines. The second field is delayed half the frame time from the first.
The first field, the odd field, displays the odd scan lines; the second, the even field, displays the even scan lines.
Non-interlaced displays are called progressive scan.
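The odd/even field split above can be shown in a few lines. A minimal sketch, treating a frame as a list of scan lines numbered from 1:

```python
def split_fields(frame_lines):
    """Split a frame (list of scan lines, line 1 first) into its two
    interlaced fields: the odd field holds lines 1, 3, 5, ... and the
    even field holds lines 2, 4, 6, ..."""
    odd_field = frame_lines[0::2]   # lines 1, 3, 5, ...
    even_field = frame_lines[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

odd, even = split_fields(["L1", "L2", "L3", "L4", "L5"])
```

Displaying `odd` then, half a frame time later, `even` doubles the flash rate without doubling the data rate, which is the whole point of interlacing.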

18 Pixel Components
A natural way to represent the information at each pixel is with the brightness of each of the primary color components: red, green, and blue (RGB).
– In the digital domain we could transmit one number for each of the red, green, and blue intensities.
Engineers had to deal with this issue when transitioning from black-and-white TV to color. The signal for black-and-white TV contains the overall pixel brightness (a combination of all color components).
– Rather than adding three new signals for color TV, they decided to encode the color information in two extra signals, to be used in conjunction with the B/W signal by color receivers and ignored by the older B/W sets.
The color signals (components) are color differences, defined as B−Y and R−Y, where Y is the brightness signal (component).
In the digital domain the three components are called:
– Y: luma, overall brightness
– CB: chroma, B−Y
– CR: chroma, R−Y
Note that it is possible to reconstruct the RGB representation if needed.
One reason this representation survives today is that the human visual perceptual system is less sensitive to spatial detail in chrominance than in luminance. Therefore the chroma components are usually subsampled with respect to the luma component.
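The component definitions above can be written out directly. This sketch uses the BT.601 luma weights (0.299, 0.587, 0.114); the chroma scale factors are the standard analog ones and are included as an assumption — digital formats add further offset and scaling on top of these.

```python
def rgb_to_ycbcr(r, g, b):
    """Convert normalized RGB (0.0-1.0) to luma Y and the two
    color-difference chroma components CB and CR."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # overall brightness
    cb = 0.564 * (b - y)                   # scaled B - Y
    cr = 0.713 * (r - y)                   # scaled R - Y
    return y, cb, cr

# A grey input has zero chroma: when R = G = B, both B - Y and R - Y vanish,
# which is exactly why a B/W set could ignore the two color signals.
y, cb, cr = rgb_to_ycbcr(0.5, 0.5, 0.5)
```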

19 Chroma Subsampling
Variations include subsampling horizontally only, or both vertically and horizontally.
Chroma samples are either coincident with alternate luma samples or sited halfway between alternate luma samples.

20 Common Interchange Format (CIF)
Developed for low- to medium-quality applications: teleconferencing, etc.
Variations:
– QCIF, 4CIF, 16CIF
Examples of component streaming:
– line i:   Y CR Y Y CR Y Y …
– line i+1: Y CB Y Y CB Y Y …
Alternate (different packet types):
– line i:   Y CR Y CB Y CR Y CB Y …
– line i+1: Y Y Y Y Y …
Bits/pixel:
– 6 components / 4 pixels
– 48/4 = 12 bits/pixel
Example: commonly used as the output of MPEG-1 decoders.

Frame size: 352 x 288
Frame rate: 30/sec
Scan: progressive
Chroma subsampling: 4:2:0 (2:1 in both X & Y)
Chroma alignment: interstitial
Bits per component: 8
Effective bits/pixel: 12
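The bits-per-pixel arithmetic above generalizes to any subsampling scheme; a small helper makes the 4:2:0 and 4:2:2 cases explicit:

```python
def bits_per_pixel(luma_samples, chroma_samples_each, pixels,
                   bits_per_component=8):
    """Average bits per pixel for a block of `pixels` pixels carrying
    `luma_samples` Y samples plus `chroma_samples_each` samples of
    each of CB and CR."""
    components = luma_samples + 2 * chroma_samples_each
    return components * bits_per_component / pixels

# CIF / 4:2:0: a 2x2 block of 4 pixels carries 4 Y + 1 CB + 1 CR
# -> 6 components * 8 bits / 4 pixels = 12 bits/pixel
cif_bpp = bits_per_pixel(luma_samples=4, chroma_samples_each=1, pixels=4)

# BT.601 / 4:2:2 (next slide): 2 pixels carry 2 Y + 1 CB + 1 CR at
# 10 bits/component -> 40 bits / 2 pixels = 20 bits/pixel
bt601_bpp = bits_per_pixel(luma_samples=2, chroma_samples_each=1,
                           pixels=2, bits_per_component=10)
```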

21 ITU-R BT.601 Format
Formerly CCIR-601. Designed for digitizing broadcast NTSC (National Television System Committee) signals.
Variations:
– 4:2:0
– PAL (European) version
Component streaming:
– line i:   Y CB Y CR Y CB Y CR Y …
– line i+1: Y CB Y CR Y CB Y CR Y …
Bits/pixel:
– 4 components / 2 pixels
– 40/2 = 20 bits/pixel
The Calinx board video encoder supports this format.

Frame size: 720 x 487
Frame rate: 29.97/sec
Scan: interlaced
Chroma subsampling: 4:2:2 (2:1 in X only)
Chroma alignment: coincident
Bits per component: 10
Effective bits/pixel: 20

22 Calinx Video Encoder
Analog Devices ADV7194. Supports:
– Multiple input formats and outputs
– Operational modes, slave/master
– VidFX project will use the default mode: ITU-601 as slave, s-video output
Digital input side connected to Virtex pins.
Analog output side wired to on-board connectors or headers.
I2C interface for initialization:
– Wired to Virtex.

23 ITU-R BT.656 Details
Interfacing details for ITU-601.

Pixels per line: 858
Lines per frame: 525
Frames/sec: 29.97
Pixels/sec: 13.5 M
Viewable pixels/line: 720
Viewable lines/frame: 487

With 4:2:2 chroma subsampling we need to send 2 words per pixel (1 Y and 1 C), so words/sec = 27 M; therefore the encoder runs off a 27 MHz clock.
Control information (horizontal and vertical sync) is multiplexed onto the data lines.
[Figure: encoder data-stream diagram]
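The 27 MHz figure follows directly from the table; a quick check, keeping the NTSC frame rate as the exact ratio 30000/1001 so the arithmetic stays in integers:

```python
# Sanity-check the BT.656 clock derivation.
pixels_per_line = 858      # total, including horizontal blanking
lines_per_frame = 525      # total, including vertical blanking

# NTSC frame rate is 30000/1001 ~= 29.97 Hz; multiply before dividing
# so the result stays exact.
pixels_per_sec = pixels_per_line * lines_per_frame * 30000 // 1001

# 4:2:2 -> one Y word and one C word per pixel, so the word rate
# (and hence the encoder clock) is twice the pixel rate.
words_per_sec = pixels_per_sec * 2
```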

24 ITU-R BT.656 Details (continued)
Control is provided through "End of Active Video" (EAV) and "Start of Active Video" (SAV) timing references.
Each reference is a block of four words: FF, 00, 00, XY. The final word encodes the following bits:
– F = field select (even or odd)
– V = indicates vertical blanking
– H = 1 for EAV, 0 for SAV
The horizontal blanking section consists of the repeating pattern 80 10 80 10 …
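Decoding the final word of a timing reference is a matter of picking out three bits. The bit positions below (bit 7 always 1, then F, V, H, with the low four bits as error-protection bits) follow the usual BT.656 layout; verify against the spec before relying on them.

```python
def decode_timing_ref(xy: int) -> dict:
    """Decode the fourth word of a BT.656 timing reference (FF 00 00 XY).

    Assumed bit layout: bit 7 is always 1, bit 6 = F (field select),
    bit 5 = V (vertical blanking), bit 4 = H (1 for EAV, 0 for SAV);
    the low four bits are error-protection bits, ignored here.
    """
    return {
        "field": (xy >> 6) & 1,
        "vblank": (xy >> 5) & 1,
        "is_eav": bool((xy >> 4) & 1),
    }

# 0xB6 = 1011 0110: field 0, during vertical blanking, an EAV code
# (the low nibble is the protection bits).
ref = decode_timing_ref(0xB6)
```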

25 Calinx Video Decoder (not this term)
Analog Devices ADV7185. Takes an NTSC (or PAL) video signal on the analog side and outputs ITU-601/ITU-656 on the digital side.
– Many modes and features not used by us.
– VidFX project will use the default mode: no initialization needed.
Generates a 27 MHz clock synchronized to the output data.
Digital output side connected to Virtex pins.
Analog input side wired to on-board connectors or headers.
Camera connection through "composite video".

26 SDRAM Interface (checkpoint 3)
Memory protocols
– Bus arbitration
– Address phase
– Data phase
DRAM is large, but has few address lines and is slow
– Row & column address
– Wait states
Synchronous DRAM provides fast synchronous access to the current block
– A little like a cache inside the DRAM
– Fast burst of data
Arbitration for a shared resource
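The "few address lines" point is worth making concrete: the same pins carry the row address and then the column address on successive cycles. A minimal sketch of the split the controller must perform (the 8-bit column width is an assumption; the real widths come from the SDRAM data sheet):

```python
def split_address(addr: int, col_bits: int = 8):
    """Split a flat address into the row and column halves that are
    multiplexed onto the DRAM address pins (row first, then column)."""
    row = addr >> col_bits
    col = addr & ((1 << col_bits) - 1)
    return row, col

def reassemble(row: int, col: int, col_bits: int = 8) -> int:
    """Inverse of split_address, for checking."""
    return (row << col_bits) | col

row, col = split_address(0x1234)   # row 0x12, column 0x34
```

Burst access amortizes the cost: after one row/column setup, consecutive columns in the open row stream out one per clock, which is what makes frame-buffer spooling feasible.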

27 SDRAM READ Burst Timing (for later)
[Figure: SDRAM read-burst timing diagram]

28 Rendering Engine
Fed a series of display objects
– Obstacles, paddles, ball
– Each defined by a bounding box
» Top, bottom, left, right
Renders each object into the frame buffer within that box
– Bitblt a color for rectangles
– Copy a pixel image
Must arbitrate for the SDRAM and carry out the bus protocol
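The rectangle case of the render step can be sketched as a bounding-box fill. The top/left-inclusive, bottom/right-exclusive convention here is an assumption; the project spec defines the real one.

```python
def fill_rect(frame, top, bottom, left, right, color):
    """Render a solid rectangle into a frame buffer (a list of rows),
    writing `color` to every pixel inside the bounding box."""
    for y in range(top, bottom):
        for x in range(left, right):
            frame[y][x] = color

# 4x8 frame buffer; paint a 2x3 paddle with its top-left at (row 1, col 2).
frame = [[0] * 8 for _ in range(4)]
fill_rect(frame, top=1, bottom=3, left=2, right=5, color=7)
```

In the real design each inner-loop write becomes an SDRAM transaction (or part of a burst), which is why the engine must win arbitration for the memory before it can render.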

29 Game Physics
Divide time into two major phases
– Render
– Compute new board
The compute phase is divided into 255 ticks
Each tick is small enough that the paddles and ball can only move a small amount
– Makes fixed-point arithmetic easy
New paddle pos/vel is based on old pos/vel and joystick velocity
New ball is based on old ball pos/vel and all collisions
– Stream all obstacles and paddles past the ball next-state logic to determine bounces
More when we look at arithmetic
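A single paddle tick, P' = f(P, j), can be sketched as follows. This is a minimal illustration — the joystick velocity becomes the paddle velocity and the position advances by one tick's worth, clamped to the court; the real update rule and court bounds come from the project spec.

```python
def paddle_tick(pos, vel, joystick_vel, top=0, bottom=480):
    """One physics tick for a paddle: new velocity comes from the
    joystick, new position advances by that velocity, clamped so the
    paddle stays inside the (assumed) court bounds."""
    new_vel = joystick_vel
    new_pos = max(top, min(bottom, pos + new_vel))
    return new_pos, new_vel

pos, vel = paddle_tick(pos=100, vel=0, joystick_vel=-3)   # moves up by 3
pos2, _ = paddle_tick(pos=1, vel=0, joystick_vel=-5)      # clamped at the top
```

Because each tick moves things by at most a few units, collision checks only ever need to consider the ball's immediate neighborhood, which is what keeps the fixed-point next-state logic simple.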

30 Network Multiplayer Game
[Figure: two complete game systems — each an FPGA with SDRAM control, board state, and an ADV7194 composite-video output — connected by a network]

31 Rendezvous & Mode of Operation
The player with the host device publishes a channel and ID
– Write it on the white board
Set DIP switches to select the channel
Start the game as host
– Wait for guest attach
Start the game as guest
– Send out an attach request
The host computes all the game physics
– Local joystick and remote network joystick as inputs
» Receive joystick-movement packets and translate them to the equivalent of local input
– Determines the new ball and paddle positions
» Transmits court-update packets
– The remote device on the network must receive fair service
Both devices render the display locally

32 Host Device (player 0)
[Figure: host block diagram — Game Physics, Render Engine, SDRAM Control, Joystick Interface, and Video Stream to the ADV7194 (ITU 601/656); a CC2420 network interface controller on SPI with a board encoder and joystick decoder; local player-0 input, remote player-1 input]

33 Guest Device (player 1)
[Figure: guest block diagram — Render Engine, SDRAM Control, Joystick Interface, and Video Stream to the ADV7194 (ITU 601/656); a CC2420 network interface controller on SPI with a board decoder and joystick encoder; local player-1 input; board state received from the host]

34 Protocol Stacks
The usual case is that the MAC protocol encapsulates IP (Internet Protocol), which in turn encapsulates TCP (Transmission Control Protocol), which in turn encapsulates the application layer. Each layer adds its own headers.
Other protocols exist for other network services (ex: printers).
When the reliability features (retransmission) of TCP are not needed, UDP/IP is used: gaming and other applications where reliability is provided at the application layer.
– Web stack (ex: http): application layer / TCP / IP / MAC (layers 5 / 4 / 3 / 2)
– Streaming stack (ex: MPEG-4): application layer / UDP / IP / MAC (layers 5 / 4 / 3 / 2)

35 Standard Hardware Network Interface
Usually divided into three hardware blocks. (Application-level processing could be either hardware or software.)
– MAG. "Magnetics" chip: a transformer providing electrical isolation.
– PHY. Provides serial/parallel and parallel/serial conversion and encodes the bit stream for the Ethernet signaling convention. Drives/receives analog signals to/from the MAG. Recovers the clock signal from the data input.
– MAC. Media-access-layer processing. Processes Ethernet frames: preambles, headers; computes the CRC to detect errors when receiving and to complete a packet for transmission. Buffers (stores) data for/from the application level.
Application-level interface
– Could be a standard bus (ex: PCI)
– or designed specifically for the application-level hardware.
MII is an industry standard for connecting the PHY to the MAC.
Calinx has no MAC chip; the MAC must be handled in the FPGA.
You have met Ethernet. IEEE 802.15.4 will look similar… yet different.
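The CRC the MAC computes can be sketched in software. Below is a bitwise CRC-32 using the Ethernet polynomial in its reflected form (0xEDB88320), with the standard initial value and final complement; a real MAC computes the same function in hardware, one bit (or byte) per clock, as the frame streams through.

```python
def crc32_ethernet(data: bytes) -> int:
    """Bitwise CRC-32 as used for the Ethernet frame check sequence:
    reflected polynomial 0xEDB88320, initial value 0xFFFFFFFF,
    final one's-complement."""
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xEDB88320  # divide by the polynomial
            else:
                crc >>= 1
    return crc ^ 0xFFFFFFFF

# The standard CRC-32 check value for the ASCII string "123456789".
check = crc32_ethernet(b"123456789")   # 0xCBF43926
```

The receiver runs the same computation over the incoming frame and compares; any mismatch means bits were corrupted in transit and the frame is dropped.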

36 802.15.4 Frames
[Figure: IEEE 802.15.4 frame formats]

37 Packet Protocols
Framing definitions
– IEEE 802.15.4
Packet formats
– Request game
– Joystick packet
– Board packet

38 Schedule of Checkpoints
CP1: N64 interface (this week)
CP2: Digital video encoder (week 8)
CP3: SDRAM controller (two parts, weeks 9-10)
CP4: IEEE 802.15.4 (CC2420) interface (weeks 11-12)
– unless we bail out to Ethernet
– Overlaps with midterm II
Project CP: game engine (weeks 13-14)
Endgame
– 11/29: early checkoff
– 12/6: final checkoff
– 12/10: project report due
– 12/15: midterm III

