The Transmission-Switching Duality of Communication Networks
Tony T. Lee, Department of Information Engineering

Shannon’s Legacy: The Mathematical Parallels Between Packet Switching and Information Transmission. Tony T. Lee, Department of Information Engineering, The Chinese University of Hong Kong, December 2009

Claude Shannon, ‘A mathematical theory of communication’, Bell System Technical Journal, 1948

Reliable Communication. Circuit switching network: reliable communication requires noise-tolerant transmission. Packet switching network: reliable communication requires both noise-tolerant transmission and contention-tolerant switching.

Quantization of Communication Systems. Transmission, from analog channel to digital channel: the sampling theorem of bandlimited signals (Whittaker, 1915; Nyquist, 1928; Kotelnikov, 1933; Shannon, 1948). Switching, from circuit switching to packet switching: doubly stochastic traffic matrix decomposition (Hall, 1935; Birkhoff-von Neumann, 1946).

Comparison of Transmission and Switching. Shannon’s general communication system: a temporal information source, a function f(t) of time t, passes from source to transmitter to channel (capacity C, corrupted by a noise source) to receiver to destination. Clos network C(m,n,k): a spatial information source, a function f(i) of space i = 0, 1, …, N-1, passes through k input modules (n×m), m central modules (k×k), and k output modules (m×n); the channel capacity is m, and the impairment is internal contention.

Compare Apple with Orange. Apple: 350 mg Vitamin C, 1.5 g sugar per 100 g. Orange: 500 mg Vitamin C, 2.5 g sugar per 100 g.

Overview: communication channel concepts and their Clos network counterparts. Noisy channel capacity theorem ↔ random routing; noisy channel coding theorem ↔ deflection routing; error-correcting code ↔ route assignment; sampling theorem ↔ BvN decomposition; noiseless channel ↔ path switching; noiseless coding theorem ↔ scheduling.

Contents: Duality of Noise and Contention; Deflection Routing and Noisy Channel Coding; Route Assignment and Error-Correcting Code; Noiseless Channel Model of Clos Network; Traffic Matrix Decomposition and Sampling Theorem; Scheduling and Noiseless Channel Coding

Duality of Transmission and Switching. Transmission channel with noise: source information is a function of time; errors are corrected by providing more signal space; noise is tamed by error-correcting codes. Packet switching with contention: source information f(i) is a function of space; errors are corrected by providing more time; contention is tamed by delay (buffering or deflection). [Figure: a connection request f(i) = j alongside a transmitted message, with delay due to buffering or deflection.]

Output Contention and Carried Load. Nonblocking N×N switch with uniformly distributed destination addresses; ρ is the offered load and ρ' the carried load. The difference between offered load and carried load reflects the degree of contention.

Proposition on Signal Power of a Switch (V. Beneš, 1963). The energy of a connecting network is the number of calls in progress (the carried load). The signal power Sp of an N×N crossbar switch is the number of packets carried by its outputs, and the noise power is Np = N - Sp; their ratio defines the pseudo signal-to-noise ratio (PSNR).

Boltzmann Statistics. Output ports play the role of particles and each packet is an energy quantum: the energy level of an output is the number of packets destined for it, and ni is the number of outputs at energy level i. Since packets are distinguishable, the total number of micro states W can be counted subject to the constraint N = n0 + n1 + … + nr. [Figure: micro-state examples on a 7-output switch, e.g. n0 = 5, n1 = 2 and n0 = 5, n1 = 1, n2 = 1.]

Boltzmann Statistics (cont’d). Start from the Boltzmann entropy equation S = C log W, where S is the entropy, W the number of states, and C the Boltzmann constant. Maximize the entropy by Lagrange multipliers, use Stirling’s approximation for the factorials, and take derivatives with respect to ni to obtain the most probable distribution.

Boltzmann Statistics (cont’d). If the offered load on each input is ρ, then under the uniform loading condition the probability that i packets are destined for a given output follows a Poisson distribution, and the carried load of an output follows accordingly.
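The formulas lost from this slide can be reconstructed from the standard binomial-to-Poisson argument (a sketch under the uniform-loading assumption, with N inputs each offering load ρ and uniform destinations):

```latex
P(i) \;=\; \binom{N}{i}\Big(\frac{\rho}{N}\Big)^{i}\Big(1-\frac{\rho}{N}\Big)^{N-i}
\;\xrightarrow{\;N\to\infty\;}\; \frac{\rho^{i}e^{-\rho}}{i!},
\qquad
\rho' \;=\; 1-P(0) \;=\; 1-e^{-\rho}.
```

The carried load ρ' is the probability that an output is busy, i.e. receives at least one packet.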

Contention as Pseudo Gaussian Noise. Let Xi = 1 if output i is busy and Xi = 0 otherwise. The signal power Sp = X0 + X1 + … + X(N-1) is a sum of i.i.d. random variables, and the noise power is Np = N - Sp. By the central limit theorem both are approximately normal: Sp has mean E[Sp] = Nρ' and variance Var[Sp] = Nρ'(1 - ρ'); Np has mean E[Np] = N(1 - ρ') and variance Var[Np] = Nρ'(1 - ρ').
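The busy-output statistics above are easy to check by simulation. A minimal Monte Carlo sketch (function name and parameters are hypothetical, not from the slides), comparing the empirical carried load against the exact finite-N value 1 - (1 - ρ/N)^N:

```python
import random

def simulate_carried_load(N, rho, trials, seed=0):
    """Monte Carlo estimate of the carried load rho' of an N x N
    nonblocking switch: each input sends a packet with probability rho
    to a uniformly random output; Sp = number of busy outputs."""
    rng = random.Random(seed)
    busy_total = 0
    for _ in range(trials):
        busy = set()
        for _ in range(N):
            if rng.random() < rho:          # this input offers a packet
                busy.add(rng.randrange(N))  # uniform destination
        busy_total += len(busy)             # signal power Sp of this slot
    return busy_total / (trials * N)        # average fraction of busy outputs
```

For N = 32 and ρ = 0.8 the estimate lands close to the exact value 1 - (1 - 0.8/32)^32 ≈ 0.555.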

Clos Network C(m,n,k). Input stage: k modules of size n×m; middle stage: m modules of size k×k; output stage: k modules of size m×n. Slepian-Duguid condition: m ≥ n. For a destination address D, write D = nQ + R, where Q = ⌊D/n⌋ is the output module in the output stage and R = D mod n is the output link in that module; if G is the central module assigned to the connection, the routing tag is (G, Q, R).
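The routing-tag arithmetic is mechanical; a minimal sketch (the function name is hypothetical):

```python
def routing_tag(D, n, G):
    """Routing tag (G, Q, R) for destination D in a Clos network C(m, n, k):
    Q = D // n is the output module, R = D % n is the link inside it,
    and G is the central module assigned to this connection."""
    return (G, D // n, D % n)
```

For example, with n = 4, destination D = 13 through central module G = 2 gives the tag (2, 3, 1): output link 1 of output module 3.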

Connection Matrix. [Figure: an example set of call requests on an 8×8 Clos network and the corresponding connection matrix between input and output modules.]

Clos Network as a Noisy Channel. A source state is a perfect matching, and central modules are randomly assigned to input packets. The offered load on each input link of a central module (nρ/m under random routing), the carried load ρ' on each output link of a central module, and the resulting pseudo signal-to-noise ratio (PSNR) follow as before.

Noisy Channel Capacity Theorem. Capacity of the additive white Gaussian noise channel: the maximum data rate C that can be sent through a channel subject to Gaussian noise is C = W log2(1 + S/N), where C is the channel capacity in bits per second, W the bandwidth of the channel in hertz, and S/N the signal-to-noise ratio.
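The Shannon-Hartley formula is a one-liner; a minimal sketch (function name hypothetical):

```python
import math

def awgn_capacity(W_hz, snr):
    """Shannon-Hartley capacity C = W log2(1 + S/N), in bits per second,
    of a bandlimited AWGN channel with bandwidth W_hz and linear SNR."""
    return W_hz * math.log2(1.0 + snr)
```

For instance, a 3000 Hz channel at S/N = 255 (about 24 dB) supports 3000 · log2(256) = 24000 bit/s.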

Tradeoff between Bandwidth and PSNR: circuit switching with nonblocking routing versus packet switching with random routing.

Clos network ↔ communication channel: contention plays the role of noise, and routing plays the role of coding.

Contents: Duality of Noise and Contention; Deflection Routing and Noisy Channel Coding; Route Assignment and Error-Correcting Code; Noiseless Channel Model of Clos Network; Traffic Matrix Decomposition and Sampling Theorem; Scheduling and Noiseless Channel Coding

Clos Network with Deflection Routing. Route the packets in C(n,n,k) and C(k,k,n) alternately. Encoding output port addresses in C(n,n,k): destination D = nQ1 + R1, with output module number Q1 = ⌊D/n⌋ and output port number R1 = D mod n. Encoding output port addresses in C(k,k,n): destination D = kQ2 + R2. The routing tag is (Q1, R1, Q2, R2).

Example of Deflection Routing. Routing code in C(3,3,2): D = 3Q1 + R1, so destinations 0-5 map to (Q1,R1) = (0,0), (0,1), (0,2), (1,0), (1,1), (1,2). Routing code in C(2,2,3): D = 2Q2 + R2, so destinations 0-5 map to (Q2,R2) = (0,0), (0,1), (1,0), (1,1), (2,0), (2,1). [Figure: a packet routed alternately through C(3,3,2) and C(2,2,3).]

Analysis of Deflection Clos Network. Markov chain of deflection routing in the C(n,n,n) network: at each stage a packet advances with probability of success p and is deflected with probability q = 1 - p.

Loss Probability Versus Network Length. [Figure: loss probability on a log scale (10^-2 down to 10^-8) against network length 10 to 60.]

Loss Probability versus Network Length. The loss probability of the deflection Clos network decreases exponentially with the network length.

Shannon’s Noisy Channel Coding Theorem. Given a noisy channel with information capacity C and information transmitted at rate R: if R < C, there exists a coding technique that allows the probability of error at the receiver to be made arbitrarily small; if R > C, the probability of error at the receiver cannot be made arbitrarily small.

Binary Symmetric Channel. The binary symmetric channel (BSC) with crossover probability q = 1 - p < ½ has capacity C = 1 - H(q) = 1 + q log2 q + (1 - q) log2(1 - q). There exist encoding E and decoding D functions such that if the rate R = k/n = C - δ for some δ > 0, the error probability is exponentially small; if R = k/n = C + δ for some δ > 0, the error probability is bounded away from zero.
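The BSC capacity C = 1 - H(q) from the slide can be sketched directly (function name hypothetical):

```python
import math

def bsc_capacity(q):
    """Capacity C = 1 - H(q) of a binary symmetric channel with
    crossover probability q, in bits per channel use."""
    if q in (0.0, 1.0):
        return 1.0  # deterministic channel: full capacity
    h = -q * math.log2(q) - (1 - q) * math.log2(1 - q)  # binary entropy H(q)
    return 1.0 - h
```

The capacity vanishes at q = ½ (the output is independent of the input) and, for example, is roughly 0.5 bit per use at q ≈ 0.11.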

Parallels Between Noise and Contention. Binary symmetric channel ↔ deflection Clos network: crossover probability q < ½ ↔ deflection probability q < ½; random coding ↔ deflection routing; R ≤ C ↔ R ≤ n; exponential error probability ↔ exponential loss probability; complexity increases with code length n ↔ complexity increases with network length L; typical-set decoding ↔ equivalent set of outputs.

Contents: Duality of Noise and Contention; Deflection Routing and Noisy Channel Coding; Route Assignment and Error-Correcting Code; Noiseless Channel Model of Clos Network; Traffic Matrix Decomposition and Sampling Theorem; Scheduling and Noiseless Channel Coding

Hall’s Marriage Theorem. Let G be a bipartite graph with input set VI, output set VO, and edge set E. There exists a perfect matching f: VI → VO if and only if for every subset A ⊆ VI, |NA| ≥ |A|, where NA = {b | (a,b) ∈ E, a ∈ A} ⊆ VO is the neighborhood of A. [Figure: a subset A with |A| = 2 and |NA| = 4.]

Edge Coloring of a Bipartite Graph. A regular bipartite graph G with vertex degree m satisfies Hall’s condition: let A ⊆ VI be a set of inputs with neighborhood NA = {b | (a,b) ∈ E, a ∈ A}; since the m|A| edges incident to A must terminate in NA at the other end, m|NA| ≥ m|A|, and so |NA| ≥ |A|.
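Hall’s condition can be checked directly on small graphs by enumerating subsets (exponential brute force, for illustration only; the function name and the dictionary representation are hypothetical):

```python
from itertools import combinations

def hall_condition(neighbors):
    """Check |N(A)| >= |A| for every nonempty subset A of inputs.
    `neighbors` maps each input vertex to its set of output neighbors."""
    inputs = list(neighbors)
    for r in range(1, len(inputs) + 1):
        for A in combinations(inputs, r):
            NA = set().union(*(neighbors[a] for a in A))
            if len(NA) < len(A):
                return False  # subset A cannot all be matched
    return True
```

A 2-regular bipartite graph passes the check, as the slide argues; a graph in which two inputs share a single output fails.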

Route Assignment in Clos Network. [Figure: route assignment for a permutation on an 8×8 Clos network.] Computation of the routing tags (G,Q,R) for the permutation S = 0 1 2 3 4 5 6 7 → D = 1 3 2 0 6 4 7 5, with a central module G assigned to each request.

Rearrangeable Clos Network and Channel Coding. Theorem (Slepian-Duguid): every Clos network with m ≥ n is rearrangeably nonblocking; the bipartite graph with degree n can be edge colored by m colors if m ≥ n, so there is a route assignment for any permutation. Compare Shannon’s noisy channel coding theorem: it is possible to transmit information without error up to a limit C.

Gallager Codes. Low-density parity-check codes (Gallager, 1960); bipartite graph representation (Tanner, 1981); approaching the Shannon limit (Richardson, 1999). VL: n variable nodes; VR: m constraint nodes. [Figure: a Tanner graph in which the checks x1+x3+x4+x7 = 1 and x0+x3+x4+x6 = 1 are unsatisfied while x0+x1+x2+x5 = 0 and x2+x5+x6+x7 = 0 are satisfied; the code is closed under modulo-2 addition.]
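Evaluating the parity checks in the figure is a mod-2 syndrome computation; a minimal sketch (the function name and the small check matrix are illustrative, not the slide’s code):

```python
def syndrome(H, x):
    """Syndrome of word x under parity-check matrix H (lists of 0/1 rows).
    A check is satisfied exactly when its entry in the syndrome is 0."""
    return [sum(h * b for h, b in zip(row, x)) % 2 for row in H]
```

A word is a codeword precisely when its syndrome is all zeros; flipping one bit of a codeword makes every check containing that bit unsatisfied, which is what iterative decoders exploit.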

Expander Codes. Expander graph G(VL, VR, E): VL is k-regular with |VL| = n; VR is 2k-regular with |VR| = n/2. If there exists α > 0 such that every S ⊂ VL with |S| < αn has |NS| > (k/2)|S|, then distance(C(G)) > αn. Decoding algorithm (Sipser-Spielman, 1995): if there is a vertex v ∈ VL such that most of its neighboring checks are unsatisfied, flip the value of v; repeat. The algorithm can remove up to (αn)/2 errors.

Benes Network. [Figure: bipartite graph G(VL, VR, E) of call requests on an 8×8 Benes network.] Input module constraints: x1 + x2 = 1, x3 + x4 = 1, x5 + x6 = 1, x7 + x8 = 1. Output module constraints: x1 + x3 = 1, x6 + x8 = 1, x4 + x7 = 1, x2 + x5 = 1. Unlike a linear code, this constraint set is not closed under addition.

Bipartite Matching and Route Assignments. [Figure: call requests on an 8×8 network reduced to bipartite matching and edge coloring between four input modules and four output modules.]

Flip Algorithm. Assign x1 = 0, x2 = 1, x3 = 0, x4 = 1, … so that all input module constraints are satisfied initially. Unsatisfied vertices divide each cycle into segments; label the segments α and β alternately and flip the values of all variables in the α segments. [Figure: input module constraints, output module constraints, and the variables between them.]

Final Route Assignments. Apply the algorithm iteratively to complete the route assignments. [Figure: the completed assignment on the 8×8 network.]

Contents: Duality of Noise and Contention; Deflection Routing and Noisy Channel Coding; Route Assignment and Error-Correcting Code; Noiseless Channel Model of Clos Network; Traffic Matrix Decomposition and Sampling Theorem; Scheduling and Noiseless Channel Coding

Concept of Path Switching. Like a traffic signal at a crossroads: use predetermined conflict-free states in a cyclic manner; the duration of each state in a cycle is determined by the traffic loading; control is distributed. [Figure: north-south traffic load 2ρ and east-west load ρ sharing one cycle.]

Path Switching of Clos Network. [Figure: two predetermined connection patterns of an 8×8 Clos network, used alternately in time slot 1 and time slot 2.]

Capacity of a Virtual Path. The capacity of a virtual path equals the average number of edges per time slot: if connection pattern G1 is used in time slot 0 and G2 in time slot 1, the virtual path is given by the union G1 ∪ G2 averaged over the cycle.

Noiseless Virtual Path of Clos Network. The input module is an input-queued switch, the central module a nonblocking switch, and the output module an output-queued switch, with a predetermined connection pattern in every time slot. A virtual path of rate λij runs from input module i to output module j, with a buffer and scheduler at each end: scheduling combats channel noise, while buffering combats source noise.

Contents: Duality of Noise and Contention; Deflection Routing and Noisy Channel Coding; Route Assignment and Error-Correcting Code; Noiseless Channel Model of Clos Network; Traffic Matrix Decomposition and Sampling Theorem; Scheduling and Noiseless Channel Coding

Complexity Reduction of Permutation Space. The capacity matrix lies in the convex hull of permutation matrices (the doubly stochastic matrices) and can be expanded over a subspace spanned by K base states {Pi}, reducing the complexity of the permutation space from N! to K. Here K ≤ min{F, N²-2N+2}, the base dimension of C; K ≤ F ≤ (BN)/m if C is bandlimited with cij ≤ B/F; and K ≤ F can be a constant independent of N if a round-off error of order 1/F is acceptable.
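The expansion over permutation matrices is the Birkhoff-von Neumann decomposition. A toy greedy sketch for small N (brute-force search over permutations; the real construction uses Hall’s matching theorem rather than enumeration, and the function name is hypothetical):

```python
from itertools import permutations

def bvn_decompose(C, tol=1e-9):
    """Greedy Birkhoff-von Neumann decomposition of a doubly stochastic
    matrix C into weighted permutations: C = sum phi_i * P_i.
    Returns a list of (weight, permutation) pairs."""
    N = len(C)
    C = [row[:] for row in C]  # work on a copy
    terms = []
    while True:
        best = None
        for perm in permutations(range(N)):
            # largest weight this permutation can carry
            w = min(C[i][perm[i]] for i in range(N))
            if w > tol and (best is None or w > best[1]):
                best = (perm, w)
        if best is None:
            return terms
        perm, w = best
        terms.append((w, perm))
        for i in range(N):
            C[i][perm[i]] -= w  # peel the permutation off
```

Birkhoff’s theorem guarantees the loop terminates with at most N²-2N+2 terms, matching the bound on K in the slide.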

Parallels Between Sampling Theorems (packet switching ↔ digital transmission). Network environment: time-slotted switching system ↔ time-slotted transmission system. Bandwidth limitation: capacity-limited traffic matrix ↔ bandlimited signal function. Samples: complete matchings, i.e. (0,1) permutation matrices ↔ entropy, (0,1) binary sequences. Expansion: Birkhoff decomposition (Hall’s matching theorem) ↔ Fourier series.

Parallels Between Sampling Theorems (cont’d). Inversion by weighted sum of samples: reconstruction of the capacity by running sum ↔ reconstruction of the signal by interpolation. Complexity reduction: reduce the number of permutations from N! to O(N²), to O(N) if bandwidth is limited, and to a constant F if a truncation error of order O(1/F) is acceptable ↔ reduce the infinite-dimensional signal space to 2tW dimensions in any duration t. QoS: buffering and scheduling, capacity guarantee, delay bound ↔ pulse code modulation (PCM), error-correcting codes, data compression, DSP.

Contents: Duality of Noise and Contention; Deflection Routing and Noisy Channel Coding; Route Assignment and Error-Correcting Code; Noiseless Channel Model of Clos Network; Traffic Matrix Decomposition and Sampling Theorem; Scheduling and Noiseless Channel Coding

Source Coding and Scheduling. Source coding: a mapping from the code book to source symbols to reduce redundancy. Scheduling: a mapping from predetermined connection patterns to incoming packets to reduce delay jitter.

Smoothness of Scheduling. Schedule the set of permutation matrices generated by the decomposition. The sequence t1, t2, …, tni of inter-state distances of state Pi within a period of F satisfies t1 + t2 + … + tni = F, and the smoothness of state Pi with frame size F is defined from these inter-state distances.

Entropy of Decomposition and Smoothness of Scheduling. Any scheduling of the capacity decomposition satisfies an entropy inequality (Kraft’s inequality); equality holds when the inter-state distances of each state are equal.

Smoothness of Scheduling: a Special Case. If K = F, Φi = 1/F, and ni = 1 for all i, then the schedule is perfectly smooth for all i = 1, …, F. [Figure: another example comparing the smoothness of an input set P1-P4 with the expected optimal scheduling result.]

Smoothness of optimal scheduling versus random scheduling: the Kullback-Leibler distance reaches its maximum in the random case, and it is always possible to devise a scheduling within 1/2 of the entropy.

Noiseless Coding Theorem. The necessary and sufficient condition to prefix encode values x1, x2, …, xN of X with respective lengths n1, n2, …, nN is Kraft’s inequality, Σi 2^(-ni) ≤ 1. Any prefix code that assigns ni bits to xi has average length at least the entropy H(X), and it is always possible to devise a prefix code within 1 bit of the entropy.
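Kraft’s condition is a simple sum to verify; a minimal sketch (function name hypothetical):

```python
def kraft_sum(lengths):
    """Kraft sum of a set of codeword lengths: a prefix code with these
    lengths exists if and only if the sum is at most 1."""
    return sum(2.0 ** -n for n in lengths)
```

Lengths (1, 2, 2) exactly fill the budget (codewords 0, 10, 11), while (1, 1, 2) overshoot it, so no prefix code with those lengths exists.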

Weighted Fair Queueing (WFQ) Scheduling Algorithm. Select the state with the smallest finish time and increase its finish time by the inverse of its weight; repeat this process until the frame size is reached. [Figure: selection among states P1-P5 with their finish times; the final sequence is P1 P1 P1 P1 P2 P3 P4 P5.]
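The selection rule above fits in a few lines. A minimal sketch (function name hypothetical; ties are assumed to break toward the lower-indexed state, and the weights below are the frame shares 4/8 and 1/8 implied by the slide’s example):

```python
def wfq_schedule(weights, frame):
    """WFQ over states P1..Pk: in every slot pick the state with the
    smallest finish time, then advance its finish time by 1/weight."""
    finish = [1.0 / w for w in weights]
    seq = []
    for _ in range(frame):
        i = min(range(len(weights)), key=lambda j: (finish[j], j))
        seq.append(i)                    # state P(i+1) fires this slot
        finish[i] += 1.0 / weights[i]
    return seq
```

With weights (0.5, 0.125, 0.125, 0.125, 0.125) and frame size 8, this reproduces the slide’s sequence P1 P1 P1 P1 P2 P3 P4 P5, i.e. WFQ bunches the heavy state at the start of the frame.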

WF2Q Scheduling Algorithm. WF2Q incorporates the rate requirement of generalized processor sharing (GPS) into WFQ. Let Ti(τ) be the number of time slots assigned to state Pi up to time τ; in every time slot τ = 1, 2, … WF2Q selects, among the states that are eligible under GPS, the state Pi with the smallest finish time. [Figure: the same example as WFQ; the final sequence is P1 P2 P1 P3 P1 P4 P1 P5.]
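A minimal sketch of the slot-based rule (function name hypothetical; the eligibility test Ti < Φi·τ, meaning a state may not run ahead of its GPS share, is the standard WF2Q condition and is assumed here since the slide’s formula image is lost):

```python
def wf2q_schedule(weights, frame):
    """WF2Q: among states eligible under GPS (service so far T_i below
    the GPS share phi_i * tau), pick the smallest virtual finish time."""
    k = len(weights)
    T = [0] * k  # slots granted to each state so far
    seq = []
    for tau in range(1, frame + 1):
        eligible = [i for i in range(k) if T[i] < weights[i] * tau]
        i = min(eligible, key=lambda j: ((T[j] + 1) / weights[j], j))
        seq.append(i)
        T[i] += 1
    return seq
```

On the same weights (0.5, 0.125, 0.125, 0.125, 0.125) this yields the interleaved sequence P1 P2 P1 P3 P1 P4 P1 P5 of the slide, which is visibly smoother than the WFQ output.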

Huffman Round Robin (HuRR) Algorithm. Step 1: initially let the root be the temporary node Px, and let S = Px…Px be the temporary sequence. Step 2: apply WFQ to the two successors of Px to produce a sequence T, and substitute T for the subsequence Px…Px of S. Step 3: if there is no intermediate node left in the sequence S, terminate; otherwise select an intermediate node Px appearing in S and go to Step 2. [Figure: a Huffman tree with leaf weights P1 = 0.5 and P2-P5 = 0.125 each, and internal nodes PX = PY = 0.25, PZ = 0.5 under the root.] The logarithm of the inter-state time equals the length of the Huffman code.

Comparison of Scheduling Algorithms. [Table: average smoothness of Random, WFQ, WF2Q, and HuRR scheduling against the entropy bound over a range of loads; performance improves from Random to WFQ to WF2Q to HuRR, with HuRR closest to the entropy.]

Entropy of Capacity Matrix C = [cij]. The entropy of input module i defines the input entropies, the entropy of output module j defines the output entropies, and together they define the entropy of the capacity matrix C.

Entropy of Capacity Matrix (cont’d). Given the capacity matrix C = [cij], the input and output entropies determine the entropy of the capacity matrix C.

Two-Dimensional Smoothness. The inter-token times satisfy a frame constraint, where nij is the number of tokens assigned to virtual path Vij within a frame F; these inter-token times define the smoothness of virtual path Vij.

Two-Dimensional Smoothness (cont’d). The smoothness of input module i gives the input smoothness, the smoothness of output module j gives the output smoothness, and together they give the 2D smoothness of the scheduling.

Theorem: for any 2D scheduling of the capacity matrix decomposition, Kraft’s matrix Kr = [2^(-dij)] is doubly sub-stochastic: its row sums are at most 1 for each input module i = 1, 2, …, N and its column sums are at most 1 for each output module j = 1, 2, …, N. The equalities hold when the inter-token times are uniform, for i, j = 1, 2, …, N and k = 1, 2, …, nij.

2D Smoothness of WFQ. Decomposition of matrix C with frame size 8, scheduled by WFQ over P1-P5; a, b, c, d are the tokens assigned to inputs 1-4. [Figure: the resulting input and output smoothness.]

2D Smoothness of HuRR. The HuRR scheduling algorithm yields improved smoothness. [Figure: the HuRR schedule and its input and output smoothness.]

Improving 2D Smoothness. A decomposition with fewer matrices (P1-P4), scheduled by HuRR, improves the input smoothness for input 3 and the output smoothness for output 4.

The Transmission-Switching Dual of Communication. Clos network ↔ communication system; permutation matrix ↔ Boltzmann equation S = k log W and entropy; route assignment ↔ noisy channel and channel coding; Hall’s matching theorem (BvN decomposition) ↔ bandlimited sampling theorem; scheduling and buffering ↔ source coding.

Conclusion: it is the law of probability. The input signal to a transmission system is a function of time, and the main theorem on noisy channel coding is proved by the law of large numbers. The input signal to a switching system is a function of space, and both theorems, on deflection routing and on the smoothness of scheduling, are proved by randomness.


“The universe is built on a plan the profound symmetry of which is somehow present in the inner structure of our intellect.” - Paul Valéry (1871-1945)