
1 Student: Omar Jarushi, Instructor: Dr. Asaf Cohen

2 OVERVIEW
 Introduction
 Theoretical background
 Random binning simulation
 A simple one-source, three-sink network
 A gossip algorithm with correlated data
   1. Design
   2. Simulations

3 BACKGROUND AND MOTIVATION
To date, most algorithms for information dissemination assume the data available at the nodes is independent; when correlation exists, it is usually ignored. However, new algorithms for distributed source coding suggest that taking the correlation into account can result in significantly lower rates. The cost: a moderate price in decoding complexity.

4 ABOUT OUR WORK
 Explore the applicability of distributed source coding techniques to existing data dissemination protocols (our case study: gossip with correlated data).
 Implement network coding instead of store-and-forward mechanisms, which can provide substantial benefits in distributed networks.
PROJECT GOAL
o Design and analyze a communication protocol for data dissemination under the assumption of correlated data, using encoding and decoding at the relevant nodes.

5 POTENTIAL APPLICATIONS
1. Ecology and environmental monitoring. For example: a weather measurement system based on a wireless sensor network.

6 POTENTIAL APPLICATIONS
2. Content distribution. For example: BitTorrent in a P2P network, which simultaneously disseminates different fragments of a file among peers.
Initial problems:
> The probability of acquiring a novel fragment decreases with the number of fragments already collected.
> As the number of peers increases, it becomes harder to optimally schedule the distribution of fragments to receivers.
Solution: Microsoft Secure Content Distribution (MSCD), also known as Avalanche, is an example of a P2P system that attempts to alleviate these problems using network coding.

7 DISTRIBUTED SOURCE CODING (ENCODING OF CORRELATED SOURCES)
To encode a source X, any rate R > H(X) is sufficient (by the AEP data-compression theorem).
Suppose we have two sources (X, Y) ~ p(x, y). If we encode them together, a rate H(X, Y) is sufficient. But if X and Y must be described separately for some user, a total rate R = Rx + Ry > H(X) + H(Y) is sufficient, although wasteful.
Slepian and Wolf showed that a total rate R = H(X, Y) is sufficient, even with separate encoding.
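The gap between the naive sum rate H(X) + H(Y) and the Slepian-Wolf sum rate H(X, Y) can be checked numerically. A minimal sketch, assuming an illustrative pair of correlated binary sources (X uniform, Y equal to X flipped with probability f):

```python
import math

def H(*ps):
    """Shannon entropy (bits) of a distribution given as probabilities."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Assumed joint distribution: X ~ Bernoulli(0.5), Y = X flipped w.p. f = 0.1.
f = 0.1
joint = {(0, 0): 0.5 * (1 - f), (0, 1): 0.5 * f,
         (1, 0): 0.5 * f,       (1, 1): 0.5 * (1 - f)}

H_joint = H(*joint.values())                       # H(X, Y)
H_X = H(sum(p for (x, _), p in joint.items() if x == 0),
        sum(p for (x, _), p in joint.items() if x == 1))
H_Y = H(sum(p for (_, y), p in joint.items() if y == 0),
        sum(p for (_, y), p in joint.items() if y == 1))

print(f"H(X) + H(Y) = {H_X + H_Y:.3f} bits")       # naive separate encoding
print(f"H(X, Y)     = {H_joint:.3f} bits")         # Slepian-Wolf sum rate
```

For f = 0.1 the joint entropy is about 1.47 bits per symbol pair, well below the 2 bits of fully separate descriptions.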

8 SLEPIAN-WOLF THEOREM
Consider the following figure for the distributed source coding problem: (X, Y) are drawn i.i.d. ~ p(x, y); X enters an encoder at rate R1, Y enters a separate encoder at rate R2, and a joint decoder reconstructs both.
 Theorem: The achievable rate region is given by
R1 >= H(X|Y),
R2 >= H(Y|X),
R1 + R2 >= H(X, Y).
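As a quick numeric sanity check of a corner point of this region, assume (as in the simulations later) that X is uniform binary and Y is X observed through a BSC with crossover probability f; then H(X|Y) = H(Y|X) = h(f) and H(X, Y) = 1 + h(f):

```python
import math

def h(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

# Assumed model: X ~ Bernoulli(0.5), Y = X through a BSC(f).
f = 0.1
R1_min = h(f)            # H(X|Y) for this symmetric source
R2_min = h(f)            # H(Y|X)
sum_rate = 1 + h(f)      # H(X, Y) = H(X) + H(Y|X)

# The corner point (R1, R2) = (h(f), 1) achieves the minimum sum rate:
print(R1_min, 1.0, R1_min + 1.0, sum_rate)
```

The corner point is exactly the side-information setting used in the binning simulations: Y is sent losslessly (rate 1), and X is described at only h(f) bits per symbol.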

9 NETWORK CODING
o With routing, we do not code data content inside the network: an intermediate node that receives X and Y can only forward X or Y.
o With network coding, we can code the data content: the node can forward X + Y.
o To achieve bandwidth optimality, coding needs to be employed at the intermediate nodes.

10 NETWORK CODING (BUTTERFLY NETWORK)
Consider the butterfly network in the figure: a source S, intermediate nodes 1-4, and sinks t1, t2. With routing, the shared middle link can carry only one of the two bits at a time. With network coding, the two bits are encoded into their mod-2 sum on the shared link, and each sink can recover both bits.
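The butterfly argument fits in a few lines of code. A minimal sketch (the bits a and b are illustrative; node labels follow the slide's figure):

```python
# Butterfly network: source S delivers bit a to t1 via node 1 and bit b to
# t2 via node 2; the shared middle link (node 3 -> node 4) carries a XOR b.
a, b = 1, 0

coded = a ^ b            # node 3 XORs the two incoming bits (mod-2 sum)

# Sink t1 holds a directly and receives the coded bit on the shared path:
b_at_t1 = a ^ coded
# Sink t2 holds b directly and receives the coded bit on the shared path:
a_at_t2 = b ^ coded

print(b_at_t1, a_at_t2)  # both sinks recover both bits
```

One XOR on the bottleneck link replaces two routed transmissions, which is exactly the bandwidth gain the slide claims.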

11 GOSSIP ALGORITHMS
 Extremely simple.
 Distributed approach.
 Robust against network dynamics.
 Efficient in resource utilization.
 Asynchronous information exchange protocol.
o Used in networks where nodes lack knowledge of the topology and which are likely to:
> exhibit unpredictable dynamics;
> face stringent resource constraints.

12 GOSSIP ALGORITHMS (AN EXAMPLE)
o Assume the simple three-node network in the figure (nodes 1, 2, 3).
o Assume nodes in the network have only local information.
o Communication between nodes proceeds in time steps called rounds.

13 BINNING WITH MULTIPLE DECODERS
 First, recall random binning with a single sink: the encoder randomly partitions the source sequences into bins and sends only the bin index; the decoder uses its side information to pick out the correct sequence inside the bin.
 With multiple sinks, the quantity of information required by each target depends on the level of correlation between the source sequence and that target's side-information sequence.
 Hence the description is done by nested random binning.
 To prevent redundancy, the encoder builds one nested binning scheme for use with all the targets.

14 BINNING WITH MULTIPLE DECODERS (CONT.)
 An alternative encoding method: send a set of "binary result" equations, i.e. linear combinations of the bin-index bits, instead of sending the whole bin index as in traditional Slepian-Wolf encoding.
 Suppose we are interested in transmitting the source sequence towards two targets, where the first target needs 4 bins and the second target needs 8 bins. Consider the scheme below.
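A minimal sketch of such "binary result" equations, assuming (as the later slides describe) that each equation is a random GF(2) linear combination of the bin-index bits; function names and the example index are illustrative:

```python
import random

def binary_result_equation(index_bits, rng):
    """One 'binary result' equation: a random GF(2) coefficient vector
    together with its inner product (mod 2) with the bin-index bits."""
    coeffs = [rng.randrange(2) for _ in index_bits]
    result = 0
    for c, b in zip(coeffs, index_bits):
        result ^= c & b                 # inner product over GF(2)
    return coeffs, result

rng = random.Random(1)
bin_index = [1, 0, 1]          # a 3-bit index (8 bins), illustrative

# Target 1 needs 4 bins -> 2 index bits -> 2 equations;
# target 2 needs 8 bins -> 3 index bits -> 3 equations.
eqs_t1 = [binary_result_equation(bin_index, rng) for _ in range(2)]
eqs_t2 = [binary_result_equation(bin_index, rng) for _ in range(3)]
print(eqs_t1)
print(eqs_t2)
```

The point of the construction is that a single stream of equations serves both targets: the weakly correlated sink simply keeps listening for more of them.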

15 RANDOM BINNING AND INDEPENDENT EQUATIONS
The figure below represents binning into 8 bins and the binary representation of the bin indices.
o Note that the bin we receive is not necessarily the one "intended" to be sent, but:
 it has the correct size;
 it contains the true source sequence.

16 RANDOM BINNING SIMULATION
There is one source and one destination.
o Uniformly distributed source, i.e. p(X[n]=0) = p(X[n]=1) = 0.5.
o Build a random binning scheme at the source and reveal it to the destination.
o Encoding: choose a typical sequence X^n at random and transmit its bin index.
o Decoding: the destination has a side-information sequence Y^n; it receives the bin index and looks for a typical sequence within this specific bin. (If there is more than one candidate, declare an error.)
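A runnable sketch of this experiment (parameter values, the typicality threshold, and the function name are illustrative choices, not the project's exact settings):

```python
import itertools
import random

def simulate(N=10, num_bins=64, f=0.05, trials=200, seed=0):
    """Random-binning decoding error rate with BSC(f) side information."""
    rng = random.Random(seed)
    seqs = list(itertools.product([0, 1], repeat=N))
    bins = {s: rng.randrange(num_bins) for s in seqs}   # random binning
    threshold = int(1.5 * f * N) + 1                    # "typical" distance
    errors = 0
    for _ in range(trials):
        x = tuple(rng.randrange(2) for _ in range(N))   # uniform source
        y = tuple(b ^ (rng.random() < f) for b in x)    # BSC side info
        idx = bins[x]                                   # transmitted index
        # Decoder: search the received bin for sequences close to y.
        candidates = [s for s in seqs
                      if bins[s] == idx
                      and sum(a != b for a, b in zip(s, y)) <= threshold]
        if candidates != [x]:
            errors += 1       # ambiguous bin, or true sequence not typical
    return errors / trials

print(simulate())
```

Exhaustively enumerating all 2^N sequences keeps the sketch simple; it is only feasible for small N, which is why real implementations decode algebraically instead.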

17 RANDOM BINNING SIMULATION
o N - sequence length.
o f - BSC error probability (a BSC between X^n as input and Y^n as output).
Construct the different bins and inspect the probability of error in decoding as the value of f changes.
[Figure: probability of error in decoding as a function of f.]
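The trend behind the sweep can be predicted from the Slepian-Wolf bound: with BSC(f) side information the decoder's residual uncertainty is H(X^n | Y^n) = N * h(f) bits, so the bin-index rate must grow with f. A small sketch of that calculation (the values of N and f are illustrative):

```python
import math

def h(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

# Minimum bin-index length (bits) needed for reliable decoding, per the
# Slepian-Wolf bound R >= H(X|Y) = h(f); grows monotonically with f.
N = 100
for f in (0.01, 0.05, 0.1, 0.2):
    print(f, round(N * h(f), 1))
```

For a fixed number of bins, increasing f therefore pushes the operating point toward (and past) the bound, which is where the decoding error probability climbs.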

18 ONE SOURCE, THREE SINKS SIMULATION
Consider the following scheme, which describes an initial stage of the information dissemination process in the discussed network.
o The source creates random "binary result" equations based on the longest bin index, chooses a random neighbor, and sends it a linear combination of the bin index it holds. And so on.
o A sink collects independent equations until it owns its requested quantity (sink i needs a number of independent equations determined by its correlation level).
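The sink-side bookkeeping amounts to tracking the GF(2) rank of the received coefficient vectors. A minimal sketch, assuming equations arrive as random nonzero bit vectors (rows are stored as Python ints; names are illustrative):

```python
import random

def rank_increases(basis, row):
    """Reduce row against the basis; add it as a new pivot if it survives.
    basis maps a pivot bit position -> the stored row with that leading bit."""
    for pivot_bit in sorted(basis, reverse=True):
        if row >> pivot_bit & 1:
            row ^= basis[pivot_bit]     # eliminate that pivot from row
    if row == 0:
        return False                    # linearly dependent, no new info
    basis[row.bit_length() - 1] = row   # store under its leading bit
    return True

def collect(index_len, needed, seed=0):
    """Count equations a sink hears before holding `needed` independent ones."""
    rng = random.Random(seed)
    basis, received = {}, 0
    while len(basis) < needed:
        received += 1
        rank_increases(basis, rng.randrange(1, 1 << index_len))
    return received

print(collect(index_len=8, needed=8))   # a few redundant receptions expected
```

The gap between `received` and `needed` is exactly the redundancy overhead the round-count figures on the next slides measure.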

19 ONE SOURCE, THREE SINKS SIMULATION
The following two figures show how many rounds it takes for each sink to receive its whole set of needed independent equations, as the source sequence length N grows; the latter value is also shown (the low columns).
o The figure below shows the case where X[n] differs from Yi[n] with probabilities 0.01, 0.02 and 0.03, respectively.

20 ONE SOURCE, THREE SINKS SIMULATION
o The following figure shows the case where X[n] differs from Yi[n] with probabilities 0.1, 0.2 and 0.3, respectively.

21 ADDITIONAL SIMULATION NETWORKS
[Figures: Tree Network; Random Network (60 nodes); Random Network]

