A worked-out example of the Slepian-Wolf theorem


Note: I first posted this on Theoretical Computer Science Stack Exchange, but deleted it from there since it seemed to be off-topic.

The Slepian-Wolf theorem states that if the output sequences of two separate sources A and B with a joint probability distribution are encoded separately at rates satisfying the following inequalities, they can be completely recovered when decoded together:

$$
R_A \geq H(A|B) \\
R_B \geq H(B|A) \\
R_A + R_B \geq H(A,B)
$$

$R_X$ refers to the number of bits required to encode one symbol of source $X$; all logarithms are taken to base 2.

Given this, I wanted to try out an example, especially because I find a fractional number of bits per symbol slightly confusing to think about.

Consider two sources $A$ and $B$ that each emit either 0 or 1 according to the following joint probability distribution:

| A \ B | 0    | 1    |
|-------|------|------|
| 0     | 0.5  | 0    |
| 1     | 0.25 | 0.25 |

We calculate the entropies as follows ($H(A|B)$ is not exact; I round it up to $7/10$):

$$
H(A,B) = 3/2 = 15/10 \\
H(A|B) \approx 0.6887 \approx 7/10 \\
H(B|A) = 1/2 = 5/10
$$
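To double-check these values, here is a small Python sketch; it is not part of the theorem, just an evaluation of the standard chain-rule identities $H(A|B) = H(A,B) - H(B)$ and $H(B|A) = H(A,B) - H(A)$ on the table above:

```python
import numpy as np

# Joint distribution P(A, B) from the table above.
P = np.array([[0.50, 0.00],
              [0.25, 0.25]])

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

H_AB = entropy(P)              # joint entropy H(A,B) = 1.5
H_A = entropy(P.sum(axis=1))   # marginal H(A) = 1.0
H_B = entropy(P.sum(axis=0))   # marginal H(B) ~ 0.8113

print("H(A,B) =", H_AB)        # 1.5
print("H(A|B) =", H_AB - H_B)  # ~0.6887, rounded up to 7/10 above
print("H(B|A) =", H_AB - H_A)  # 0.5
```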

Now, assume that A and B give out the following sequences of bits:

| A | B |
|---|---|
| 0 | 0 |
| 0 | 0 |
| 1 | 0 |
| 1 | 1 |
| 1 | 0 |
| 0 | 0 |
| 0 | 0 |
| 1 | 1 |
| 1 | 0 |
| 0 | 0 |

I should be able to find an encoding in which A sends at least 7 bits, B at least 5, and the two together at least 15, such that both sequences can be decoded completely, right?
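Spelling the budgets out for a block of $n = 10$ symbols:

$$
\lceil 10 \cdot H(A|B) \rceil = \lceil 6.887 \rceil = 7, \qquad 10 \cdot H(B|A) = 5, \qquad 10 \cdot H(A,B) = 15
$$

Note that $7 + 5 = 12 < 15$: the sum constraint is the binding one, so the two encoders cannot both sit at their conditional-entropy minimum simultaneously. The rate pair has to lie on or above the line $R_A + R_B = H(A,B)$, for example at the corner points $(10, 5)$ or $(7, 8)$ bits.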

Unfortunately, I am unable to think of an encoding in which they send fewer than 10 bits each.

For example, B would not have to send anything whenever A sends 0, since in that case B is 0 with probability 1; however, B does not know when A sends 0.
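The standard trick in the achievability proof for getting around exactly this obstacle is binning, often realized as syndrome coding: B sends only a few parity bits of its whole block, and the joint decoder, which knows A, solves for the rest. Below is a minimal sketch for this exact pair of sequences at the corner point $R_A = 1$, $R_B = 1/2$; the random matrix `H` and the little GF(2) solver are illustrative assumptions, not anything the theorem prescribes:

```python
import numpy as np

# The two length-10 sequences from the table above.
a = np.array([0, 0, 1, 1, 1, 0, 0, 1, 1, 0])
b = np.array([0, 0, 0, 1, 0, 0, 0, 1, 0, 0])

def gf2_solve(M, y):
    """Solve M x = y over GF(2) by Gaussian elimination on [M | y].
    Returns the unique solution, or None if x is not pinned down."""
    A = np.hstack([M, y.reshape(-1, 1)]) % 2
    n = M.shape[1]
    pivots, row = [], 0
    for col in range(n):
        hits = np.flatnonzero(A[row:, col])
        if hits.size == 0:
            continue
        r = hits[0] + row
        A[[row, r]] = A[[r, row]]          # swap the pivot row into place
        for rr in range(A.shape[0]):
            if rr != row and A[rr, col]:
                A[rr] ^= A[row]            # clear the column (XOR = mod-2)
        pivots.append(col)
        row += 1
    if len(pivots) < n:
        return None
    x = np.zeros(n, dtype=int)
    for i, col in enumerate(pivots):
        x[col] = A[i, -1]
    return x

# Binning matrix, fixed in advance and known to everyone. B transmits
# only the 5-bit syndrome s = H b (mod 2) instead of its 10 raw bits.
rng = np.random.default_rng(1)
H = rng.integers(0, 2, size=(5, 10))
s = (H @ b) % 2

# Joint decoder: A transmits all 10 of its bits (rate H(A) = 1).
# Under this distribution B = 0 whenever A = 0, so only the positions
# with a_i = 1 are unknown -- exactly 5 of them in this sequence.
unknown = np.flatnonzero(a == 1)
x = gf2_solve(H[:, unknown], s)  # works iff those 5 columns are invertible

if x is None:
    print("this H leaves B ambiguous; redraw H and try again")
else:
    b_hat = np.zeros_like(b)
    b_hat[unknown] = x
    print("decoded B:", b_hat, "matches:", np.array_equal(b_hat, b))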

I would also like to know whether this is the wrong way to interpret the theorem (perhaps a longer sequence is required), or whether there is another way to see it working.