EE 477 HW6 - Introduction to Information Theory



1. Consider the probability distribution p(x, y) for x, y ∈ {0, 1}:

$$
p(x, y) =
\begin{cases}
1/6, & x = 0,\ y = 0, \\
0,   & x = 1,\ y = 0, \\
1/6, & x = 0,\ y = 1, \\
2/3, & x = 1,\ y = 1.
\end{cases}
\tag{1}
$$

Find

        (a) H(X), H(Y),

        (b) H(X|Y), H(Y|X),

        (c) H(X, Y),

        (d) I(X; Y).
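For a quick numerical sanity check of (a)-(d), the joint distribution in (1) can be tabulated directly. A minimal Python sketch (a check, not the requested derivation):

```python
from math import log2

# Joint distribution from (1), indexed as p[(x, y)].
p = {(0, 0): 1/6, (1, 0): 0.0, (0, 1): 1/6, (1, 1): 2/3}
px = {x: sum(p[x, y] for y in (0, 1)) for x in (0, 1)}   # marginal of X
py = {y: sum(p[x, y] for x in (0, 1)) for y in (0, 1)}   # marginal of Y

def H(dist):
    """Entropy in bits; the filter implements the convention 0 log 0 = 0."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

Hxy = H(p)                          # (c) joint entropy H(X, Y)
print(H(px), H(py))                 # (a) H(X), H(Y)
print(Hxy - H(py), Hxy - H(px))     # (b) H(X|Y) = H(X,Y) - H(Y), and H(Y|X)
print(H(px) + H(py) - Hxy)          # (d) I(X;Y) = H(X) + H(Y) - H(X,Y)
```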

    2. Show that the following expressions hold, where X, Y, Z are jointly distributed random variables:

        (a) H(X, Y|Z) ≥ H(X|Z),

        (b) I(X, Y; Z) ≥ I(X; Z),

        (c) H(X, Y, Z) - H(X, Y) ≤ H(X, Z) - H(X),

        (d) I(X; Z|Y) ≥ I(Z; Y|X) - I(Z; Y) + I(X; Z).
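A one-step hint for (a), using only the chain rule and the nonnegativity of conditional entropy:

H(X, Y|Z) = H(X|Z) + H(Y|X, Z) ≥ H(X|Z),

since H(Y|X, Z) ≥ 0. The remaining parts follow from analogous chain-rule expansions; for (d) in particular, expanding I(X, Y; Z) in both possible orders is a natural starting point.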

3. Assume that we have four symbols, namely a, b, c, d. These symbols are mapped to the codewords 10, 00, 11, 110, respectively.

    (a) Does the given code have the prefix property?

    (b) Is this a uniquely decodable code? Prove it.
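Part (b) can be checked mechanically with the Sardinas-Patterson test before writing the proof. A minimal sketch, applied to the code {10, 00, 11, 110} from the problem:

```python
def dangling(a_set, b_set):
    """Suffixes left over when a word in a_set is a proper prefix of one in b_set."""
    return {b[len(a):] for a in a_set for b in b_set
            if b.startswith(a) and len(b) > len(a)}

def uniquely_decodable(code):
    """Sardinas-Patterson: UD iff no dangling suffix is itself a codeword."""
    code = set(code)
    suffixes = dangling(code, code)          # initial dangling suffixes
    while True:
        new = dangling(suffixes, code) | dangling(code, suffixes)
        if new <= suffixes:                  # closure reached
            break
        suffixes |= new
    return not (suffixes & code)

print(uniquely_decodable({"10", "00", "11", "110"}))
```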
















    4. The probability distributions p(x) and q(x) are defined on the same random variable X with 5 outcomes, {1, 2, 3, 4, 5}. C1(x) and C2(x) are two different codes on X:

Symbol   p(x)    q(x)    C1(x)   C2(x)
  1      1/2     1/2     0       0
  2      1/4     1/8     10      100
  3      1/8     1/8     110     101
  4      1/16    1/8     1110    110
  5      1/16    1/8     1111    111

        (a) Find H(p), H(q), D(p||q) and D(q||p).

        (b) Prove the optimality of C1 and C2 on X under p and q, respectively.

        (c) We decide to use C2 when the distribution is p. Find the average codeword length. Find its difference from the entropy of p.

        (d) We decide to use C1 when the distribution is q. Find the average codeword length. How much do we lose due to miscoding?
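The quantities in (a), (c), and (d) can be verified numerically; the codeword lengths below are read directly off the table above. A minimal sketch:

```python
from math import log2

p = [1/2, 1/4, 1/8, 1/16, 1/16]
q = [1/2, 1/8, 1/8, 1/8, 1/8]
l1 = [1, 2, 3, 4, 4]                          # lengths of C1(x)
l2 = [1, 3, 3, 3, 3]                          # lengths of C2(x)

H = lambda d: -sum(x * log2(x) for x in d)                   # entropy in bits
D = lambda a, b: sum(x * log2(x / y) for x, y in zip(a, b))  # KL divergence

print(H(p), H(q), D(p, q), D(q, p))           # part (a)
# (c): E_p[|C2|]; since |C2(x)| = -log2 q(x), this equals H(p) + D(p||q)
print(sum(pi * li for pi, li in zip(p, l2)))
# (d): E_q[|C1|]; since |C1(x)| = -log2 p(x), this equals H(q) + D(q||p)
print(sum(qi * li for qi, li in zip(q, l1)))
```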

    5. For each of the following codes, find a probability assignment for which Huffman coding produces that code. If there is no such probability assignment, explain why.

        (a) {0, 10, 11}

        (b) {00, 01, 10, 110}

        (c) {01, 10}.
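One way to probe a candidate is to run Huffman's algorithm on a trial distribution and compare codeword lengths (tie-breaking means Huffman may output a different but equally optimal labeling, so compare length profiles rather than the literal codewords). A minimal sketch; the trial probabilities at the bottom are one illustrative choice:

```python
import heapq
from itertools import count

def huffman(probs):
    """Binary Huffman code for {symbol: probability}; returns {symbol: codeword}."""
    uid = count()                      # tie-breaker so the heap never compares nodes
    heap = [(p, next(uid), sym) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, a = heapq.heappop(heap)     # merge the two least likely nodes
        p1, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (p0 + p1, next(uid), (a, b)))
    code = {}
    def walk(node, prefix=""):
        if isinstance(node, tuple):        # internal node: recurse on both children
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            code[node] = prefix or "0"     # lone-symbol edge case
    walk(heap[0][2])
    return code

# For (a), lengths (1, 2, 2) arise e.g. from probabilities (1/2, 1/4, 1/4):
print(huffman({"a": 1/2, "b": 1/4, "c": 1/4}))
```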

    6. Consider the probability distribution on X with 8 outcomes:

p(X = 1) = p(X = 2) = 2p(X = 3) = 2p(X = 4) = 4p(X = 5) = 4p(X = 6) = 4p(X = 7) = 4p(X = 8).

        (a) Give the Huffman binary code.

        (b) Find the expected code length of (a) and H(X).

        (c) Find the Huffman 3-ary code.
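A quick check for (b), assuming the constraints normalize the way the ratios suggest (masses in proportion 4 : 4 : 2 : 2 : 1 : 1 : 1 : 1):

```python
from math import log2

# p(1) = p(2) = 2 p(3) = 2 p(4) = 4 p(5) = ... = 4 p(8), normalized to sum to 1:
p = [1/4, 1/4, 1/8, 1/8, 1/16, 1/16, 1/16, 1/16]
assert abs(sum(p) - 1) < 1e-12
print(-sum(pi * log2(pi) for pi in p))   # H(X); p is dyadic, so the Huffman
                                         # expected length in (b) matches it exactly
```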

    7. For each of the following, find the differential entropy, h(X) = -∫ f ln f.

        (a) f(x) = λe^(-λx) for nonnegative x,

        (b) f(x) = (1/2) λe^(-λ|x|) for real x,

        (c) f(x), where X = X1 + X2 and X1, X2 are independent with X_i ~ N(μ_i, σ_i²), i = 1, 2.
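The integrals can be sanity-checked numerically. A sketch for (a), assuming SciPy is available; the known closed form for the exponential density, h(X) = 1 - ln λ nats, is printed alongside for comparison (λ = 2 is an arbitrary example):

```python
from math import exp, log
from scipy.integrate import quad

lam = 2.0                                        # any rate lambda > 0
f = lambda x: lam * exp(-lam * x)                # exponential density of part (a)
h, _ = quad(lambda x: -f(x) * log(f(x)), 0, 60)  # h(X) = -int f ln f, in nats
print(h, 1 - log(lam))                           # the two values should agree closely
```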

    8. Consider the channel transition matrix Q:

$$
Q = \begin{bmatrix} 1/2 & 1/2 \\ 1 & 0 \end{bmatrix}
\tag{2}
$$

where Q(i, j) gives the probability of receiving y_j if x_i is transmitted. Assume x_1 = y_1 = 0 and x_2 = y_2 = 1.

    (a) Draw the channel model with the transmitted bits on the left side, the received bits on the right side and the transition probabilities connecting them. What is the name of this channel?

    (b) Find the capacity of this channel.

    (c) Find the input distribution that achieves the capacity.
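For (b) and (c), the analytic answer can be cross-checked by brute force: sweep the input probability P(x = 0) over a grid and maximize I(X; Y). A minimal sketch; the grid step 1/10000 is an arbitrary choice:

```python
from math import log2

Q = [(0.5, 0.5), (1.0, 0.0)]        # rows: p(y | x) from equation (2)

def mi(a):
    """I(X;Y) in bits when P(x = 0) = a."""
    px = (a, 1 - a)
    py = [sum(px[x] * Q[x][y] for x in range(2)) for y in range(2)]
    return sum(px[x] * Q[x][y] * log2(Q[x][y] / py[y])
               for x in range(2) for y in range(2) if Q[x][y] > 0)

cap, a_star = max((mi(k / 10000), k / 10000) for k in range(1, 10000))
print(cap, a_star)                  # approximate capacity and maximizing P(x = 0)
```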




    9. Find the capacity of the following channels using the given probability transition matrices.

        (a) Outcomes: {0, 1, 2} for both the input and output distributions,

$$
p(y \mid x) =
\begin{bmatrix}
1/3 & 1/3 & 1/3 \\
1/3 & 1/3 & 1/3 \\
1/3 & 1/3 & 1/3
\end{bmatrix}
\tag{3}
$$

        (b) Outcomes: {0, 1, 2} for both the input and output distributions,

$$
p(y \mid x) =
\begin{bmatrix}
1/2 & 1/2 & 0 \\
0 & 1/2 & 1/2 \\
1/2 & 0 & 1/2
\end{bmatrix}
\tag{4}
$$

        (c) Outcomes: {0, 1, 2, 3} for both the input and output distributions,

$$
p(y \mid x) =
\begin{bmatrix}
p & 1-p & 0 & 0 \\
1-p & p & 0 & 0 \\
0 & 0 & 1-q & q \\
0 & 0 & q & 1-q
\end{bmatrix}
\tag{5}
$$
        (d) Outcomes: {0, 1, 2} for both the input and output distributions,

$$
p(y \mid x) =
\begin{bmatrix}
1-p & p & 0 \\
1-p & p & 0 \\
0 & 0 & 1
\end{bmatrix}
\tag{6}
$$


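All four parts can be cross-checked numerically with the Blahut-Arimoto iteration, which converges to the capacity of any discrete memoryless channel. A sketch assuming NumPy; the iteration count 500 is an arbitrary choice:

```python
import numpy as np

def blahut_arimoto(Q, iters=500):
    """Capacity (bits/use) of a DMC with row-stochastic Q[x, y] = p(y | x)."""
    Q = np.asarray(Q, dtype=float)
    n = Q.shape[0]
    p = np.full(n, 1.0 / n)                       # start from the uniform input
    for _ in range(iters):
        q = p @ Q                                 # induced output distribution
        mask = Q > 0
        # d[x] = D( Q[x, :] || q ) in bits
        d = np.array([np.sum(Q[x, mask[x]] * np.log2(Q[x, mask[x]] / q[mask[x]]))
                      for x in range(n)])
        p = p * np.exp2(d)                        # multiplicative update ...
        p /= p.sum()                              # ... renormalized
    q = p @ Q
    return sum(p[x] * Q[x, y] * np.log2(Q[x, y] / q[y])
               for x in range(n) for y in range(Q.shape[1]) if Q[x, y] > 0)

# e.g. part (b); the other parts work the same way after plugging in p and q values
print(blahut_arimoto([[0.5, 0.5, 0.0],
                      [0.0, 0.5, 0.5],
                      [0.5, 0.0, 0.5]]))          # expect log2(3) - 1 ≈ 0.585
```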

    10. Consider a cascaded BSC (x → y → z) built from two identical BSCs with the following probability transition matrices:

$$
p(z \mid y) = p(y \mid x) = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}
\tag{7}
$$

Find the channel capacity of the cascaded BSC.
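To check the answer numerically: the end-to-end transition matrix p(z|x) of the cascade is the product of the two BSC matrices, after which the capacity follows from a sweep over the input distribution. A sketch with an arbitrary example value p = 0.1:

```python
import numpy as np
from math import log2

p = 0.1                                     # example crossover probability
B = np.array([[1 - p, p], [p, 1 - p]])
Q = B @ B                                   # cascade: a BSC with crossover 2p(1-p)

def mi(a):
    """I(X;Z) in bits when P(x = 0) = a."""
    px = np.array([a, 1 - a])
    qz = px @ Q
    return sum(px[x] * Q[x, z] * log2(Q[x, z] / qz[z])
               for x in range(2) for z in range(2) if Q[x, z] > 0)

print(max(mi(k / 1000) for k in range(1, 1000)))   # compare with 1 - H(2p(1-p))
```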

Report and Submission

You can prepare your report with any text editor or in handwriting. Upload a single PDF (not a .zip file) on MOODLE. Late submissions are penalized 10% per day.