Articles for subject Information Theory
❦

Communication through memoryless static channels is examined, in particular the use of repetition codes to counteract the errors introduced by the channel.
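As a sketch of this setup, the following simulates a binary symmetric channel and majority-vote decoding of a repetition code; the flip probability f = 0.1 and the choice of the R_3 code are illustrative, not taken from the article:

```python
import random

def bsc(bits, f, rng):
    # Binary symmetric channel: flip each bit independently with probability f.
    return [b ^ (rng.random() < f) for b in bits]

def repeat_encode(bits, n):
    # Repetition code R_n: transmit each source bit n times.
    return [b for b in bits for _ in range(n)]

def repeat_decode(received, n):
    # Majority-vote decoding over each block of n received bits.
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(10_000)]
decoded = repeat_decode(bsc(repeat_encode(message, 3), 0.1, rng), 3)
errors = sum(m != d for m, d in zip(message, decoded))
# With R_3 and f = 0.1 the residual per-bit error rate is 3f^2(1-f) + f^3 = 0.028.
print(errors / len(message))
```

The code trades rate for reliability: R_3 triples the bandwidth used but drops the per-bit error probability from f to roughly 3f^2.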

Conditional entropy.

The joint entropy and its properties.
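The two quantities above are linked by the chain rule H(X, Y) = H(X) + H(Y|X); a minimal sketch over a hypothetical joint distribution (the numbers are illustrative):

```python
from math import log2

def entropy(dist):
    # H(p) = -sum_x p(x) log2 p(x), in bits.
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# A hypothetical joint distribution p(x, y) over two binary variables.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginal p(x) obtained by summing out y.
p_x = {}
for (x, _), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p

# Conditional entropy computed directly: H(Y|X) = -sum p(x,y) log2 p(y|x).
H_y_given_x = -sum(p * log2(p / p_x[x]) for (x, _), p in p_xy.items())

# Chain rule: H(X, Y) = H(X) + H(Y|X); both lines print the same value.
print(entropy(p_xy), entropy(p_x) + H_y_given_x)
```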

The relative entropy, or Kullback-Leibler divergence, is a measure of the difference between two distributions.
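A short sketch of the definition, with an illustrative pair of distributions chosen to show the asymmetry:

```python
from math import log2

def kl_divergence(p, q):
    # D(p || q) = sum_x p(x) log2(p(x) / q(x)),
    # defined when q(x) > 0 wherever p(x) > 0.
    return sum(px * log2(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.9, 0.1]
q = [0.5, 0.5]
# The divergence is non-negative, zero iff p = q, and asymmetric in general.
print(kl_divergence(p, q), kl_divergence(q, p))
```

The asymmetry is why relative entropy is called a divergence rather than a distance: it satisfies neither symmetry nor the triangle inequality.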

Mutual information.
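A minimal sketch of the quantity, computed from a hypothetical joint distribution (the numbers are illustrative):

```python
from math import log2

# A hypothetical joint distribution p(x, y); marginals are derived from it.
p_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) ),
# the relative entropy between the joint and the product of the marginals.
mi = sum(p * log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items())
print(mi)
```

Equivalently I(X;Y) = H(X) + H(Y) - H(X,Y), which measures how much knowing one variable reduces uncertainty about the other.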

We examine the problem of optimally encoding a set of symbols over some alphabet so as to minimize the average length of the code.
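A standard construction for this problem is Huffman coding, which repeatedly merges the two least probable symbols; a minimal sketch (the example string is illustrative):

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    # Each heap entry is (weight, tiebreak, {symbol: codeword-so-far});
    # the tiebreak counter keeps the non-comparable dicts out of comparisons.
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Merge the two least frequent subtrees, prefixing their codewords.
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

text = "abracadabra"
code = huffman_code(Counter(text))
avg_len = sum(len(code[s]) for s in text) / len(text)
print(code, avg_len)
```

The resulting code is prefix-free, and its expected length lies within one bit of the entropy of the symbol frequencies.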

A definition of information is introduced which leads to yet another connection with entropy.
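The connection can be sketched via the Shannon information content h(x) = -log2 p(x), whose expectation over an ensemble is the entropy; the example probabilities below are illustrative:

```python
from math import log2

def self_information(p):
    # Shannon information content of an outcome with probability p, in bits.
    return -log2(p)

# Entropy is the expected self-information over the ensemble.
probs = [0.5, 0.25, 0.125, 0.125]
H = sum(p * self_information(p) for p in probs)
print(H)  # → 1.75
```

Rare outcomes thus carry more information: an event of probability 1/8 conveys 3 bits, while a certain event conveys none.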