Articles for subject Information Theory
❦
-
Communication through memoryless, time-invariant channels will be examined, in particular the use of repetition codes to counteract the errors introduced by the channel.
-
Conditional entropy.
-
The joint entropy and its properties.
-
The relative entropy, or Kullback-Leibler divergence, is a measure of the difference between two distributions.
-
Mutual information.
-
We examine the problem of optimally encoding symbols from an alphabet so as to minimize the average length of the code.
-
A definition of information is introduced, leading to yet another connection with entropy.