Communication through memoryless static channels will be examined, in particular the use of repetition codes to counteract the errors introduced by the channel.
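As a concrete sketch of the idea (the function names and parameters here are illustrative, not taken from the text), a length-3 repetition code sent through a binary symmetric channel that flips each bit with probability `p` can be decoded by majority vote:

```python
import random

def encode(bits, n=3):
    """Repetition code: repeat each bit n times."""
    return [b for b in bits for _ in range(n)]

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def decode(received, n=3):
    """Majority vote over each block of n received bits."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

rng = random.Random(0)
msg = [1, 0, 1, 1, 0]
noisy = bsc(encode(msg, n=3), p=0.1, rng=rng)
print(decode(noisy, n=3))
```

Majority decoding corrects any single flipped bit per block, so the residual error probability per message bit drops from p to roughly 3p^2 for small p, at the cost of tripling the transmission length.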
The relative entropy, or Kullback-Leibler divergence, is a measure of the difference between two probability distributions.
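A minimal sketch of the quantity for discrete distributions given as probability lists (the function name is my own), using the convention that terms with p(x) = 0 contribute nothing:

```python
from math import log2

def kl_divergence(p, q):
    """D(p || q) = sum_x p(x) * log2(p(x) / q(x)), with 0 log 0 = 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# The divergence is nonnegative, zero iff p == q, and not symmetric.
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # ≈ 0.737 bits
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))  # a different value
```

Note the asymmetry: D(p || q) and D(q || p) generally differ, which is why relative entropy is a divergence rather than a distance.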
The joint entropy and its properties are introduced.
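As an illustrative sketch (not from the text), the joint entropy of a pair (X, Y) given by a joint probability table, checked against the standard property that H(X, Y) = H(X) + H(Y) when X and Y are independent:

```python
from math import log2

def joint_entropy(pxy):
    """H(X, Y) = -sum over (x, y) of p(x, y) * log2 p(x, y)."""
    return -sum(p * log2(p) for row in pxy for p in row if p > 0)

def entropy(p):
    """Marginal entropy H(X) of a single distribution."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

# Joint distribution of two independent fair bits.
pxy = [[0.25, 0.25], [0.25, 0.25]]
px = [sum(row) for row in pxy]        # marginal of X
py = [sum(col) for col in zip(*pxy)]  # marginal of Y
print(joint_entropy(pxy))             # 2.0 bits
print(entropy(px) + entropy(py))      # 2.0 bits: additivity under independence
```

For dependent variables the joint entropy is strictly less than the sum of the marginals, the gap being the mutual information.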
We examine the problem of optimally encoding a set of symbols over some alphabet so as to minimize the average length of the code.
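One classical solution to this problem is Huffman coding; the sketch below (my own illustration, not necessarily the construction used in the text) builds a binary prefix code by repeatedly merging the two least probable groups:

```python
import heapq

def huffman_code(freqs):
    """Build a binary prefix code; freqs maps symbol -> probability."""
    # Heap entries: (weight, tiebreaker, {symbol: codeword-so-far}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # lightest subtree
        w2, _, c2 = heapq.heappop(heap)   # second lightest
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)  # codeword lengths 1, 2, 3, 3
```

For these dyadic probabilities the expected codeword length is 1.75 bits, exactly the entropy of the source, so the code is optimal.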
A definition of information is introduced, which leads to yet another connection with entropy.
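The standard definition referred to here is presumably the self-information I(x) = -log2 p(x); a quick sketch (function name mine) showing the connection: entropy is the expected self-information of an outcome.

```python
from math import log2

def self_information(p):
    """I(x) = -log2 p(x): rarer outcomes carry more information."""
    return -log2(p)

# Entropy as the expectation of self-information over the distribution.
dist = [0.5, 0.25, 0.25]
h = sum(pi * self_information(pi) for pi in dist)
print(h)  # 1.5 bits
```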