Information Theory

Communication through noisy channels

Communication through memoryless, static channels is examined, in particular the use of repetition codes to counteract the errors introduced by the channel.
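As a rough illustration of the idea (a minimal sketch assuming a binary symmetric channel, not code taken from the post), each bit can be repeated n times and decoded by majority vote; the flip probability p and repetition factor n below are arbitrary illustrative values.

```python
import random

def encode(bits, n=3):
    """Repetition code: transmit each bit n times."""
    return [b for b in bits for _ in range(n)]

def channel(bits, p=0.1):
    """Memoryless binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(received, n=3):
    """Majority-vote decoding over each block of n repeated bits."""
    return [1 if sum(received[i:i + n]) > n / 2 else 0
            for i in range(0, len(received), n)]

message = [random.randint(0, 1) for _ in range(1000)]
decoded = decode(channel(encode(message)))
errors = sum(m != d for m, d in zip(message, decoded))
print(f"residual bit error rate: {errors / len(message):.4f}")
```

With p = 0.1 and n = 3 the residual error rate is roughly 3p² − 2p³ ≈ 0.028, so the code reduces errors at the cost of a three-fold drop in rate.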

Information Theory

Mutual Information

Information Theory

Relative Entropy

The relative entropy, or Kullback-Leibler divergence, is a measure of the difference between two probability distributions.
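For two distributions p and q over the same discrete alphabet, the usual definition is

$$ D_{\mathrm{KL}}(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}, $$

which is non-negative, equals zero exactly when p = q, and is not symmetric in p and q.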

Information Theory

Conditional Entropy

Information Theory

Joint Entropy

The joint entropy and its properties.
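For discrete variables X and Y with joint distribution p(x, y), the standard definition is

$$ H(X, Y) = -\sum_{x, y} p(x, y) \log p(x, y), $$

with the chain rule $H(X, Y) = H(X) + H(Y \mid X)$ and the bound $H(X, Y) \le H(X) + H(Y)$, where equality holds when X and Y are independent.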

Information Theory

Data Compression

We examine the problem of optimally encoding a set of symbols from some alphabet so as to minimise the average length of the code.
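Huffman coding is one standard construction for this problem; the sketch below is illustrative only (the symbol probabilities and the huffman_code helper are assumptions, not taken from the post) and builds a binary prefix code whose average length is within one bit of the source entropy.

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a binary prefix code from a {symbol: probability} mapping."""
    counter = count()  # tie-breaker so equal probabilities never force dict comparison
    heap = [(p, next(counter), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)
        p1, _, code1 = heapq.heappop(heap)
        # Merge the two least probable subtrees, extending their codewords by one bit.
        merged = {s: "0" + c for s, c in code0.items()}
        merged.update({s: "1" + c for s, c in code1.items()})
        heapq.heappush(heap, (p0 + p1, next(counter), merged))
    return heap[0][2]

# Illustrative (assumed) source: a dyadic distribution, so the code is exactly optimal.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code, "average codeword length:", avg_len)  # entropy of this source is 1.75 bits
```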

Information Theory, Probability

Information

A definition of information is introduced which leads to yet another connection with entropy.
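Presumably the definition in question is the self-information (or surprisal) of an outcome,

$$ I(x) = -\log p(x), $$

whose expectation under p is the entropy, $H(X) = \mathbb{E}[I(X)] = -\sum_x p(x) \log p(x)$.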

Bayesian Networks

Bayesian Networks

A factorisation of the joint probability distribution can be represented as a graph known as a Bayesian Network. These networks codify independencies between variables and how information flows as variables are observed.
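The factorisation takes the standard form: for variables $X_1, \dots, X_n$ arranged in a directed acyclic graph, with $\mathrm{pa}(X_i)$ the parents of $X_i$,

$$ p(x_1, \dots, x_n) = \prod_{i=1}^{n} p\bigl(x_i \mid \mathrm{pa}(x_i)\bigr). $$

For instance, the chain $X \to Y \to Z$ factorises as $p(x)\,p(y \mid x)\,p(z \mid y)$, which encodes that $X$ and $Z$ are conditionally independent given $Y$.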

Bayesian Networks, Probability

Variable Independence

When we have several variables, the joint probability distribution over all of them contains all the information we need to calculate any marginal or conditional distribution. This joint distribution can be expressed far more compactly when there are independence relationships between the variables.
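As a rough illustration of the saving (the parameter count here is an aside, not from the post): a joint distribution over $n$ binary variables requires $2^n - 1$ parameters in general, but if the variables are mutually independent it factorises as

$$ p(x_1, \dots, x_n) = \prod_{i=1}^{n} p(x_i), $$

and only $n$ parameters are needed.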

Probability, Maximum Entropy

Understanding Uncertainty