Welcome to Scholar. This is an experiment in putting lecture notes online. It's a work in progress and material is being added and improved all the time.

News: posts on Scholar


Threads weave a path through the collection of articles. They correspond to a course or part of a course.

Recently updated articles

Information Theory · 3 months ago

Communication through noisy channels

Communication through static, memoryless channels is examined, in particular the use of repetition codes to counteract the errors introduced by the channel.
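As a quick taste of the topic, here is a small sketch (illustrative only, not taken from the article) of a rate-1/3 repetition code sent over a binary symmetric channel with flip probability 0.1, decoded by majority vote:

```python
import random

def encode(bits, n=3):
    # Repeat each bit n times (a rate-1/n repetition code).
    return [b for b in bits for _ in range(n)]

def bsc(bits, p, rng):
    # Binary symmetric channel: flip each bit with probability p.
    return [b ^ (rng.random() < p) for b in bits]

def decode(received, n=3):
    # Majority vote over each block of n received bits.
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

rng = random.Random(0)
message = [1, 0, 1, 1, 0, 0, 1, 0]
noisy = bsc(encode(message), p=0.1, rng=rng)
print(decode(noisy))
```

With n = 3 a block is decoded wrongly only when two or more of its three bits flip, so the per-bit error rate drops from 0.1 to roughly 0.028, at the cost of tripling the transmission length.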

Information Theory · 2 months ago

Mutual Information

Information Theory · 2 months ago

Relative Entropy

The relative entropy, or Kullback-Leibler divergence, is a measure of the difference between two distributions.
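A minimal sketch of the quantity in question, assuming base-2 logarithms and discrete distributions given as probability lists (the example distributions are made up):

```python
import math

def kl_divergence(p, q):
    # D(p || q) = sum over x of p(x) * log2(p(x) / q(x)).
    # Terms with p(x) = 0 contribute nothing; q(x) = 0 with
    # p(x) > 0 makes the divergence infinite.
    total = 0.0
    for px, qx in zip(p, q):
        if px == 0:
            continue
        if qx == 0:
            return math.inf
        total += px * math.log2(px / qx)
    return total

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # positive; D(p || q) != D(q || p) in general
print(kl_divergence(p, p))  # 0.0 when the distributions coincide
```

Note that the divergence is not symmetric in its arguments, so it is not a metric in the usual sense, even though it is zero exactly when the two distributions agree.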

Information Theory · 2 months ago

Conditional Entropy

Information Theory · 2 months ago

Joint Entropy

The joint entropy and its properties

Convexity · 2 months ago


Quick introduction to convex sets, convex functions and Jensen's inequality
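Jensen's inequality can be checked numerically: for a convex function f, f(E[X]) <= E[f(X)]. A small sketch with a made-up discrete distribution and f(x) = x²:

```python
# Checking Jensen's inequality f(E[X]) <= E[f(X)] for the convex
# function f(x) = x**2; the values and probabilities are illustrative.
xs = [1.0, 2.0, 4.0]
ps = [0.2, 0.5, 0.3]

f = lambda x: x * x
e_x  = sum(p * x for p, x in zip(ps, xs))       # E[X]  = 2.4
e_fx = sum(p * f(x) for p, x in zip(ps, xs))    # E[f(X)] = 7.0

print(f(e_x) <= e_fx)  # True
```

The gap between the two sides closes only when X is constant or f is linear on the support of X, which is why the inequality underpins so many entropy bounds.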

Information Theory · 3 months ago

Data Compression

We examine the problem of optimally encoding a set of symbols in some alphabet to reduce the average length of the code.
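As a small illustration (the source and codewords below are hypothetical, not from the article), a prefix code whose codeword lengths match the symbol surprisals achieves an average length equal to the source entropy:

```python
import math

# A hypothetical four-symbol source and a prefix code for it.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code  = {"a": "0", "b": "10", "c": "110", "d": "111"}

entropy = -sum(p * math.log2(p) for p in probs.values())
avg_len = sum(probs[s] * len(code[s]) for s in probs)

print(entropy)  # 1.75 bits per symbol
print(avg_len)  # 1.75 — this code meets the entropy bound exactly
```

The bound is met exactly here because every probability is a power of two; in general the best prefix code's average length lies between the entropy and the entropy plus one bit.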

Information Theory · Probability · 2 months ago


A definition of information is introduced, which leads to yet another connection with entropy.

Bayesian Networks · 2 months ago

Bayesian Networks

A factorisation of the joint probability distribution can be represented as a graph known as a Bayesian network. These networks codify the independencies between variables and the flow of information as variables are measured.

Bayesian Networks · Probability · some time ago

Variable Independence

When we have several variables, the joint probability distribution over all of them contains all the information we need to compute any simpler distribution, such as a marginal or conditional. The joint distribution can be expressed more efficiently when there are independence relationships between the variables.
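A small numerical sketch of this point, with made-up probabilities: when two binary variables X and Y are independent, the four-entry joint table is recovered from two two-entry marginals, yet it still answers any simpler query.

```python
from itertools import product

# Hypothetical binary variables with X and Y independent:
# the full joint table p(x, y) factorises into the two marginals.
p_x = {0: 0.3, 1: 0.7}
p_y = {0: 0.6, 1: 0.4}

joint = {(x, y): p_x[x] * p_y[y] for x, y in product(p_x, p_y)}

# The joint still answers any simpler query, e.g. P(Y = 1):
p_y1 = sum(pr for (x, y), pr in joint.items() if y == 1)
print(p_y1)  # 0.4, up to floating-point rounding
```

With n independent binary variables the saving grows from 2^n joint entries to 2n marginal entries, which is the efficiency the paragraph above alludes to.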