Welcome to *Scholar*. This is an experiment in putting lecture notes online. It's a work in progress and material is being added and improved all the time.


Threads weave a path through the collection of articles; each thread corresponds to a course or part of a course.

**Probability** (22 articles)

Exploration of the role of probability in various areas of physics.

**Quantum** (0 articles)

A course on quantum theory, concentrating on the structure of quantum mechanics as a basis of physical theory rather than the quantum physics of particular systems. We take a Hilbert space approach.

**Classical** (0 articles)

An exploration of the role of symmetry in physics, covering the principle of stationary action, Lagrangians, Hamiltonians and Noether's theorem.

**Classical** (12 articles)

The physics and mathematics of rotating systems.

**Classical** (11 articles)

The classical theory of oscillations and waves at second-year university level.

**Information Theory** (months ago)

Communication through memoryless static channels is examined, in particular the use of repetition codes to counteract the errors introduced by the channel.
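As a taste of the idea, here is a minimal sketch (not taken from the article itself): it assumes a binary symmetric channel with flip probability `p` and a 3-fold repetition code decoded by majority vote.

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def encode_repetition(bits, n):
    """Repeat each bit n times."""
    return [b for bit in bits for b in [bit] * n]

def decode_repetition(bits, n):
    """Majority vote over each block of n received bits."""
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

rng = random.Random(0)
message = [rng.randrange(2) for _ in range(1000)]
p = 0.1  # illustrative flip probability, chosen for the demo

# Errors with no coding vs. with a rate-1/3 repetition code.
raw = bsc(message, p, rng)
coded = decode_repetition(bsc(encode_repetition(message, 3), p, rng), 3)

raw_errors = sum(a != b for a, b in zip(message, raw))
coded_errors = sum(a != b for a, b in zip(message, coded))
```

The repetition code trades rate for reliability: a decoded bit is wrong only when at least two of its three copies flip, which happens with probability about 3p², well below p for small p.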


**Information Theory** (months ago)

The relative entropy, or Kullback-Leibler divergence, is a measure of the difference between two distributions.
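A minimal sketch of the definition (the distributions below are made up for illustration): for discrete distributions, D(p‖q) = Σ p(x) log₂(p(x)/q(x)).

```python
from math import log2

def kl_divergence(p, q):
    """D(p || q) in bits for finite distributions given as lists; 0 log 0 = 0."""
    return sum(px * log2(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]

d_pq = kl_divergence(p, q)
d_qp = kl_divergence(q, p)
```

Two characteristic properties show up immediately: the divergence is zero exactly when the distributions agree, and it is not symmetric, so it is not a metric.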


**Information Theory** (months ago)

The joint entropy and its properties.
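A small illustration of one such property, subadditivity, using made-up joint distributions over two binary variables: H(X, Y) ≤ H(X) + H(Y), with equality when X and Y are independent.

```python
from math import log2

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, axis):
    """Sum out all variables except the one at position `axis`."""
    out = {}
    for outcome, p in joint.items():
        out[outcome[axis]] = out.get(outcome[axis], 0.0) + p
    return out

# Independent case: uniform over all four outcomes.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
hxy = entropy(joint)
hx = entropy(marginal(joint, 0))
hy = entropy(marginal(joint, 1))

# Dependent case: Y is a copy of X, so the joint entropy is only 1 bit.
joint_dep = {(0, 0): 0.5, (1, 1): 0.5}
h_dep = entropy(joint_dep)
```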

**Convexity** (months ago)

A quick introduction to convex sets, convex functions and Jensen's inequality.
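A minimal numerical sketch of Jensen's inequality (the function and distribution are chosen for illustration): for a convex f, f(E[X]) ≤ E[f(X)].

```python
def expectation(f, dist):
    """E[f(X)] under a finite distribution {outcome: probability}."""
    return sum(p * f(x) for x, p in dist.items())

def square(x):
    """x**2 is convex, so Jensen's inequality applies to it."""
    return x * x

dist = {1: 0.2, 2: 0.5, 10: 0.3}  # made-up distribution for the demo

mean = expectation(lambda x: x, dist)
lhs = square(mean)               # f(E[X])
rhs = expectation(square, dist)  # E[f(X)]
```

The gap `rhs - lhs` here is just the variance of X, one of the simplest consequences of the inequality.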

**Information Theory** (months ago)

We examine the problem of optimally encoding a set of symbols in some alphabet to reduce the average length of the code.
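Huffman's algorithm is the standard construction for this problem (the article may develop a different route); here is a compact sketch with made-up symbol frequencies. For the dyadic distribution below, the expected code length equals the entropy, 1.75 bits.

```python
import heapq

def huffman_code(freqs):
    """Build a binary prefix code minimising expected length (Huffman's algorithm)."""
    # Each heap entry: (weight, tie-breaker, partial code table).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)  # two least-probable subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, tick, merged))
        tick += 1
    return heap[0][2]

freqs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # illustrative frequencies
code = huffman_code(freqs)
avg_len = sum(freqs[s] * len(c) for s, c in code.items())
```

Frequent symbols get short codewords and rare ones long codewords, and because the code is prefix-free a concatenated message decodes unambiguously.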

**Information Theory, Probability** (months ago)

A definition of information is introduced which leads to yet another connection with entropy.
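A common definition along these lines is the self-information (surprisal) of an outcome, −log₂ p; a brief sketch, with an illustrative fair-coin distribution, showing that its expectation is the entropy:

```python
from math import log2

def information(p):
    """Self-information (surprisal) of an outcome with probability p, in bits."""
    return -log2(p)

dist = {"heads": 0.5, "tails": 0.5}  # made-up example distribution

# The entropy is the expected self-information over the distribution.
avg_info = sum(p * information(p) for p in dist.values())
```

Rare outcomes carry more information: a probability-1/4 outcome is worth 2 bits, a certain outcome 0 bits.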

**Bayesian Networks** (months ago)

A factorisation of the joint probability distribution can be represented as a graph known as a Bayesian network. These networks codify independencies between variables and the flow of information as variables are measured.
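A minimal sketch of such a factorisation, for a hypothetical three-node chain A → B → C over binary variables (all numbers below are made up): the network asserts P(a, b, c) = P(a) P(b|a) P(c|b).

```python
# Conditional probability tables for the assumed chain A -> B -> C.
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # p_b_given_a[a][b]
p_c_given_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}  # p_c_given_b[b][c]

def joint(a, b, c):
    """Joint probability via the network factorisation P(a) P(b|a) P(c|b)."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# The factorised product is a genuine distribution: it sums to one.
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
```

The graph structure encodes the independence directly: C depends on A only through B, so once B is observed, A carries no further information about C.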

**Bayesian Networks, Probability** (some time ago)

When we have several variables, the joint probability distribution over all of them contains all the information we need to calculate any simpler probability distribution. The joint distribution can be expressed efficiently when there are independence relationships between the variables.
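A small sketch of the first claim, using a toy joint distribution over three binary variables with made-up numbers: any simpler distribution is recovered by summing out the variables we do not care about.

```python
from itertools import product

# Toy joint distribution P(X, Y, Z), stored as {(x, y, z): probability}.
# Built from an assumed factorisation P(x) P(y) P(z|x) with illustrative numbers.
pxyz = {}
for x, y, z in product((0, 1), repeat=3):
    pxyz[(x, y, z)] = (0.6 if x == 0 else 0.4) * 0.5 * (0.9 if z == x else 0.1)

def marginal(joint, keep):
    """Marginal over the variable positions in `keep`, summing out the rest."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

px = marginal(pxyz, [0])       # P(X)
pxz = marginal(pxyz, [0, 2])   # P(X, Z)
```

The efficiency claim is a counting argument: a full joint over n binary variables needs 2ⁿ − 1 numbers, while a factorisation that exploits independence, like the product above, needs only a handful of small tables.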