Exploration of the role of probability in various areas of physics
Level: 3, Subjects: Probability

Probability permeates much of physics. It appears in quantifying the errors in every measurement, in the dynamics of stochastic processes, in statistical mechanics as a way of coping with the vast number of variables, and even intrinsically in quantum mechanics. At first glance the use of probability may seem natural and even 'obvious', but things get much more lively when you realise that it is still not settled what a probability is. Different interpretations of probability affect the meaning of all the areas that it touches.

This material supports a second- and third-year advanced physics unit at Macquarie University.

In these notes I will take the view that probabilities are a measure of plausibility and that probability theory is the extension of deductive logic to incomplete information. This view follows Laplace, Jeffreys, Cox, and Jaynes.

Status: In development. These notes are very much a work in progress and an exploration of the topic. They may change radically in the future. Lecture notes are available as mindmaps here: /map/probably-physics/.

Curriculum:

1. Marginalization (Level: 2, Subjects: Probability, Inference)

Variables are introduced and then some consequences of the sum rule are explored.
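
As a small illustration (with discrete variables $x$ and $y$, notation chosen here for the example), the sum rule lets us marginalise out an unwanted variable:

\[ p(x) = \sum_{y} p(x, y). \]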

2. Bayes' rule (Level: 3, Subjects: Probability, Inference)

Some consequences of the product rule are explored, including the famous Bayes' rule.
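
For a hypothesis $H$ and data $D$ (symbols used here purely for illustration), rearranging the product rule gives Bayes' rule:

\[ p(H \mid D) = \frac{p(D \mid H)\, p(H)}{p(D)}. \]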

3. Assigning numbers (Level: 2, Subjects: Probability)

How do you assign actual values to probabilities?

4. Probability and Frequency (Level: 3, Subjects: Probability)

Given that we are interpreting probability as a measure of plausibility, just what is the relationship of probabilities to frequencies?

5. Comparing Models (Level: 3, Subjects: Probability, Inference)

Model comparison is one of the principal tasks of inference. Given some data, how does the plausibility of different models change? Does the data select out a particular model as being better?
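
As a sketch of the standard calculation (with models $M_1$ and $M_2$ and data $D$ as illustrative symbols), the posterior odds are the prior odds multiplied by the ratio of the evidences, often called the Bayes factor:

\[ \frac{p(M_1 \mid D)}{p(M_2 \mid D)} = \frac{p(D \mid M_1)}{p(D \mid M_2)} \, \frac{p(M_1)}{p(M_2)}. \]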

6. Probability Notation (Level: 2, Subjects: Probability)

Making statements about whole families of logical propositions.

7. Parameter Estimation (Level: 2, Subjects: Probability, Inference)

Another key task of inference is to determine the value of a parameter in a model on the basis of observed data.
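
Schematically (with $\theta$ a model parameter and $D$ the observed data, notation assumed for this sketch), the posterior for the parameter is the likelihood weighted by the prior:

\[ p(\theta \mid D, M) \propto p(D \mid \theta, M)\, p(\theta \mid M). \]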

8. Priors (Level: 2, Subjects: Probability, Inference)

A closer look at prior distributions

9. Maximum Entropy (Level: 2, Subjects: Probability, Maximum Entropy)

An argument for maximising the entropy

10. Understanding Uncertainty (Level: 2, Subjects: Probability, Maximum Entropy)

11. Maximum entropy with average constraints (Level: 3, Subjects: Maximum Entropy)

Find the probability distribution that maximises the entropy subject to the requirement that certain averages take fixed values.
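
As a rough sketch (for a discrete distribution $p_i$ constrained so that the average $\langle f \rangle = \sum_i p_i f(x_i)$ takes a given value), the Lagrange-multiplier solution is exponential in the constrained quantity:

\[ p_i = \frac{e^{-\lambda f(x_i)}}{Z(\lambda)}, \qquad Z(\lambda) = \sum_i e^{-\lambda f(x_i)}, \]

with the multiplier $\lambda$ fixed by the constraint.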

12. Statistical Mechanics (Level: 2, Subjects: Maximum Entropy)

13. Variable Independence (Level: 2, Subjects: Bayesian Networks, Probability)

When we have several variables, the joint probability distribution over all the variables contains all the information we need to calculate any marginal or conditional distribution. The joint probability distribution can be expressed efficiently if there are independence relationships between the variables.
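
For example, if $x$ and $y$ are independent given our background information $I$ (the variables here are purely illustrative), the joint distribution factorises as

\[ p(x, y \mid I) = p(x \mid I)\, p(y \mid I). \]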

14. Bayesian Networks (Level: 2, Subjects: Bayesian Networks)

A factorisation of the joint probability distribution can be represented as a graph known as a Bayesian network. These networks codify the independence relationships between variables and how information flows as variables are measured.
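
Schematically, for variables $x_1, \dots, x_n$ the factorisation encoded by the network takes the form

\[ p(x_1, \dots, x_n) = \prod_{i=1}^{n} p\bigl(x_i \mid \mathrm{parents}(x_i)\bigr), \]

where the parents of a variable are the nodes with arrows pointing into it.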

15. Information (Level: 2, Subjects: Information Theory, Probability)

A definition of information is introduced which leads to yet another connection with entropy.
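
In sketch form (for a discrete distribution $p(x)$, with the base of the logarithm setting the units), the information carried by an outcome and the entropy of the distribution are

\[ h(x) = -\log p(x), \qquad H = -\sum_{x} p(x) \log p(x). \]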

16. Data Compression (Level: 3, Subjects: Information Theory)

We examine the problem of optimally encoding a set of symbols in some alphabet to reduce the average length of the code.
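
The key bound, stated informally here, is that the expected length of any uniquely decodable code cannot beat the entropy of the source:

\[ \bar{L} = \sum_{x} p(x)\, \ell(x) \;\geq\; H(X), \]

with near-equality when the code lengths can be chosen close to $\ell(x) \approx -\log_2 p(x)$ bits.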

17. Convexity (Level: 2, Subjects: Convexity)

A quick introduction to convex sets, convex functions, and Jensen's inequality
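
In its probabilistic form (stated for a convex function $f$ and a random variable $X$), Jensen's inequality says

\[ f\bigl(\mathbb{E}[X]\bigr) \;\leq\; \mathbb{E}\bigl[f(X)\bigr], \]

with the inequality reversed for concave functions.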

18. Joint Entropy (Level: 2, Subjects: Information Theory)

The joint entropy and its properties
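
For two discrete variables the joint entropy is

\[ H(X, Y) = -\sum_{x, y} p(x, y) \log p(x, y), \]

and it is never larger than $H(X) + H(Y)$, with equality exactly when the variables are independent.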

19. Conditional Entropy (Level: 2, Subjects: Information Theory)

20. Relative Entropy (Level: 2, Subjects: Information Theory)

The relative entropy, or Kullback-Leibler divergence, is a measure of the difference between two distributions
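
For discrete distributions $p$ and $q$ over the same set of outcomes it is defined as

\[ D(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)} \;\geq\; 0, \]

which is non-negative and vanishes only when $p = q$.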

21. Mutual Information (Level: 2, Subjects: Information Theory)

22. Communication through noisy channels (Level: 3, Subjects: Information Theory)

Communication through memoryless static channels will be examined, in particular the use of repetition codes to counteract the errors introduced by the channel.
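
As a rough illustration (for a binary symmetric channel that flips each bit with probability $f$, a standard example assumed here), a three-fold repetition code with majority-vote decoding fails only when at least two of the three transmitted copies are flipped:

\[ p_{\text{error}} = 3 f^{2}(1 - f) + f^{3}, \]

which is smaller than $f$ for any $f < 1/2$, at the cost of a three-fold reduction in rate.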