Articles for subject Physics
❦
-
Mutual information.
-
We examine the problem of optimally encoding a set of symbols in some alphabet to reduce the average length of the code.
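One classical scheme for this problem is Huffman coding, which builds a prefix-free code by repeatedly merging the two least probable symbols; the choice of Huffman coding here is an assumption for illustration, and the probabilities are made up:

```python
import heapq

def huffman_code(probs):
    """Build a Huffman code for a symbol -> probability dict.

    Returns a dict mapping each symbol to its binary codeword.
    Assumes at least two symbols.
    """
    # Heap entries are (probability, tiebreak, tree); a tree is either
    # a symbol or a pair of subtrees. The tiebreak counter keeps the
    # heap from ever comparing trees directly.
    heap = [(p, i, sym) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, t1 = heapq.heappop(heap)
        p2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, count, (t1, t2)))
        count += 1
    code = {}
    def assign(tree, prefix):
        if isinstance(tree, tuple):
            assign(tree[0], prefix + "0")
            assign(tree[1], prefix + "1")
        else:
            code[tree] = prefix
    assign(heap[0][2], "")
    return code

# Illustrative dyadic probabilities, where Huffman coding is exactly optimal.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
# Here the average length, 1.75 bits, equals the entropy of the source.
```

Frequent symbols get short codewords and rare ones get long codewords, which is what drives the average length down.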
-
A definition of information is introduced which leads to yet another connection with entropy.
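The usual definition along these lines is the self-information (or "surprisal") of an outcome, -log2 p, with entropy appearing as its expected value; a minimal sketch, with illustrative numbers:

```python
import math

def surprisal(p):
    """Self-information of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy: the expected surprisal over the distribution."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# A fair coin: each outcome carries 1 bit, so the entropy is 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is less uncertain on average.
print(entropy([0.9, 0.1]))   # ~0.469
```

Rare outcomes are the most informative individually, but they contribute little on average, which is why the biased coin has lower entropy.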
-
A factorisation of the joint probability distribution can be represented in a graph known as a Bayesian network. These networks encode independence relationships between variables and show how information flows as variables are observed.
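As a sketch of such a factorisation, consider a hypothetical three-node chain A → B → C over binary variables (the graph and all numbers are assumptions for illustration): the network defines the joint as P(a)P(b|a)P(c|b), and the graph structure implies A is independent of C given B, which we can check numerically:

```python
import itertools

# Conditional probability tables for the chain A -> B -> C.
P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # P(b|a)
P_C_given_B = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}  # P(c|b)

# The network's factorisation defines the full joint distribution.
joint = {(a, b, c): P_A[a] * P_B_given_A[a][b] * P_C_given_B[b][c]
         for a, b, c in itertools.product([0, 1], repeat=3)}

# The graph encodes A independent of C given B: P(c|a,b) = P(c|b).
def cond_c(a, b, c):
    num = joint[(a, b, c)]
    den = sum(joint[(a, b, cc)] for cc in [0, 1])
    return num / den

assert all(abs(cond_c(a, b, c) - P_C_given_B[b][c]) < 1e-12
           for a, b, c in itertools.product([0, 1], repeat=3))
```

Observing B "blocks" the flow of information from A to C; this is the kind of statement the graph makes visible without any arithmetic.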
-
When we have several variables, the joint probability distribution over all the variables has all the information we need to calculate any simpler probability distribution. The joint probability distribution can be expressed efficiently if there are independence relationships between variables.
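A small sketch of the first claim, with an illustrative joint table over two binary variables (the numbers are assumptions): any marginal or conditional distribution falls out of the joint by summing and normalising.

```python
# A joint distribution over two binary variables X, Y, stored as a table.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

# Any simpler distribution follows by summing out unwanted variables:
P_X = {x: sum(joint[(x, y)] for y in (0, 1)) for x in (0, 1)}   # marginal of X
P_Y_given_X1 = {y: joint[(1, y)] / P_X[1] for y in (0, 1)}      # P(y | X = 1)

# If X and Y were independent, the joint would factor as P(x, y) = P(x) P(y),
# needing only the two marginals (2 free numbers) instead of the full table (3).
```

The parameter saving is what makes independence so valuable: a full joint over n binary variables needs 2**n - 1 numbers, while a fully factored one needs only n.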
-
What is meant by “uncertainty”?
-
An argument for maximising the entropy
-
Maximum entropy leads to some of the famous results of statistical mechanics.
-
Making statements about whole families of logical propositions
-
A closer look at prior distributions