Articles for subject Probability
❦
- Communication through memoryless static channels is examined, in particular the use of repetition codes to counteract the errors introduced by the channel (a small repetition-code sketch follows this list).
- Conditional entropy.
- The joint entropy and its properties.
- The relative entropy, or Kullback-Leibler divergence, is a measure of the difference between two distributions (its definition is written out after this list).
- Mutual information.
- We examine the problem of optimally encoding a set of symbols in some alphabet so as to minimise the average length of the code (the quantity being minimised, and its classical lower bound, appear after this list).
- A definition of information is introduced, which leads to yet another connection with entropy (sketched after this list).
- A factorisation of the joint probability distribution can be represented as a graph known as a Bayesian network. These networks codify independencies between variables and how information flows as variables are measured (a worked factorisation appears after this list).
- When we have several variables, the joint probability distribution over all the variables has all the information we need to calculate any simpler probability distribution. The joint probability distribution can be expressed efficiently if there are independence relationships between variables (a small marginalisation sketch follows this list).
- What is meant by “uncertainty”?
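
As a concrete illustration of the first item above, here is a minimal sketch of a repetition code, assuming a binary symmetric channel with a flip probability of 0.1 and a repetition factor of 3; the channel model, parameters, and function names are illustrative assumptions, not taken from the articles themselves.

```python
import random

def transmit(bits, flip_prob=0.1):
    """Binary symmetric channel: flip each bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def repetition_encode(bits, n=3):
    """Repeat every bit n times."""
    return [b for b in bits for _ in range(n)]

def repetition_decode(received, n=3):
    """Majority vote over each block of n received bits."""
    return [int(sum(received[i:i + n]) > n // 2) for i in range(0, len(received), n)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
decoded = repetition_decode(transmit(repetition_encode(message)))
errors = sum(m != d for m, d in zip(message, decoded))
print(f"residual errors: {errors} out of {len(message)} bits")
```

Raising the repetition factor lowers the residual error rate, at the cost of transmitting n channel bits per message bit.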
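
For the relative entropy item, the usual discrete form (for distributions $p$ and $q$ over the same alphabet $\mathcal{X}$) is

$$
D_{\mathrm{KL}}(p \,\|\, q) = \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)},
$$

which is non-negative, equals zero exactly when $p = q$, and is not symmetric in $p$ and $q$, so it is not a metric.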
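
For the optimal-coding item, the quantity being minimised is the expected code length, and for an optimal binary prefix-free code $C^*$ (for example a Huffman code, with logarithms in base 2) it is bounded by the entropy:

$$
L(C) = \sum_{x} p(x)\, \ell(x), \qquad H(X) \le L(C^*) < H(X) + 1,
$$

where $\ell(x)$ is the length of the codeword assigned to symbol $x$.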
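
The "definition of information" item presumably refers to the Shannon self-information (surprisal) of an outcome; assuming that, the connection with entropy is that entropy is its expected value:

$$
I(x) = -\log p(x), \qquad H(X) = \mathbb{E}\,[I(X)] = -\sum_x p(x) \log p(x).
$$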
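
For the Bayesian network item, a small worked factorisation (the variables and the particular graph are invented for illustration): a chain $A \to B \to C$ encodes

$$
P(A, B, C) = P(A)\, P(B \mid A)\, P(C \mid B),
$$

so $C$ is conditionally independent of $A$ given $B$; measuring $B$ blocks the flow of information between $A$ and $C$.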
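
Finally, for the item on joint distributions, here is a minimal sketch of recovering simpler distributions (a marginal and a conditional) from a joint table; the two binary variables and their probabilities are made up for illustration.

```python
# Joint distribution over two binary variables (X, Y); made-up numbers that sum to 1.
joint = {
    (0, 0): 0.30, (0, 1): 0.10,
    (1, 0): 0.20, (1, 1): 0.40,
}

# Marginal P(X = x): sum the joint over all values of Y.
p_x = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}

# Conditional P(Y = y | X = x): the joint divided by the marginal of X.
p_y_given_x = {(y, x): joint[(x, y)] / p_x[x] for x in (0, 1) for y in (0, 1)}

print(p_x)           # {0: 0.4, 1: 0.6}
print(p_y_given_x)   # e.g. P(Y=1 | X=1) = 0.40 / 0.60 ≈ 0.667
```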