Joint, Marginal, Conditional Distribution
Definition: Joint Probability Distribution (Discrete Case)
The joint distribution describes the probability of observing a particular combination of values for multiple random variables simultaneously.
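For two discrete random variables X and Y, this can be written as a joint probability mass function:

```latex
p_{X,Y}(x, y) = P(X = x,\; Y = y)
```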
Definition: Marginal Probability
The marginal distribution describes the probability distribution of a single variable in isolation, ignoring the values of the other variables.
Given the joint probability distribution, we can easily calculate the marginal by using the sum rule:
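The sum rule marginalizes out the other variable by summing the joint over all of its values:

```latex
p_X(x) = \sum_{y} p_{X,Y}(x, y)
```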
Definition: Conditional Probability Distribution
Conditional distributions are a way of describing the probability distribution of one variable, given that the value of another variable is known. In other words, a conditional distribution provides information on how the probability distribution of one variable changes in response to changes in another variable.
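Formally, the conditional distribution of Y given X = x is the joint renormalized by the marginal (defined whenever p_X(x) > 0):

```latex
p_{Y \mid X}(y \mid x) = \frac{p_{X,Y}(x, y)}{p_X(x)}
```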
Given the conditional probability, we can easily derive an expression for the joint probability. The result is the product rule:
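Rearranging the definition of the conditional distribution yields the product rule:

```latex
p_{X,Y}(x, y) = p_{Y \mid X}(y \mid x)\, p_X(x)
```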
Given this expression of the joint probability, we can rewrite the sum rule:
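Substituting the product rule into the sum rule gives the law of total probability:

```latex
p_X(x) = \sum_{y} p_{X \mid Y}(x \mid y)\, p_Y(y)
```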
We now have the tools at hand to prove Bayes' Theorem:
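Because the joint is symmetric, the product rule can be applied in both directions, p(x, y) = p(y | x) p(x) = p(x | y) p(y); dividing by p_X(x) and expanding the denominator with the sum rule gives Bayes' Theorem:

```latex
p_{Y \mid X}(y \mid x)
  = \frac{p_{X \mid Y}(x \mid y)\, p_Y(y)}{p_X(x)}
  = \frac{p_{X \mid Y}(x \mid y)\, p_Y(y)}{\sum_{y'} p_{X \mid Y}(x \mid y')\, p_Y(y')}
```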
Bayes' theorem is a fundamental concept in probability theory and has wide-ranging applications across many different fields. Its importance lies in its ability to help us make more accurate predictions and decisions by incorporating both prior knowledge and new data. It allows us to update our beliefs about an event as we receive new information.
Keep in mind that the prior information is important and can obviously have negative side effects if your prior beliefs about X are wrong.
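The sum rule, product rule, and Bayes' Theorem can all be checked numerically on a small discrete example. The joint table below is a made-up toy distribution, not one from the text; any non-negative table summing to 1 works:

```python
import numpy as np

# Toy joint distribution p(X, Y): X in {0, 1} (rows), Y in {0, 1, 2} (columns).
# The numbers are arbitrary; they only need to be non-negative and sum to 1.
joint = np.array([[0.10, 0.20, 0.10],
                  [0.25, 0.15, 0.20]])

# Sum rule: marginals are obtained by summing out the other variable.
p_x = joint.sum(axis=1)   # p(X)
p_y = joint.sum(axis=0)   # p(Y)

# Conditional distribution: p(Y | X = x) = p(x, y) / p(x).
p_y_given_x = joint / p_x[:, None]

# Product rule recovers the joint: p(x, y) = p(y | x) p(x).
print(np.allclose(p_y_given_x * p_x[:, None], joint))  # True

# Bayes' Theorem: p(X | Y = y) = p(y | x) p(x) / p(y).
p_x_given_y = p_y_given_x * p_x[:, None] / p_y[None, :]

# Each column of p_x_given_y is a valid distribution over X.
print(np.allclose(p_x_given_y.sum(axis=0), 1.0))  # True
```

Running this confirms that the product rule reconstructs the joint exactly and that each conditional p(X | Y = y) obtained via Bayes' Theorem is properly normalized.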