Blog - Bayesian Learning

\[P(\theta \vert D) = \frac{P(\theta, D)}{P(D)} = \frac{P(D\vert \theta)P(\theta)}{\sum_{\theta'}P(D\vert \theta')P(\theta')}\]

Prior: $P(\theta)$
Likelihood: $P(D \vert \theta)$
Posterior: $P(\theta \vert D)$
Marginal: $P(D) = \sum_{\theta'} P(D \vert \theta') P(\theta')$
Joint Distribution: $P(\theta, D)$
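The pieces above can be put together in a small numeric sketch of Bayes' rule for a discrete parameter. The setup is hypothetical: $\theta$ is a coin's heads probability restricted to three candidate values with a uniform prior, and $D$ is observing 8 heads in 10 flips.

```python
# Bayes' rule for a discrete parameter (hypothetical coin example).
from math import comb

thetas = [0.3, 0.5, 0.7]            # candidate parameter values theta'
prior = {t: 1 / 3 for t in thetas}  # P(theta): uniform prior

heads, flips = 8, 10                # observed data D

def likelihood(theta):
    # P(D | theta): binomial probability of the observed data
    return comb(flips, heads) * theta**heads * (1 - theta)**(flips - heads)

# Marginal P(D) = sum over theta' of P(D | theta') P(theta')
marginal = sum(likelihood(t) * prior[t] for t in thetas)

# Posterior P(theta | D) = P(D | theta) P(theta) / P(D)
posterior = {t: likelihood(t) * prior[t] / marginal for t in thetas}

for t, p in posterior.items():
    print(f"P(theta={t} | D) = {p:.3f}")
```

Because the data favor heads, the posterior shifts mass toward $\theta = 0.7$, while the posterior values still sum to 1 by construction.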
