Blog - Bayesian Learning

\[P(\theta \vert D) = \frac{P(\theta, D)}{P(D)} = \frac{P(D\vert \theta)P(\theta)}{\sum_{\theta'}P(D\vert \theta =\theta')P(\theta=\theta')}\]

Prior: $P(\theta)$.
Likelihood: $P(D \vert \theta)$.
Posterior: $P(\theta \vert D)$.
Marginal: $P(D) = \sum_{\theta'}P(D\vert \theta =\theta')P(\theta=\theta')$.
Joint Distribution: $P(\theta, D)$.
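The update above can be sketched numerically. Below is a minimal example, assuming a hypothetical coin-flip model: $\theta$ is the heads probability restricted to a small grid with a uniform prior, and $D$ is 8 heads in 10 flips (all numbers are illustrative, not from the text).

```python
# Hypothetical grid of parameter values theta with a uniform prior P(theta).
thetas = [0.2, 0.5, 0.8]
prior = [1 / 3] * 3

# Observed data D: 8 heads and 2 tails (assumed for illustration).
heads, tails = 8, 2

# Likelihood P(D | theta) at each grid point (independent Bernoulli flips).
likelihood = [t**heads * (1 - t)**tails for t in thetas]

# Joint distribution P(theta, D) = P(D | theta) * P(theta).
joint = [lk * p for lk, p in zip(likelihood, prior)]

# Marginal P(D) = sum over theta' of P(D | theta') * P(theta').
marginal = sum(joint)

# Posterior P(theta | D) = P(theta, D) / P(D).
posterior = [j / marginal for j in joint]

print(posterior)  # posterior mass concentrates on theta = 0.8
```

Because the data show mostly heads, the posterior shifts weight toward the largest grid value of $\theta$, while still summing to one over the grid.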
