Merge sharelatex-2019-01-08-1243 into master
kumar-shridhar committed Jan 8, 2019
2 parents 4253bfc + 3ee0027 commit 99f3f92
1 changed file, 1 addition and 1 deletion: Chapter2/chapter2.tex
@@ -127,7 +127,7 @@ \subsection{Variational Inference}
 
 Minimising the Kullback--Leibler divergence is equivalent to maximising the \textit{log evidence lower bound},
 \begin{align}
-KL_{\text{VI}} := \int q(w) p(F | X, w) \log p(Y | F) dF dw - \KL(q(w) || p(w)) \label{eq:L:VI}
+KL_{\text{VI}} := \int q(w) p(F | X, w) \log p(Y | F) dF dw - KL(q(w) || p(w)) \label{eq:L:VI}
 \end{align}
 with respect to the variational parameters defining $q(w)$. This is known as \textit{variational inference}, a standard technique in Bayesian modelling.
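For context on the line being changed: the equivalence asserted in the surrounding paragraph (minimising the KL divergence is the same as maximising the lower bound) follows from the standard decomposition of the log evidence. A sketch in the chapter's notation; note the KL term below is to the true posterior $p(w | X, Y)$, which does not appear in the diff itself:

```latex
% Standard evidence decomposition: the log evidence splits into the
% variational lower bound and a KL term to the true posterior.
\begin{align}
\log p(Y | X)
  &= \int q(w) \log \frac{p(Y, w | X)}{q(w)} \, dw
   + KL\big(q(w) \,||\, p(w | X, Y)\big).
\end{align}
% The left-hand side does not depend on q(w), and KL(q || p) >= 0,
% so maximising the first term (the lower bound) with respect to the
% variational parameters of q(w) minimises the KL to the posterior.
```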
