Normalizing the Gaussian Integral: The Multivariate Case

The goal here is to compute the integral $$ Z = \int_{-\infty}^{\infty}e^{-\frac{1}{2}(\vec{x} - \vec{\mu})^T\Sigma^{-1}(\vec{x} - \vec{\mu})}d\vec{x} $$ where \(\vec{\mu}=\left(E\left[X_1\right],\,E\left[X_2\right],\,\dots\,,\,E\left[X_n\right]\right)\) is the vector of means and \(\Sigma\) is the \(n \times n\) covariance matrix $$ \begin{bmatrix} \text{Var}[X_1] & \text{Cov}[X_1, X_2] & \dots & \text{Cov}[X_1, X_n] \\ \text{Cov}[X_2, X_1] & \text{Var}[X_2] & \dots & \text{Cov}[X_2, X_n] \\ \vdots & \vdots & \ddots & \vdots \\ \text{Cov}[X_n, X_1] & \text{Cov}[X_n, X_2] & \dots & \text{Var}[X_n] \end{bmatrix} $$
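As a concrete illustration (not part of the derivation), here is a minimal NumPy sketch of how a mean vector and covariance matrix of this shape can be estimated from data; the `samples` array and the mixing matrix are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample data: 1000 draws of a 3-dimensional random vector
# (an arbitrary linear mix of standard normals).
A = np.array([[2.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.3, 0.2, 0.5]])
samples = rng.standard_normal((1000, 3)) @ A.T

mu = samples.mean(axis=0)              # vector of means (E[X_1], ..., E[X_n])
Sigma = np.cov(samples, rowvar=False)  # n x n covariance matrix

# Sigma is symmetric: Cov[X_i, X_j] = Cov[X_j, X_i].
assert np.allclose(Sigma, Sigma.T)
```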

Since the covariance matrix is real and symmetric, the spectral theorem gives us an eigendecomposition $$ \Sigma = Q\Lambda Q^T $$ where \(\Lambda\) is the diagonal matrix whose entries are the (real) eigenvalues \(\lambda_k\) of \(\Sigma\) and \(Q\) is the matrix whose columns are the corresponding orthonormal eigenvectors of \(\Sigma\).
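A quick numerical illustration of this decomposition, using `np.linalg.eigh` (which is designed for symmetric matrices); the particular \(\Sigma\) below is an arbitrary positive-definite example. The last assertion anticipates the positivity observation that follows.

```python
import numpy as np

# An arbitrary symmetric positive-definite covariance matrix.
Sigma = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.0, 0.2],
                  [0.3, 0.2, 0.5]])

# eigh is specialized for symmetric matrices: it returns real eigenvalues
# and orthonormal eigenvectors (the columns of Q).
eigvals, Q = np.linalg.eigh(Sigma)
Lambda = np.diag(eigvals)

# Reconstruct Sigma = Q Lambda Q^T.
assert np.allclose(Sigma, Q @ Lambda @ Q.T)

# Every eigenvalue is strictly positive (positive-definiteness).
assert np.all(eigvals > 0)
```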

A covariance matrix is always positive semi-definite; here we assume it is in fact positive-definite (the distribution is non-degenerate, so \(\Sigma^{-1}\) exists), which gives the inequality $$ \lambda_k > 0 $$ for all eigenvalues \(\lambda_k\).

We will make use of this outer product expansion of \(\Sigma\): $$ \Sigma = \sum_{i=1}^n\lambda_i q_i q_i^T, $$ where \(q_i\) is the \(i\)-th column vector of \(Q\). This expansion is just \(Q\Lambda Q^T\) written out column by column. We will also use the orthonormality of the columns of \(Q\): \(q_i^T q_j = 1\) if \(i = j\) and \(q_i^T q_j = 0\) otherwise. This means that $$ Q^TQ = I $$ so \(Q^{-1} = Q^T\).
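Both claims are easy to check numerically; a minimal sketch using the same arbitrary \(\Sigma\) as above.

```python
import numpy as np

Sigma = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.0, 0.2],
                  [0.3, 0.2, 0.5]])
eigvals, Q = np.linalg.eigh(Sigma)

# Orthonormal columns: Q^T Q = I, so Q^{-1} = Q^T.
assert np.allclose(Q.T @ Q, np.eye(3))

# Outer product expansion: Sigma = sum_i lambda_i * q_i q_i^T.
expansion = sum(lam * np.outer(q, q) for lam, q in zip(eigvals, Q.T))
assert np.allclose(Sigma, expansion)
```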

We now have a very nice formula for the inverse: $$ \begin{equation} \begin{aligned} \Sigma^{-1} &= (Q\Lambda Q^T)^{-1} \\ &= (Q^T)^{-1}\Lambda^{-1}Q^{-1} \\ &= Q\Lambda^{-1}Q^T, \end{aligned} \end{equation} $$ where $$ \Lambda^{-1} = \begin{bmatrix} \lambda_1^{-1} & 0 & \dots & 0 \\ 0 & \lambda_2^{-1} & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & \dots & \dots & \lambda_n^{-1} \end{bmatrix} $$

This immediately gives us the inverse expansion $$ \Sigma^{-1} = \sum_{i=1}^n\lambda_i^{-1} q_i q_i^T. $$
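Numerically, the eigendecomposition-based inverse and its outer product expansion agree with a direct matrix inverse; a sketch assuming the same arbitrary \(\Sigma\) as above.

```python
import numpy as np

Sigma = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.0, 0.2],
                  [0.3, 0.2, 0.5]])
eigvals, Q = np.linalg.eigh(Sigma)

# Inverse via the eigendecomposition: Sigma^{-1} = Q Lambda^{-1} Q^T ...
Sigma_inv = Q @ np.diag(1.0 / eigvals) @ Q.T

# ... which equals the outer product expansion with 1/lambda_i weights.
expansion = sum((1.0 / lam) * np.outer(q, q) for lam, q in zip(eigvals, Q.T))

assert np.allclose(Sigma_inv, np.linalg.inv(Sigma))
assert np.allclose(Sigma_inv, expansion)
```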

Let's return to the exponent in the integrand, $$ \begin{equation} \begin{aligned} (x - \mu)^T\Sigma^{-1}(x - \mu) &= (x-\mu)^T\left[\sum_{i=1}^n\lambda_i^{-1}q_iq_i^T\right](x-\mu)\\ &= \sum_{i=1}^n\lambda_i^{-1}(x-\mu)^Tq_iq_i^T(x-\mu)\\ &= \sum_{i=1}^n\lambda_i^{-1}\left(q_i^T(x-\mu)\right)^2\\ &= \sum_{i=1}^n\frac{y_i^2}{\lambda_i} \end{aligned} \end{equation} $$ where \(y = Q^T(x-\mu)\), so that \(y_i = q_i^T(x-\mu)\).
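Here is a small check that the quadratic form really does collapse to a weighted sum of squares under \(y = Q^T(x-\mu)\); the vectors \(x\) and \(\mu\) below are arbitrary.

```python
import numpy as np

Sigma = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.0, 0.2],
                  [0.3, 0.2, 0.5]])
mu = np.array([1.0, -2.0, 0.5])
x = np.array([0.3, 0.1, -0.7])

eigvals, Q = np.linalg.eigh(Sigma)

# Left-hand side: the quadratic form with the explicit inverse.
quad = (x - mu) @ np.linalg.inv(Sigma) @ (x - mu)

# Right-hand side: rotate into eigencoordinates y = Q^T (x - mu),
# where the form becomes a weighted sum of squares.
y = Q.T @ (x - mu)
diag_form = np.sum(y**2 / eigvals)

assert np.allclose(quad, diag_form)
```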

The coordinate transformation \(y=Q^T(x-\mu)\) has inverse \(x=\mu + Qy\), whose Jacobian is \(Q\). Since \(Q\) is orthogonal, \(\det Q = \pm 1\), so the absolute value of the Jacobian determinant is \(1\) and \(dx = dy\).

After this change of variables, the integral factors into a product of \(n\) single-variable Gaussian integrals: $$ \begin{equation} \begin{aligned} Z &= \int_{-\infty}^{\infty}e^{-\frac{1}{2}(x - \mu)^T\Sigma^{-1}(x - \mu)}dx \\ &= \int_{-\infty}^{\infty}\dots\int_{-\infty}^{\infty}\exp({-\sum_{i=1}^n \frac{1}{2\lambda_i}y_i^2})dy_1\dots dy_n \\ &= \prod_{i=1}^n\int_{-\infty}^{\infty}\exp({-\frac{1}{2\lambda_i}y_i^2})dy_i \\ &= \sqrt{\prod_{i=1}^n2\pi\lambda_i} = \left[(2\pi)^n\det\Sigma\right]^{1/2} \end{aligned} \end{equation} $$ where the last line uses the single-variable result \(\int_{-\infty}^{\infty}e^{-y^2/(2\lambda_i)}\,dy = \sqrt{2\pi\lambda_i}\) and the fact that \(\prod_{i=1}^n\lambda_i = \det\Lambda = \det\Sigma\).
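To gain confidence in the closed form, one can compare it against numerical integration in a low-dimensional case; a sketch for \(n = 2\) using `scipy.integrate.dblquad` with an arbitrary \(\Sigma\) and \(\mu\).

```python
import numpy as np
from scipy import integrate

# A 2-D example: integrate the unnormalized Gaussian over the plane and
# compare with the closed form sqrt((2*pi)^n * det(Sigma)).
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
mu = np.array([1.0, -2.0])
Sigma_inv = np.linalg.inv(Sigma)

def integrand(y, x):
    v = np.array([x, y]) - mu
    return np.exp(-0.5 * v @ Sigma_inv @ v)

# dblquad integrates y on the inner axis and x on the outer axis.
Z_numeric, _ = integrate.dblquad(integrand, -np.inf, np.inf,
                                 lambda x: -np.inf, lambda x: np.inf)
Z_closed = np.sqrt((2 * np.pi) ** 2 * np.linalg.det(Sigma))

assert np.allclose(Z_numeric, Z_closed)
```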

All of this to say, $$ \frac{1}{ \sqrt{(2\pi)^n\det\Sigma}} \int_{-\infty}^{\infty}e^{-\frac{1}{2}(x - \mu)^T\Sigma^{-1}(x - \mu)}dx = 1 $$
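As a final cross-check, the density built from this normalizing constant matches SciPy's `multivariate_normal` density; a sketch with an arbitrary \(\Sigma\), \(\mu\), and evaluation point \(x\).

```python
import numpy as np
from scipy.stats import multivariate_normal

Sigma = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.0, 0.2],
                  [0.3, 0.2, 0.5]])
mu = np.array([1.0, -2.0, 0.5])
x = np.array([0.3, 0.1, -0.7])

n = len(mu)
v = x - mu
Z = np.sqrt((2 * np.pi) ** n * np.linalg.det(Sigma))

# Density built from the normalizing constant derived above ...
pdf_manual = np.exp(-0.5 * v @ np.linalg.inv(Sigma) @ v) / Z

# ... matches SciPy's multivariate normal density.
assert np.allclose(pdf_manual, multivariate_normal(mean=mu, cov=Sigma).pdf(x))
```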

$$\square$$