\[f(\mathbf{x}; \mathbf{\Psi}) = \sum_{k=1}^{G}{\pi_k \phi (\mathbf{x}; \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k)}\]
where \(\phi(\cdot)\) is the multivariate Gaussian density and the mixing proportions satisfy \(\pi_k > 0\) and \(\sum_{k=1}^{G} \pi_k = 1\)
Maximum Likelihood Estimation via the Expectation-Maximization (EM) algorithm
Log-likelihood for Gaussian Mixture Models (GMMs):
\[\begin{align}
\ell(\Psi) = \sum_{i=1}^n \log \left\{ \sum_{k=1}^G \pi_k \phi(x_i; \mu_k, \Sigma_k) \right\}
\end{align}\]
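As a numerical check on this formula, the log-likelihood can be evaluated directly. A minimal NumPy sketch (the helper `gauss` and all names are illustrative, not from the text):

```python
import numpy as np

def gauss(X, mu, Sigma):
    """Multivariate Gaussian density phi(x; mu, Sigma) at each row of X."""
    d = X.shape[1]
    diff = X - mu
    # Mahalanobis distance (x - mu)^T Sigma^{-1} (x - mu) for each row
    maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma), diff)
    return np.exp(-0.5 * maha) / np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))

def log_likelihood(X, pis, mus, Sigmas):
    """l(Psi) = sum_i log { sum_k pi_k * phi(x_i; mu_k, Sigma_k) }."""
    dens = np.column_stack([pi * gauss(X, mu, S)
                            for pi, mu, S in zip(pis, mus, Sigmas)])
    return np.log(dens.sum(axis=1)).sum()
```

With a single standard-normal component (\(G = 1\)) and one observation at the origin, this reduces to \(\log \phi(0; 0, 1) = -\tfrac{1}{2}\log(2\pi)\), which makes a convenient sanity check.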
We reformulate this into the complete-data log-likelihood in order to apply the EM algorithm:
\[\begin{align}
\ell_{\mathcal{C}}(\Psi) = \sum_{i=1}^n \sum_{k=1}^G z_{ik} \left\{ \log \pi_k + \log \phi(x_i; \mu_k, \Sigma_k) \right\}
\end{align}\]
\[\begin{align}
z_{ik} =
\begin{cases}
1 & \text{if } x_i \text{ belongs to component }k \\
0 & \text{otherwise.}
\end{cases}
\end{align}\]
E-Step: compute the expected value of \(z_{ik}\) given the current parameter estimates, i.e. the posterior probability that \(x_i\) belongs to component \(k\):
\[\begin{align}
\hat{z}_{ik} = \frac{\hat{\pi}_k \phi(x_i; \hat{\mu}_k, \hat{\Sigma}_k)}{\sum_{g=1}^{G} \hat{\pi}_g \phi(x_i; \hat{\mu}_g, \hat{\Sigma}_g)},
\end{align}\]
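The E-step above can be sketched in NumPy; the responsibilities \(\hat{z}_{ik}\) form an \(n \times G\) matrix whose rows sum to one (a self-contained illustration; `gauss` and `e_step` are our names, not from the text):

```python
import numpy as np

def gauss(X, mu, Sigma):
    """Multivariate Gaussian density phi(x; mu, Sigma) at each row of X."""
    d = X.shape[1]
    diff = X - mu
    maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma), diff)
    return np.exp(-0.5 * maha) / np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))

def e_step(X, pis, mus, Sigmas):
    """z_hat_ik = pi_k phi(x_i; mu_k, Sigma_k) / sum_g pi_g phi(x_i; mu_g, Sigma_g)."""
    dens = np.column_stack([pi * gauss(X, mu, S)
                            for pi, mu, S in zip(pis, mus, Sigmas)])
    # Normalize each row over the G components
    return dens / dens.sum(axis=1, keepdims=True)
```

For well-separated components, each observation's responsibility concentrates almost entirely on its nearest component.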
M-Step: update the parameter estimates by maximizing the complete-data log-likelihood with the \(\hat{z}_{ik}\) held fixed:
\[\begin{align}
\hat{\pi}_k = \frac{n_k}{n}, \quad \hat{\mu}_k = \frac{\sum_{i=1}^{n} \hat{z}_{ik} x_i}{n_k}, \quad \hat{\Sigma}_k = \frac{\sum_{i=1}^{n} \hat{z}_{ik} (x_i - \hat{\mu}_k)(x_i - \hat{\mu}_k)^{\top}}{n_k}, \quad \text{where} \quad n_k = \sum_{i=1}^{n} \hat{z}_{ik}.
\end{align}\]
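The M-step can be sketched in NumPy given the responsibility matrix from the E-step. A minimal illustration (`m_step` is our name; alongside the mean update, the standard GMM mixing-proportion and covariance updates are included):

```python
import numpy as np

def m_step(X, z):
    """Update pi_k, mu_k, Sigma_k from data X (n x d) and responsibilities z (n x G)."""
    n, G = X.shape[0], z.shape[1]
    nk = z.sum(axis=0)                               # n_k = sum_i z_hat_ik
    pis = nk / n                                     # pi_hat_k = n_k / n
    mus = [z[:, k] @ X / nk[k] for k in range(G)]    # mu_hat_k (weighted mean)
    Sigmas = []
    for k in range(G):
        diff = X - mus[k]
        # Sigma_hat_k: responsibility-weighted scatter around mu_hat_k
        Sigmas.append((z[:, k, None] * diff).T @ diff / nk[k])
    return pis, mus, Sigmas
```

Alternating `e_step` and `m_step` until the log-likelihood stabilizes gives the full EM iteration; each pass is guaranteed not to decrease \(\ell(\Psi)\).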