Silverman, in his 1986 book, mentions approximating distributions with Gaussian mixture models, but he doesn't go much further into the topic. I'm wondering: given an N-dimensional uniform box (and, as a further extension, any arbitrary distribution) $u(\bar{x})$, is there a neat way to approximate it with a (truncated) k-kernel Gaussian mixture model $g(\bar{x}) = \sum_{i} \omega_i N(\mu_i,\sigma^2_i)$?
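To make the setup concrete, here is a minimal sketch of one naive heuristic (my own illustration, not a method from Silverman): place equal-weight isotropic kernels on a regular grid inside the unit box and tie the bandwidth to the grid spacing. The function names and the `sigma_scale` constant are assumptions chosen for illustration.

```python
import numpy as np

def grid_gmm(k_per_dim, dim, sigma_scale=0.6):
    """Heuristic starting point: k_per_dim**dim equal-weight isotropic
    kernels on a regular grid inside the unit box [0, 1]^dim."""
    centers = (np.arange(k_per_dim) + 0.5) / k_per_dim
    means = np.stack(np.meshgrid(*[centers] * dim, indexing="ij"),
                     axis=-1).reshape(-1, dim)
    sigma = sigma_scale / k_per_dim          # bandwidth tied to grid spacing
    weights = np.full(len(means), 1.0 / len(means))
    return weights, means, sigma

def gmm_pdf(x, weights, means, sigma):
    """Evaluate g(x) = sum_i w_i N(x; mu_i, sigma^2 I) at points x of shape (n, dim)."""
    dim = means.shape[1]
    d2 = ((x[:, None, :] - means[None, :, :]) ** 2).sum(-1)  # (n, k) squared distances
    norm = (2 * np.pi * sigma ** 2) ** (dim / 2)
    return (weights * np.exp(-d2 / (2 * sigma ** 2))).sum(-1) / norm

# 4x4 grid in a 2-D unit box; the uniform density there is exactly 1
w, m, s = grid_gmm(4, 2)
center_density = gmm_pdf(np.array([[0.5, 0.5]]), w, m, s)[0]
```

Even this crude grid placement lands close to the target density of 1 in the interior of the box; the real difficulty, as below, is doing better than such heuristics in a principled way.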

I've attempted this by minimising some measurable divergence between the two, say the Hellinger distance or the Kullback–Leibler divergence, but analytical solutions don't seem plausible, and with numerical approaches the computational cost climbs far too rapidly as the number of kernels or the dimension goes up.
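For reference, the kind of numerical objective I mean can be sketched like this (my own illustration, assuming a unit box and isotropic kernels): since $\log u(\bar{x}) = 0$ on $[0,1]^N$, the KL divergence $D_{\mathrm{KL}}(u \,\|\, g)$ reduces to $-\mathbb{E}_{x \sim u}[\log g(x)]$, which is cheap to estimate by Monte Carlo but noisy, and must be re-estimated at every step of any optimiser.

```python
import numpy as np

rng = np.random.default_rng(0)

def gmm_log_pdf(x, weights, means, sigma):
    """log g(x) for an isotropic GMM, via log-sum-exp for numerical stability."""
    dim = means.shape[1]
    d2 = ((x[:, None, :] - means[None, :, :]) ** 2).sum(-1)  # (n, k)
    logp = (np.log(weights) - d2 / (2 * sigma ** 2)
            - (dim / 2) * np.log(2 * np.pi * sigma ** 2))
    m = logp.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(logp - m).sum(axis=1, keepdims=True))).ravel()

def mc_kl_uniform_to_gmm(weights, means, sigma, n=20000):
    """Monte Carlo estimate of KL(u || g) for u uniform on [0, 1]^dim:
    log u(x) = 0 on the box, so KL = -E_u[log g(x)]."""
    x = rng.random((n, means.shape[1]))
    return -gmm_log_pdf(x, weights, means, sigma).mean()

# hypothetical example: a 3x3 grid of kernels in the 2-D unit box
k = 3
centers = (np.arange(k) + 0.5) / k
means = np.stack(np.meshgrid(centers, centers), axis=-1).reshape(-1, 2)
weights = np.full(len(means), 1.0 / len(means))
kl = mc_kl_uniform_to_gmm(weights, means, sigma=0.2)
```

Feeding this estimator into a black-box optimiser over $(\omega_i, \mu_i, \sigma_i)$ is exactly where the cost blows up: the parameter count grows as $O(kN)$ and each objective evaluation needs fresh samples.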

I wonder whether any previous studies have been done on this topic, but I haven't had much success finding anything useful. If someone could kindly give me some leads, or point me to the relevant literature, it would be greatly appreciated!

Cheers!