Mixture

Mixture(w, comp_dists, *args, **kwargs)
    Mixture log-likelihood
NormalMixture(w, mu[, comp_shape])
    Normal mixture log-likelihood
class pymc3.distributions.mixture.Mixture(w, comp_dists, *args, **kwargs)

Mixture log-likelihood

Often used to model subpopulation heterogeneity

\[f(x \mid w, \theta) = \sum_{i = 1}^n w_i f_i(x \mid \theta_i)\]
Support \(\cap_{i = 1}^n \textrm{support}(f_i)\)
Mean \(\sum_{i = 1}^n w_i \mu_i\)
Parameters:

w : array of floats

the mixture weights, with w >= 0 and w <= 1

comp_dists : multidimensional PyMC3 distribution (e.g. pm.Poisson.dist(…)) or iterable of one-dimensional PyMC3 distributions

the component distributions, \(f_1, \ldots, f_n\)
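
Example (a minimal sketch, not taken from the docstring; the data, priors, and variable names are illustrative): a two-component Poisson mixture, showing both ways of passing comp_dists.

    import numpy as np
    import pymc3 as pm

    # Illustrative counts from two Poisson subpopulations.
    data = np.concatenate([np.random.poisson(2.0, size=200),
                           np.random.poisson(9.0, size=200)])

    with pm.Model():
        w = pm.Dirichlet('w', a=np.ones(2))                  # mixture weights
        lam = pm.Gamma('lam', alpha=2.0, beta=0.5, shape=2)  # per-component rates

        # comp_dists as a single multidimensional distribution ...
        components = pm.Poisson.dist(mu=lam, shape=(2,))
        obs = pm.Mixture('obs', w=w, comp_dists=components, observed=data)

        # ... or, equivalently, as an iterable of one-dimensional distributions:
        # comp_dists=[pm.Poisson.dist(mu=lam[0]), pm.Poisson.dist(mu=lam[1])]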

class pymc3.distributions.mixture.NormalMixture(w, mu, comp_shape=(), *args, **kwargs)

Normal mixture log-likelihood

\[f(x \mid w, \mu, \sigma^2) = \sum_{i = 1}^n w_i N(x \mid \mu_i, \sigma^2_i)\]
Support \(x \in \mathbb{R}\)
Mean \(\sum_{i = 1}^n w_i \mu_i\)
Variance \(\sum_{i = 1}^n w_i \left(\sigma^2_i + \mu^2_i\right) - \left(\sum_{i = 1}^n w_i \mu_i\right)^2\)
Parameters:

w : array of floats

the mixture weights, with w >= 0 and w <= 1

mu : array of floats

the component means

sd : array of floats

the component standard deviations

tau : array of floats

the component precisions

comp_shape : shape of the Normal component

Note that it should differ from the shape of the mixture distribution, with one extra axis whose length is the number of components.

Note: Pass either sd or tau, but not both.
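
Example (a minimal sketch, not taken from the docstring; priors, data, and names are illustrative): a three-component normal mixture over scalar observations, passing sd and an explicit comp_shape.

    import numpy as np
    import pymc3 as pm

    data = np.random.normal(loc=0.0, scale=1.0, size=300)  # placeholder observations
    k = 3  # number of components

    with pm.Model():
        w = pm.Dirichlet('w', a=np.ones(k))
        mu = pm.Normal('mu', mu=0.0, sd=10.0, shape=k)
        sd = pm.HalfNormal('sd', sd=5.0, shape=k)

        # The mixture itself is scalar-valued here, so comp_shape only
        # carries the component axis of length k.
        obs = pm.NormalMixture('obs', w=w, mu=mu, sd=sd,
                               comp_shape=(k,), observed=data)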

pymc3.distributions.mixture.all_discrete(comp_dists)

Determine if all distributions in comp_dists are discrete
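
Usage sketch (assuming, as the description implies, a boolean return value; the example distributions are illustrative):

    import pymc3 as pm
    from pymc3.distributions.mixture import all_discrete

    all_discrete([pm.Poisson.dist(3.0), pm.Poisson.dist(7.0)])      # True
    all_discrete([pm.Poisson.dist(3.0), pm.Normal.dist(0.0, 1.0)])  # False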