Polynomial time and private learning of unbounded Gaussian Mixture Models
Abstract
We develop a technique for privately estimating the parameters of a mixture distribution
by reducing the problem to its non-private counterpart. This technique allows us
to privatize existing non-private algorithms in a black-box manner, incurring only
a small overhead in sample complexity and running time.
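As a rough illustration of this style of reduction (not the paper's exact construction), the following Python sketch follows the subsample-and-aggregate pattern: split the data into disjoint chunks, run the non-private learner on each, and privately select an estimate that many chunks agree on. The function agree, the parameter threshold, and the Laplace calibration below are illustrative assumptions, not the mechanism analyzed in the paper.

import numpy as np

def private_populous_select(candidates, agree, eps, threshold):
    # Count, for each candidate, how many chunk estimates agree with it.
    counts = np.array([sum(agree(c, other) for other in candidates)
                       for c in candidates], dtype=float)
    # Report-noisy-max style selection; the Laplace scale here is
    # illustrative, not a calibrated privacy guarantee.
    noisy = counts + np.random.laplace(scale=2.0 / eps, size=len(counts))
    best = int(np.argmax(noisy))
    if noisy[best] < threshold:
        return None  # not enough agreement: fail rather than leak
    return candidates[best]

def privatize(nonprivate_learner, data, k, eps, agree, threshold):
    # Split the data into k disjoint chunks; changing one sample can
    # change at most one chunk's estimate, which bounds the influence
    # of any individual on the agreement counts above.
    chunks = np.array_split(data, k)
    candidates = [nonprivate_learner(chunk) for chunk in chunks]
    return private_populous_select(candidates, agree, eps, threshold)

Here nonprivate_learner can be any parameter-learning routine for mixtures; the point is that it is only ever invoked as a black box on disjoint subsamples.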
As the main application of our framework, we develop an algorithm for privately
learning mixtures of Gaussians that uses the non-private algorithm of Moitra and Valiant
[MV10] as a black box, incurring only a polynomial overhead in sample complexity
and running time. As a result, this gives the first sample complexity upper bound
and the first algorithm running in time polynomial in the dimension d for privately
learning the parameters of Gaussian mixture models without requiring any boundedness
assumptions on the parameters.
To prove these results, we introduce the Private Populous Estimator (PPE), a
generalized version of the estimator used in [AL22], to achieve (ε, δ)-differential privacy.
We also develop a new masking mechanism for a single Gaussian component, and then
give a general recipe for turning a masking mechanism for a single component into a
masking mechanism for mixtures.
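As a loose sketch of what a per-component mask and the component-to-mixture recipe might look like (the paper's actual mechanism and its noise calibration differ), consider the Python fragment below; labels and sigma_masks are hypothetical inputs introduced only for illustration.

import numpy as np

def mask_component(samples, sigma_mask, rng):
    # Hypothetical mask for a single Gaussian component: add
    # independent Gaussian noise to the points of that component.
    # The paper's mechanism and its choice of sigma_mask differ;
    # this only shows the shape of a per-component mask.
    return samples + rng.normal(scale=sigma_mask, size=samples.shape)

def mask_mixture(samples, labels, sigma_masks, rng):
    # Recipe sketch: apply the single-component mask cluster by
    # cluster. labels (an assignment of samples to components) and
    # sigma_masks are assumed given here; producing them privately
    # is part of what an actual recipe has to handle.
    out = samples.astype(float).copy()
    for i, sigma in enumerate(sigma_masks):
        idx = labels == i
        out[idx] = mask_component(samples[idx], sigma, rng)
    return out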