
PRIVATE DENSITY ESTIMATION FOR MIXTURE DISTRIBUTIONS AND GAUSSIAN MIXTURE MODELS

dc.contributor.advisor: Ashtiani, Hassan
dc.contributor.author: Afzali Kharkouei, Mohammad
dc.contributor.department: Computing and Software
dc.date.accessioned: 2024-05-07T19:52:44Z
dc.date.available: 2024-05-07T19:52:44Z
dc.date.issued: 2024
dc.description.abstract: We develop a general technique for estimating (mixture) distributions under the constraint of differential privacy (DP). At a high level, we show that if a class of distributions (such as Gaussians) is (1) list decodable and (2) admits a "locally small" cover (Bun et al., 2021) with respect to total variation distance, then the class of its mixtures is privately learnable. The proof circumvents a known barrier indicating that, unlike Gaussians, GMMs do not admit a locally small cover (Aden-Ali et al., 2021b). As the main application, we study the problem of privately estimating mixtures of Gaussians. Our main result is that poly(k, d, 1/α, 1/ε, log(1/δ)) samples suffice to estimate a mixture of k Gaussians in R^d up to total variation distance α while satisfying (ε, δ)-DP. This is the first finite sample complexity upper bound for the problem that does not make any structural assumptions on the GMMs.
dc.description.degree: Master of Science (MSc)
dc.description.degreetype: Thesis
dc.description.layabstract: The problem of distribution learning, also known as density estimation, has been studied extensively in statistics over several decades. Given a set of samples from a distribution belonging to a known family, the task is to recover the original distribution with minimal error. More recently, a line of research has emerged on private distribution learning, which aims to learn a class of distributions while safeguarding the privacy of individuals in the dataset by satisfying the gold standard of differential privacy. A fundamental open question in this area is: is there a class of distributions that can be learned without privacy considerations but cannot be learned privately? To address this question, we study the private learnability of the class of mixtures of Gaussians, a rich and complex family of distributions.
dc.identifier.uri: http://hdl.handle.net/11375/29765
dc.language.iso: en_US
dc.subject: Distribution Learning
dc.subject: Differential Privacy
dc.subject: Gaussian Mixture Models
dc.subject: Density Estimation
dc.title: PRIVATE DENSITY ESTIMATION FOR MIXTURE DISTRIBUTIONS AND GAUSSIAN MIXTURE MODELS
dc.type: Thesis
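
As a reading aid for the abstract above, the privacy notion and the main guarantee can be written out explicitly. This is a minimal sketch using the standard (ε, δ)-differential privacy definition and the notation from the abstract; the constant-success-probability convention is an assumption here, not something stated in the record.

% (epsilon, delta)-differential privacy for a randomized algorithm M:
\[
\Pr[M(S) \in E] \;\le\; e^{\varepsilon}\,\Pr[M(S') \in E] + \delta
\qquad \text{for all neighbouring datasets } S, S' \text{ and all events } E.
\]
% Main guarantee as stated in the abstract, with f the unknown mixture of k Gaussians in R^d
% and \hat{f} the output of the (epsilon, delta)-DP estimator:
\[
n \;=\; \mathrm{poly}\!\left(k,\; d,\; \tfrac{1}{\alpha},\; \tfrac{1}{\varepsilon},\; \log\tfrac{1}{\delta}\right)
\quad\Longrightarrow\quad
d_{\mathrm{TV}}\!\left(\hat{f},\, f\right) \;\le\; \alpha
\]
(with, by the usual convention, at least constant probability of success).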

Files

Original bundle
Name: Afzali Kharkouei_Mohammad_202404_MSc.pdf
Size: 477.71 KB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.68 KB
Description: Item-specific license agreed upon to submission