Please use this identifier to cite or link to this item:
http://hdl.handle.net/11375/29765
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Ashtiani, Hassan | - |
dc.contributor.author | Afzali Kharkouei, Mohammad | - |
dc.date.accessioned | 2024-05-07T19:52:44Z | - |
dc.date.available | 2024-05-07T19:52:44Z | - |
dc.date.issued | 2024 | - |
dc.identifier.uri | http://hdl.handle.net/11375/29765 | - |
dc.description.abstract | We develop a general technique for estimating (mixture) distributions under the constraint of differential privacy (DP). At a high level, we show that if a class of distributions (such as Gaussians) is (1) list decodable and (2) admits a "locally small" cover (Bun et al., 2021) with respect to total variation distance, then the class of its mixtures is privately learnable. The proof circumvents a known barrier: unlike Gaussians, GMMs do not admit a locally small cover (Aden-Ali et al., 2021b). As the main application, we study the problem of privately estimating mixtures of Gaussians. Our main result is that poly(k, d, 1/α, 1/ε, log(1/δ)) samples suffice to estimate a mixture of k Gaussians in R^d up to total variation distance α while satisfying (ε, δ)-DP. This is the first finite sample complexity upper bound for the problem that does not make any structural assumptions on the GMMs. (A formal restatement of this bound appears after the metadata table below.) | en_US |
dc.language.iso | en_US | en_US |
dc.subject | Distribution Learning | en_US |
dc.subject | Differential Privacy | en_US |
dc.subject | Gaussian Mixture Models | en_US |
dc.subject | Density Estimation | en_US |
dc.title | PRIVATE DENSITY ESTIMATION FOR MIXTURE DISTRIBUTIONS AND GAUSSIAN MIXTURE MODELS | en_US |
dc.type | Thesis | en_US |
dc.contributor.department | Computing and Software | en_US |
dc.description.degreetype | Thesis | en_US |
dc.description.degree | Master of Science (MSc) | en_US |
dc.description.layabstract | The problem of distribution learning, also known as density estimation, has been studied extensively in statistics over several decades. Given samples from a distribution that belongs to a known family, the task is to recover the underlying distribution with minimal error. More recently, a line of research has emerged on private distribution learning, which aims to learn a class of distributions while safeguarding the privacy of individuals in the dataset by satisfying differential privacy, the gold standard of privacy protection. A fundamental open question in this area is: is there a class of distributions that can be learned without privacy constraints but not under them? To address this question, we study the private learnability of the class of mixtures of Gaussians, a rich and complex family of distributions. | en_US |
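For readers who want the abstract's sample-complexity claim in symbols, a minimal LaTeX restatement follows. It is a sketch, not part of the original record: the symbols n, f, and f̂, and the "with high probability" clause, are standard conventions introduced here; the abstract itself states only the poly(·) bound.

```latex
\documentclass{article}
\usepackage{amsmath, amssymb, amsthm}
\newtheorem{theorem}{Theorem}
\begin{document}
% Formal restatement of the abstract's main bound (sketch).
% n, f, \hat{f} and the high-probability clause are conventions
% introduced here; the abstract only asserts the poly(...) bound.
\begin{theorem}[Main result, restated from the abstract]
For all $\varepsilon, \delta, \alpha \in (0,1)$, there is an
$(\varepsilon,\delta)$-differentially private algorithm that, given
\[
  n = \operatorname{poly}\!\left(k,\ d,\ \frac{1}{\alpha},\
      \frac{1}{\varepsilon},\ \log\frac{1}{\delta}\right)
\]
i.i.d.\ samples from an unknown mixture $f$ of $k$ Gaussians in
$\mathbb{R}^d$, outputs a density estimate $\hat{f}$ satisfying
$d_{\mathrm{TV}}(\hat{f}, f) \le \alpha$ with high probability.
\end{theorem}
\end{document}
```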
Appears in Collections: | Open Access Dissertations and Theses |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Afzali Kharkouei_Mohammad_202404_MSc.pdf | | 477.71 kB | Adobe PDF | View/Open |
Items in MacSphere are protected by copyright, with all rights reserved, unless otherwise indicated.