Please use this identifier to cite or link to this item:
http://hdl.handle.net/11375/29765
Title: PRIVATE DENSITY ESTIMATION FOR MIXTURE DISTRIBUTIONS AND GAUSSIAN MIXTURE MODELS
Authors: Afzali Kharkouei, Mohammad
Advisor: Ashtiani, Hassan
Department: Computing and Software
Keywords: Distribution Learning; Differential Privacy; Gaussian Mixture Models; Density Estimation
Publication Date: 2024
Abstract: We develop a general technique for estimating (mixture) distributions under the constraint of differential privacy (DP). At a high level, we show that if a class of distributions (such as Gaussians) is (1) list decodable and (2) admits a “locally small” cover (Bun et al., 2021) with respect to total variation distance, then the class of its mixtures is privately learnable. The proof circumvents a known barrier indicating that, unlike Gaussians, GMMs do not admit a locally small cover (Aden-Ali et al., 2021b). As the main application, we study the problem of privately estimating mixtures of Gaussians. Our main result is that poly(k, d, 1/α, 1/ε, log(1/δ)) samples are sufficient to estimate a mixture of k Gaussians in R^d up to total variation distance α while satisfying (ε, δ)-DP. This is the first finite sample complexity upper bound for the problem that does not make any structural assumptions on the GMMs.
URI: http://hdl.handle.net/11375/29765
Appears in Collections: Open Access Dissertations and Theses
Files in This Item:
File | Size | Format
---|---|---
Afzali Kharkouei_Mohammad_202404_MSc.pdf | 477.71 kB | Adobe PDF
Items in MacSphere are protected by copyright, with all rights reserved, unless otherwise indicated.