Abstract
We introduce S-BDT: a novel (ε, δ)-differentially private distributed gradient boosted decision tree (GBDT) learner that improves the protection of single training data points (privacy) while achieving meaningful learning goals, such as accuracy or regression error (utility). S-BDT uses less noise by relying on non-spherical multivariate Gaussian noise, for which we show tight subsampling bounds for privacy amplification and which we incorporate into a Rényi filter for individual privacy accounting. We experimentally reach the same utility while saving in terms of epsilon on the Abalone regression dataset (dataset size ), on the Adult classification dataset (dataset size ), and on the Spambase classification dataset (dataset size ). Moreover, we show that when a GBDT learns a stream of data originating from different subpopulations (non-IID), S-BDT saves epsilon even further.
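The core noise-reduction idea can be illustrated with a minimal sketch. The snippet below contrasts the classic spherical Gaussian mechanism (one noise scale for all dimensions) with non-spherical Gaussian noise that uses a per-dimension scale (diagonal covariance). All function names and the per-dimension sensitivities are illustrative assumptions, not the paper's actual implementation, which additionally involves tight subsampling bounds and a Rényi filter.

```python
import numpy as np

rng = np.random.default_rng(0)

def spherical_gauss(dim, sigma, rng):
    # Classic Gaussian mechanism: the same noise scale in every dimension,
    # calibrated to the largest per-dimension sensitivity.
    return rng.normal(0.0, sigma, size=dim)

def nonspherical_gauss(sigmas, rng):
    # Non-spherical multivariate Gaussian noise: an individual scale per
    # dimension (diagonal covariance), so dimensions with smaller
    # sensitivity receive proportionally less noise.
    sigmas = np.asarray(sigmas, dtype=float)
    return rng.normal(0.0, 1.0, size=sigmas.shape) * sigmas

# Hypothetical per-dimension sensitivities for a 2-dimensional statistic
# (e.g. a gradient sum and a hessian sum in a GBDT leaf).
sensitivities = np.array([1.0, 0.1])
noise_multiplier = 2.0
noise = nonspherical_gauss(sensitivities * noise_multiplier, rng)
```

With a spherical mechanism both dimensions would be perturbed at scale `1.0 * noise_multiplier`; the non-spherical variant adds an order of magnitude less noise to the low-sensitivity dimension, which is where the utility savings in terms of epsilon come from.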
| Original language | English |
|---|---|
| Publication status | Published - 16.08.2024 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs)
- SDG 4 Quality Education
- SDG 9 Industry, Innovation, and Infrastructure
- SDG 11 Sustainable Cities and Communities
- SDG 12 Responsible Consumption and Production
- SDG 14 Life Below Water
- SDG 15 Life on Land