This paper introduces the first provably accurate algorithms for
differentially private, top-down decision tree learning in the distributed
setting (Balcan et al., 2012). We propose DP-TopDown, a general
privacy-preserving decision tree learning algorithm, and present two
distributed implementations. Our first method, NoisyCounts, naturally extends
the single-machine algorithm by using the Laplace mechanism. Our second
method, LocalRNM, significantly reduces communication and added noise by
performing local optimization at each data holder. We provide the first
utility guarantees for
differentially private top-down decision tree learning in both the single
machine and distributed settings. These guarantees show that the error of the
privately-learned decision tree quickly goes to zero provided that the dataset
is sufficiently large. Our extensive experiments on real datasets illustrate
the trade-offs among privacy, accuracy, and generalization when learning private
decision trees in the distributed setting.
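The abstract names two standard differential-privacy primitives: the Laplace mechanism (used by NoisyCounts) and noisy-maximum selection (suggested by the name LocalRNM). Below is a minimal, self-contained sketch of those two primitives only, not the paper's DP-TopDown algorithm; the function names, the unit-sensitivity assumption, and the noise scales are illustrative assumptions.

```python
import math
import random

def laplace_noise(scale, rng):
    # Inverse-CDF sample from Laplace(0, scale).
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_counts(counts, epsilon, rng):
    # Laplace mechanism for count queries: assuming each count has
    # L1 sensitivity 1, adding Laplace(1/epsilon) noise to each count
    # gives epsilon-DP per released count.
    return [c + laplace_noise(1.0 / epsilon, rng) for c in counts]

def report_noisy_max(scores, epsilon, rng):
    # Report-noisy-max: perturb each candidate's score and release only
    # the argmax (e.g., the index of a candidate split), never the
    # noisy scores themselves.
    noisy = [s + laplace_noise(2.0 / epsilon, rng) for s in scores]
    return max(range(len(noisy)), key=noisy.__getitem__)

if __name__ == "__main__":
    rng = random.Random(0)
    print(noisy_counts([40, 60], epsilon=1.0, rng=rng))
    print(report_noisy_max([0.1, 0.9, 0.4], epsilon=1.0, rng=rng))
```

In a distributed variant, one could imagine each data holder running a selection like `report_noisy_max` locally and communicating only the winning index, which is consistent with the abstract's claim that LocalRNM reduces both communication and added noise; the exact protocol is specified in the paper, not here.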

Authors: Kaiwen Wang, Travis Dick, Maria-Florina Balcan
