Volume 7 - Issue 2
Robust Decentralized Differentially Private Stochastic Gradient Descent
- István Hegedűs
MTA-SZTE Research Group on AI, University of Szeged, Szeged, Hungary
ihegedus@inf.u-szeged.hu
- Árpád Berta
MTA-SZTE Research Group on AI, University of Szeged, Szeged, Hungary
berta@inf.u-szeged.hu
- Márk Jelasity
MTA-SZTE Research Group on AI, University of Szeged, Szeged, Hungary
jelasity@inf.u-szeged.hu
Keywords: decentralized differential privacy, stochastic gradient descent, machine learning, random walks
Abstract
Stochastic gradient descent (SGD) is one of the most widely applied machine learning algorithms in unreliable,
large-scale decentralized environments. In this type of environment, data privacy is a fundamental
concern. The most popular way to investigate this topic is based on the framework of
differential privacy. However, many important implementation details and the performance of differentially
private SGD variants have not yet been completely addressed. Here, we analyze a set of
distributed differentially private SGD implementations in a system where every private data record
is stored separately by an autonomous node. The examined SGD methods apply only local computation,
and all communication carries only information protected in a differentially private manner. A
key middleware service these implementations require is the single random walk service, where a
single random walk is maintained in the face of different failure scenarios. First, we propose a robust
implementation for the decentralized single random walk service and then perform experiments to
evaluate the proposed random walk service as well as the private SGD implementations. Our main
conclusion is that the proposed differentially private SGD implementations can approximate the
performance of their original noise-free variants in faulty decentralized environments, provided the
algorithm parameters are set properly.
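To make the abstract's core notion concrete, the sketch below shows the standard recipe for one differentially private SGD update: clip the per-record gradient to bound its sensitivity, then add Gaussian noise scaled to that bound before the descent step. This is a generic illustration, not the paper's actual implementation; the function name, parameters (`clip`, `sigma`), and the use of Gaussian noise are assumptions for the example.

```python
import numpy as np

def dp_sgd_step(w, grad, lr=0.1, clip=1.0, sigma=1.0, rng=None):
    """One differentially private SGD step (illustrative sketch).

    The per-record gradient is clipped to L2 norm at most `clip`,
    which bounds its sensitivity; Gaussian noise with standard
    deviation `sigma * clip` is then added before the usual update.
    """
    rng = np.random.default_rng() if rng is None else rng
    norm = np.linalg.norm(grad)
    # Scale the gradient down only if its norm exceeds the clip bound.
    clipped = grad * min(1.0, clip / norm) if norm > 0 else grad
    # Noise magnitude is tied to the clip bound, i.e. the sensitivity.
    noisy = clipped + rng.normal(0.0, sigma * clip, size=grad.shape)
    return w - lr * noisy
```

With `sigma=0` the step reduces to ordinary clipped SGD, which matches the abstract's observation that the private variants approximate their noise-free counterparts when the noise and the other parameters are set appropriately.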