
SmoothNets: Optimizing CNN architecture design for differentially private deep learning

Remerscheid, Nicolas W. ; Ziller, Alexander ; Rueckert, Daniel ; Kaissis, Georgios
Abstract

The arguably most widely employed algorithm to train deep neural networks with Differential Privacy is DP-SGD, which requires clipping and noising of per-sample gradients. This introduces a reduction in model utility compared to non-private training. Empirically, it can be observed that this accuracy degradation is strongly dependent on the model architecture. We investigated this phenomenon and, by combining components which exhibit good individual performance, distilled a new model architecture termed SmoothNet, which is characterised by increased robustness to the challenges of DP-SGD training. Experimentally, we benchmark SmoothNet against standard architectures on two benchmark datasets and observe that our architecture outperforms others, reaching an accuracy of 73.5\% on CIFAR-10 at $\varepsilon=7.0$ and 69.2\% at $\varepsilon=7.0$ on ImageNette, a state-of-the-art result compared to prior architectural modifications for DP.
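The clipping-and-noising step that the abstract attributes to DP-SGD can be illustrated with a minimal NumPy sketch. This is an assumption-laden simplification (function name, defaults, and the flat-vector gradient representation are all illustrative, not from the paper); real implementations such as Opacus operate on per-sample gradients of every model parameter.

```python
import numpy as np

def dp_sgd_step(per_sample_grads, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Illustrative DP-SGD aggregation: clip each per-sample gradient to
    an L2 norm of clip_norm, sum, add Gaussian noise with standard
    deviation noise_multiplier * clip_norm, and average over the batch."""
    rng = rng or np.random.default_rng(0)
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds the clipping bound.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_sample_grads)
```

The clipping bounds each sample's influence on the update, and the noise scale is tied to that bound; together these yield the privacy guarantee, at the cost of the utility degradation the paper studies.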