PrivateSNN: Fully Privacy-Preserving Spiking Neural Networks

How can we bring both privacy and energy efficiency to a neural system on edge devices? In this paper, we propose PrivateSNN, which aims to build low-power Spiking Neural Networks (SNNs) from a pre-trained ANN model without leaking sensitive information contained in the dataset. Here, we tackle two types of leakage: 1) Data leakage, caused when the network accesses real training data during the ANN-SNN conversion process. 2) Class leakage, caused when class-related features can be reconstructed from network parameters. To address the data leakage issue, we generate synthetic images from the pre-trained ANN and convert the ANN to an SNN using the generated images. However, the converted SNN is still vulnerable to class leakage, since its weight parameters have the same (or scaled) values as the ANN parameters. Therefore, we encrypt the SNN weights by training the SNN with a temporal spike-based learning rule. Updating weight parameters with temporal data makes the network difficult to interpret in the spatial domain. We observe that the encrypted PrivateSNN incurs only a small performance drop (less than ~5%) while providing a significant energy-efficiency gain (about 60x compared to a standard ANN). We conduct extensive experiments on various datasets, including CIFAR10, CIFAR100, and TinyImageNet, highlighting the importance of privacy-preserving SNN training.
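The three-stage pipeline described above can be pictured with a short PyTorch sketch. This is a minimal illustration under simplifying assumptions, not the authors' implementation: the CIFAR-like input shape, the `ann.features` sequential body, the plain cross-entropy synthesis objective, and the piecewise-linear surrogate gradient are all placeholders chosen for brevity.

```python
# Minimal sketch of a PrivateSNN-style pipeline (illustrative assumptions,
# not the authors' code).
import torch
import torch.nn as nn

# Stage 1: data-free image synthesis. Optimize random inputs so the frozen,
# pre-trained ANN assigns them to target classes (class-impression style);
# the real training set is never accessed.
def synthesize_images(ann, num_classes, n_per_class=8, steps=200, lr=0.05):
    ann.eval()
    labels = torch.arange(num_classes).repeat_interleave(n_per_class)
    # assumed CIFAR-like 3x32x32 inputs
    x = torch.randn(len(labels), 3, 32, 32, requires_grad=True)
    opt = torch.optim.Adam([x], lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(ann(x), labels).backward()
        opt.step()
    return x.detach(), labels

# Stage 2: ANN-SNN conversion. Weights are copied as-is; each spiking layer's
# firing threshold is set to the maximum pre-activation observed on the
# synthetic images (the usual threshold-balancing heuristic).
@torch.no_grad()
def balance_thresholds(ann, synthetic_x):
    thresholds = []
    h = synthetic_x
    for layer in ann.features:  # assumes a simple sequential feature body
        h = layer(h)
        if isinstance(layer, nn.ReLU):
            thresholds.append(h.max().item())
    return thresholds

# Stage 3: weight "encryption" by temporal fine-tuning. A surrogate gradient
# lets backprop flow through the non-differentiable spike; updating weights
# on temporal spike trains decorrelates them from the original ANN weights.
class SurrogateSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # piecewise-linear surrogate around the firing threshold
        return grad_out * torch.clamp(1.0 - v.abs(), min=0.0)
```

In a full pipeline, the copied weights and balanced thresholds would drive integrate-and-fire neurons over discrete timesteps, with `SurrogateSpike.apply` standing in for the hard spike during the temporal fine-tuning stage.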
