IEEE Access (Jan 2020)

Effective Activation Functions for Homomorphic Evaluation of Deep Neural Networks

  • Srinath Obla,
  • Xinghan Gong,
  • Asma Aloufi,
  • Peizhao Hu,
  • Daniel Takabi

DOI
https://doi.org/10.1109/ACCESS.2020.3017436
Journal volume & issue
Vol. 8
pp. 153098–153112

Abstract

CryptoNets and subsequent work have demonstrated the capability of homomorphic encryption (HE) in applications of private artificial intelligence (AI). In convolutional neural networks (CNNs), many computations are linear functions, such as those in the convolution layer, which can be homomorphically evaluated. However, other layers, such as the activation layer, consist of non-linear functions that cannot be evaluated homomorphically. One of the most commonly used workarounds is to approximate these non-linear functions with low-degree polynomials. Using the approximated polynomials as activation functions, however, introduces errors that can significantly reduce classification accuracy. In this paper, we present a systematic method for constructing HE-friendly activation functions for CNNs. We first determine which properties of an activation function contribute to performance by analyzing commonly used functions such as the Rectified Linear Unit (ReLU) and Sigmoid. We then compare polynomial approximation methods and search for an optimal approximation range for the polynomial activation. We also propose a novel weighted polynomial approximation method tailored to the output distribution of a batch normalization layer. Finally, we demonstrate the effectiveness of our method on several datasets, including MNIST, FMNIST, and CIFAR-10.
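
As a rough illustration of the weighted approximation idea described above, the sketch below fits a low-degree polynomial to ReLU by weighted least squares, with sample points weighted by a Gaussian density as a stand-in for the output distribution of a batch normalization layer. The degree, fitting range, and standard-normal input distribution are illustrative assumptions, not parameters reported in the paper.

    # Minimal sketch: weighted least-squares polynomial approximation of ReLU.
    # Assumption: batch-normalized inputs are roughly standard normal, so the
    # fit is weighted to be most accurate near zero, where inputs concentrate.
    import numpy as np

    def weighted_poly_approx(f, degree=2, lo=-4.0, hi=4.0, mu=0.0, sigma=1.0, n=2001):
        """Fit a polynomial of the given degree to f on [lo, hi], weighting
        each sample point by a Gaussian pdf centered at mu with std sigma."""
        x = np.linspace(lo, hi, n)
        w = np.exp(-0.5 * ((x - mu) / sigma) ** 2)  # Gaussian weights
        # np.polyfit applies weights to the unsquared residuals, so pass sqrt(w)
        # to weight the squared error by w.
        coeffs = np.polyfit(x, f(x), deg=degree, w=np.sqrt(w))
        return np.poly1d(coeffs)

    relu = lambda x: np.maximum(x, 0.0)
    p = weighted_poly_approx(relu, degree=2)
    print(p)  # inspect the fitted low-degree polynomial activation

The resulting polynomial involves only additions and multiplications, so it can be evaluated directly under homomorphic encryption in place of the original non-linear activation.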

Keywords