Transactions on Cryptographic Hardware and Embedded Systems (Mar 2024)

SHAPER: A General Architecture for Privacy-Preserving Primitives in Secure Machine Learning

  • Ziyuan Liang,
  • Qi’ao Jin,
  • Zhiyong Wang,
  • Zhaohui Chen,
  • Zhen Gu,
  • Yanheng Lu,
  • Fan Zhang

DOI
https://doi.org/10.46586/tches.v2024.i2.819-843
Journal volume & issue
Vol. 2024, no. 2

Abstract

Secure multi-party computation and homomorphic encryption are two primary security primitives in privacy-preserving machine learning, whose wide adoption is nevertheless constrained by computation and network-communication overheads. This paper proposes a hybrid Secret-sharing and Homomorphic encryption Architecture for Privacy-presERving machine learning (SHAPER). SHAPER protects sensitive data in encrypted or randomly shared domains instead of relying on a trusted third party. The proposed algorithm-protocol-hardware co-design methodology explores techniques such as plaintext Single Instruction Multiple Data (SIMD) batching and fine-grained scheduling to minimize end-to-end latency across a range of network settings. SHAPER also accelerates secure-domain computation and the conversion between mainstream privacy-preserving primitives, making it suitable for both general workloads and distinctive data characteristics. SHAPER is evaluated by FPGA prototyping with a comprehensive hyper-parameter exploration, demonstrating a 94x speed-up over CPU clusters on large-scale logistic regression training tasks.
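The abstract refers to protecting data in "randomly shared domains," i.e. secret sharing. As a point of orientation for readers unfamiliar with the primitive, the sketch below shows plain two-party additive secret sharing over the ring Z_{2^64}; this is a generic illustration of the concept, not the paper's actual protocol or parameter choices.

```python
import secrets

MOD = 1 << 64  # ring Z_{2^64}, a common modulus choice in secret-sharing protocols


def share(x: int) -> tuple[int, int]:
    """Split x into two random additive shares; each share alone reveals nothing about x."""
    r = secrets.randbelow(MOD)
    return r, (x - r) % MOD


def reconstruct(s0: int, s1: int) -> int:
    """Recombine both shares to recover the secret."""
    return (s0 + s1) % MOD


# Linear operations can be done locally on shares: each party adds its
# shares of a and b, and the sum a + b is revealed only on reconstruction.
a0, a1 = share(25)
b0, b1 = share(17)
c0, c1 = (a0 + b0) % MOD, (a1 + b1) % MOD
print(reconstruct(c0, c1))  # 42
```

Local linearity is what makes share-based computation cheap; the expensive parts (multiplication, comparison, and conversions to and from the homomorphic domain) are where architectures like SHAPER focus their acceleration.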

Keywords