Feb. 20, 2024, 5:41 a.m. | Chi Zhang, Man Ho Au, Siu Ming Yiu

cs.LG updates on arXiv.org

arXiv:2402.11224v1 Announce Type: new
Abstract: Replacing non-polynomial functions (e.g., non-linear activation functions such as ReLU) in a neural network with their polynomial approximations is a standard practice in privacy-preserving machine learning. The resulting neural network, called polynomial approximation of neural network (PANN) in this paper, is compatible with advanced cryptosystems to enable privacy-preserving model inference. Using ``highly precise'' approximation, state-of-the-art PANN offers similar inference accuracy as the underlying backbone model. However, little is known about the effect of approximation, and …
