April 24, 2023, 12:45 a.m. | Ziyu Wang, Yuting Wu, Yongmo Park, Sangmin Yoo, Xinxin Wang, Jason K. Eshraghian, Wei D. Lu

cs.LG updates on arXiv.org

Analog compute-in-memory (CIM) accelerators are becoming increasingly popular
for deep neural network (DNN) inference due to their energy efficiency and
in-situ vector-matrix multiplication (VMM) capabilities. However, as the use of
DNNs expands, protecting user input privacy has become increasingly important.
In this paper, we identify a security vulnerability in which an adversary can
reconstruct the user's private input data via a power side-channel attack,
given appropriate data acquisition and pre-processing, even without knowledge
of the DNN model. We further demonstrate a …
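The abstract does not spell out the reconstruction procedure, but the underlying premise can be illustrated with a toy model: if the power drawn while an analog crossbar computes a vector-matrix product correlates (approximately linearly) with the private input vector, then repeated power measurements form a linear system the attacker can invert. The sketch below is a minimal, hypothetical simulation of that idea, assuming a linear leakage model and a calibrated sensitivity matrix `A`; it is not the attack described in the paper.

```python
import numpy as np

# Toy sketch of a linear power-leakage model for an analog CIM crossbar.
# Assumption (not from the paper): each sampled VMM operation t draws
# power p_t ≈ a_t · x + noise, where x is the user's private input.
# With enough samples, least squares recovers an estimate of x even
# though the attacker never learns the DNN weights themselves.

rng = np.random.default_rng(0)

n_inputs = 64      # length of the private input vector (hypothetical)
n_samples = 256    # number of power samples the adversary collects

x_true = rng.uniform(0.0, 1.0, n_inputs)          # private input (unknown to attacker)
A = rng.uniform(0.0, 1.0, (n_samples, n_inputs))  # assumed per-sample power sensitivities
noise = 0.01 * rng.standard_normal(n_samples)

power_trace = A @ x_true + noise                  # measured power side-channel

# Least-squares reconstruction of the input from the power trace.
x_hat, *_ = np.linalg.lstsq(A, power_trace, rcond=None)

print("relative reconstruction error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

In this simplified setting the reconstruction error is small; the paper's contribution concerns making such recovery practical on real CIM hardware, where the leakage model must be learned from measurements rather than assumed.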

