Web: http://arxiv.org/abs/2202.05022

May 12, 2022, 1:12 a.m. | Pratik Kumar, Ankita Nandi, Shantanu Chakrabartty, Chetan Singh Thakur

cs.LG updates on arXiv.org

While analog computing is attractive for implementing machine learning (ML)
processors, the paradigm requires chip-in-the-loop training for every processor
to alleviate artifacts due to device mismatch and device non-linearity.
Speeding up chip-in-the-loop training requires re-biasing the circuits in
such a way that the analog functions remain invariant across training and
inference. In this paper, we present an analog computational paradigm and
circuits using "shape" functions that remain invariant to transistor biasing
(weak, moderate, and strong inversion) and ambient temperature variation. We …
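To make the bias-invariance idea concrete, here is a minimal numerical sketch (not the paper's actual circuit or shape function) using a subthreshold differential pair as a stand-in: because the output is a ratio of the two branch currents, the tail bias current cancels, so the "shape" of the transfer curve is unchanged when the circuit is re-biased. The function name `shape_fn` and the chosen parameter values are illustrative assumptions.

```python
import numpy as np

def shape_fn(x, bias, ut=0.026):
    """Normalized differential-pair output; `bias` cancels in the ratio.

    In weak inversion the branch currents are
      i1 = bias * exp(x/ut) / (exp(x/ut) + 1)
      i2 = bias / (exp(x/ut) + 1)
    so i1 / (i1 + i2) is independent of the tail bias current.
    """
    i1 = bias * np.exp(x / ut) / (np.exp(x / ut) + 1.0)
    i2 = bias / (np.exp(x / ut) + 1.0)
    return i1 / (i1 + i2)

x = np.linspace(-0.2, 0.2, 5)
for bias in (1e-9, 1e-6, 1e-3):  # re-biasing across six decades
    print(f"bias={bias:.0e}:", np.round(shape_fn(x, bias), 4))
# Every row prints identical values: the function's shape is invariant to
# re-biasing, which is the property that would let chip-in-the-loop
# training results carry over when operating conditions change.
```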

