March 5, 2024, 2:41 p.m. | Milin Zhang, Mohammad Abdi, Shahriar Rifat, Francesco Restuccia

cs.LG updates on arXiv.org

arXiv:2403.00942v1 Announce Type: new
Abstract: Distributed deep neural networks (DNNs) have emerged as a key technique to reduce communication overhead without sacrificing performance in edge computing systems. Recently, entropy coding has been introduced to further reduce the communication overhead. The key idea is to train the distributed DNN jointly with an entropy model, which is used as side information during inference time to adaptively encode latent representations into bit streams with variable length. To the best of our knowledge, the …
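To make the joint-training idea concrete, below is a minimal sketch, not the paper's implementation: a DNN split into an on-device head and a server-side tail, trained jointly with a simple per-channel Gaussian entropy model that estimates the bit cost of the quantized latents. All names (SplitDNN, EntropyModel, lambda_rate) and the factorized-Gaussian choice are illustrative assumptions; the additive-uniform-noise quantization proxy is the standard trick from learned compression, not necessarily the authors' method.

```python
# Hedged sketch: split DNN + entropy model trained jointly.
# Assumptions: Gaussian entropy model, uniform-noise quantization proxy,
# rate-distortion loss L = task_loss + lambda_rate * bits.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class EntropyModel(nn.Module):
    """Per-channel Gaussian entropy model (side information):
    estimates the bit cost of the quantized latent representation."""
    def __init__(self, channels):
        super().__init__()
        self.mean = nn.Parameter(torch.zeros(channels))
        self.log_scale = nn.Parameter(torch.zeros(channels))

    def rate(self, y_hat):
        # Probability mass of each quantized symbol under N(mean, scale),
        # integrated over the quantization bin [y - 0.5, y + 0.5].
        scale = self.log_scale.exp().view(1, -1, 1, 1)
        mean = self.mean.view(1, -1, 1, 1)
        upper = 0.5 * (1 + torch.erf((y_hat + 0.5 - mean) / (scale * math.sqrt(2))))
        lower = 0.5 * (1 + torch.erf((y_hat - 0.5 - mean) / (scale * math.sqrt(2))))
        p = (upper - lower).clamp_min(1e-9)
        return -torch.log2(p).sum() / y_hat.size(0)  # estimated bits per sample

class SplitDNN(nn.Module):
    def __init__(self, num_classes=10, latent_ch=16):
        super().__init__()
        self.head = nn.Sequential(  # runs on the edge device
            nn.Conv2d(3, latent_ch, 3, stride=2, padding=1), nn.ReLU())
        self.entropy = EntropyModel(latent_ch)
        self.tail = nn.Sequential(  # runs on the server
            nn.Conv2d(latent_ch, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes))

    def forward(self, x):
        y = self.head(x)
        # Additive uniform noise as a differentiable stand-in for
        # quantization during training; hard rounding at inference.
        if self.training:
            y_hat = y + torch.empty_like(y).uniform_(-0.5, 0.5)
        else:
            y_hat = torch.round(y)
        return self.tail(y_hat), self.entropy.rate(y_hat)

# Joint training step: lambda_rate trades task accuracy for
# communication overhead (shorter bit streams).
model = SplitDNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(8, 3, 32, 32)
target = torch.randint(0, 10, (8,))
logits, bits = model(x)
loss = F.cross_entropy(logits, target) + 1e-3 * bits
loss.backward()
opt.step()
```

At inference, the learned entropy model would drive an actual entropy coder (e.g., arithmetic coding) so that likelier symbols cost fewer bits, which is what makes the bit streams variable-length.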
