Input Selection for Bandwidth-Limited Neural Network Inference. (arXiv:1906.04673v2 [cs.LG] UPDATED)
Jan. 20, 2022, 2:10 a.m. | Stefan Oehmcke, Fabian Gieseke
cs.LG updates on arXiv.org
Data are often stored on centralized storage servers. This is the case,
for instance, in remote sensing and astronomy, where projects produce several
petabytes of data every year. While machine learning models are often trained
on relatively small subsets of the data, the inference phase typically requires
transferring significant amounts of data between the servers and the clients.
In many cases, the bandwidth available per user is limited, which makes
the data transfer one of the major …
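The setting described above — a client that cannot afford to download the full input before running inference — can be illustrated with a minimal sketch. This is not the paper's actual method; it simply shows the general idea of transferring only a fixed subset of input features and zero-filling the rest on the receiving side (the mask, shapes, and helper names here are hypothetical):

```python
import numpy as np

def select_inputs(x, mask):
    """Client side: keep only the selected features for transfer."""
    return x[mask]

def reconstruct(values, mask, n_features):
    """Server side: zero-fill unselected positions before inference."""
    x_hat = np.zeros(n_features)
    x_hat[mask] = values
    return x_hat

n_features = 8
# Hypothetical fixed selection: transfer every other input dimension.
mask = np.array([0, 2, 4, 6])
x = np.arange(1, n_features + 1, dtype=float)

payload = select_inputs(x, mask)                 # 4 values instead of 8
x_hat = reconstruct(payload, mask, n_features)   # model would run on x_hat

print(payload.size / x.size)  # transfer ratio: 0.5
```

In practice the selection would be learned rather than fixed, so that the retained inputs preserve as much predictive accuracy as possible for a given bandwidth budget.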