Understanding The Effectiveness of Lossy Compression in Machine Learning Training Sets
March 26, 2024, 4:41 a.m. | Robert Underwood, Jon C. Calhoun, Sheng Di, Franck Cappello
cs.LG updates on arXiv.org
Abstract: Machine Learning and Artificial Intelligence (ML/AI) techniques have become increasingly prevalent in high performance computing (HPC). However, these methods depend on vast volumes of floating-point data for training and validation, which must be shared over a wide area network (WAN) or transferred from edge devices to data centers. Data compression can address these problems, but an in-depth understanding of how lossy compression affects model quality is needed. …
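To make the idea of lossy compression of floating-point training data concrete, here is a minimal illustrative sketch (not the paper's method): a simple lossy scheme that zeroes low-order mantissa bits of float32 values, trading precision for compressibility, and then measures the reconstruction error this introduces on synthetic data.

```python
import numpy as np

def truncate_mantissa(a: np.ndarray, keep_bits: int) -> np.ndarray:
    """Keep only the top `keep_bits` of the 23-bit float32 mantissa.

    Zeroed low-order bits make the byte stream far more compressible,
    at the cost of a bounded relative error of at most 2**-keep_bits.
    This is a toy stand-in for real error-bounded lossy compressors.
    """
    assert 0 <= keep_bits <= 23
    mask = np.uint32(0xFFFFFFFF) << np.uint32(23 - keep_bits)
    bits = a.astype(np.float32).view(np.uint32)
    return (bits & mask).view(np.float32)

# Synthetic stand-in for a floating-point training set.
rng = np.random.default_rng(0)
data = rng.standard_normal(10_000).astype(np.float32)

for keep in (16, 8, 4):
    approx = truncate_mantissa(data, keep)
    max_rel_err = np.max(np.abs(approx - data) / np.abs(data))
    print(f"{keep:2d} mantissa bits kept -> max relative error {max_rel_err:.2e}")
```

Fewer retained mantissa bits mean higher compression ratios downstream (long runs of zero bits compress well with a lossless back end) but larger perturbations to the data the model trains on, which is exactly the quality trade-off the paper studies.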