Feb. 8, 2024, 5:42 a.m. | Saeed Vahidian, Mingyu Wang, Jianyang Gu, Vyacheslav Kungurtsev, Wei Jiang, Yiran Chen

cs.LG updates on arXiv.org

Dataset distillation (DD) has emerged as a widely adopted technique for crafting a synthetic dataset that captures the essential information of a training dataset, facilitating the training of accurate neural models. Its applications span various domains, including transfer learning, federated learning, and neural architecture search. The most popular methods for constructing the synthetic data rely on matching the convergence properties of a model trained on the synthetic dataset with those of a model trained on the real training dataset. However, targeting the training dataset must be thought …
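To make the convergence-matching idea concrete, below is a minimal sketch of one representative method from this family, gradient matching: the synthetic set is updated so that the gradients it induces in a model align with those induced by real batches. This is an illustration only, not the authors' method; the function name `gradient_match_step`, the cosine-distance objective, and all hyperparameters are assumptions.

```python
# Minimal gradient-matching sketch for dataset distillation (illustrative,
# not the paper's implementation). Assumes syn_x is a leaf tensor with
# requires_grad=True and syn_opt optimizes [syn_x].
import torch
import torch.nn.functional as F

def gradient_match_step(model, real_x, real_y, syn_x, syn_y, syn_opt):
    """One update of the synthetic set: align the gradients it produces
    with the gradients produced by a real-data batch."""
    params = [p for p in model.parameters() if p.requires_grad]

    # Gradients of the training loss on a real batch (detached targets).
    real_loss = F.cross_entropy(model(real_x), real_y)
    g_real = [g.detach() for g in torch.autograd.grad(real_loss, params)]

    # Gradients on the synthetic batch, kept differentiable so the
    # matching loss can backpropagate into syn_x itself.
    syn_loss = F.cross_entropy(model(syn_x), syn_y)
    g_syn = torch.autograd.grad(syn_loss, params, create_graph=True)

    # Layer-wise cosine-distance matching objective.
    match = sum(1 - F.cosine_similarity(a.flatten(), b.flatten(), dim=0)
                for a, b in zip(g_syn, g_real))

    syn_opt.zero_grad()
    match.backward()  # second-order step: updates flow into syn_x
    syn_opt.step()
    return match.item()
```

In practice such a step would be interleaved with (re-)training or re-initializing `model`, so the synthetic data matches gradients at many points along the optimization trajectory rather than at a single parameter state.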
