Revisiting Adversarial Training at Scale
April 23, 2024, 4:48 a.m. | Zeyu Wang, Xianhang Li, Hongru Zhu, Cihang Xie
cs.CV updates on arXiv.org arxiv.org
Abstract: The machine learning community has witnessed a drastic change in the training pipeline, driven by "foundation models" of unprecedented scale. However, the field of adversarial training is lagging behind, remaining predominantly centered on small model sizes like ResNet-50 and on tiny, low-resolution datasets like CIFAR-10. To bridge this transformation gap, this paper provides a modern re-examination of adversarial training, investigating its potential benefits when applied at scale. Additionally, we introduce an efficient and effective training …
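For readers unfamiliar with the technique the abstract discusses: adversarial training alternates an inner step that perturbs each input to maximize the loss with an outer step that updates the model on those perturbed inputs. The sketch below is a minimal, illustrative example using a single FGSM inner step on a toy logistic-regression model; it is not the paper's method, and all data, names, and hyperparameters here are assumptions for demonstration.

```python
import numpy as np

# Minimal adversarial-training sketch (FGSM inner step) on a toy
# logistic-regression model. Purely illustrative; the paper studies
# this idea at foundation-model scale.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy binary-classification data: two Gaussian blobs.
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.concatenate([np.zeros(100), np.ones(100)])

w = np.zeros(2)
b = 0.0
lr, eps = 0.1, 0.1  # learning rate, adversarial perturbation budget

for _ in range(200):
    # Inner step: craft FGSM adversarial inputs x' = x + eps * sign(dL/dx).
    p = sigmoid(X @ w + b)
    grad_x = np.outer(p - y, w)   # dL/dx for the logistic loss
    X_adv = X + eps * np.sign(grad_x)
    # Outer step: gradient update of the model on the perturbed inputs.
    p_adv = sigmoid(X_adv @ w + b)
    w -= lr * X_adv.T @ (p_adv - y) / len(y)
    b -= lr * np.mean(p_adv - y)

# Clean accuracy of the adversarially trained model.
acc = np.mean((sigmoid(X @ w + b) > 0.5) == y)
```

Stronger inner maximizers (e.g. multi-step PGD) follow the same loop structure, just with several projected gradient steps replacing the single `sign` step.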