Dec. 6, 2023, 11:19 a.m. | /u/APaperADay

r/MachineLearning | www.reddit.com

**arXiv**: [https://arxiv.org/abs/2310.17653](https://arxiv.org/abs/2310.17653)

**OpenReview**: [https://openreview.net/forum?id=m50eKHCttz](https://openreview.net/forum?id=m50eKHCttz)

**Abstract**:

>Training deep networks requires various design decisions regarding, for instance, their architecture, data augmentation, or optimization. In this work, we find these training variations to result in networks learning unique feature sets from the data. Using public model libraries comprising thousands of models trained on canonical datasets like ImageNet, we observe that for arbitrary pairings of pretrained models, one model extracts significant data context unavailable in the other -- independent of overall performance. Given any …

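The abstract's central claim is that any two pretrained models tend to hold complementary knowledge, each capturing data context the other misses, regardless of which one has the better overall accuracy. As a rough illustration only (the abstract is truncated above, and the paper's actual measure operates on learned features rather than predictions), the sketch below estimates prediction-level complementarity from per-sample correctness masks of two hypothetical ImageNet classifiers; all names and accuracy numbers are placeholders.

```python
import numpy as np

def complementary_knowledge(correct_a, correct_b):
    """Rough prediction-level proxy: how often is exactly one model right?

    correct_a, correct_b: boolean arrays with one entry per evaluation
    sample, True where the respective model's top-1 prediction is correct.
    """
    correct_a = np.asarray(correct_a, dtype=bool)
    correct_b = np.asarray(correct_b, dtype=bool)
    return {
        "only_a": float(np.mean(correct_a & ~correct_b)),   # A right, B wrong
        "only_b": float(np.mean(correct_b & ~correct_a)),   # B right, A wrong
        "both": float(np.mean(correct_a & correct_b)),
        "neither": float(np.mean(~correct_a & ~correct_b)),
    }

# Toy usage: random masks stand in for real per-sample results of two
# pretrained models on a shared validation set (e.g. ImageNet's 50k images).
rng = np.random.default_rng(0)
a = rng.random(50_000) < 0.76   # hypothetical ~76% top-1 model
b = rng.random(50_000) < 0.71   # hypothetical ~71% top-1 model
print(complementary_knowledge(a, b))
```

A nonzero `only_a` and `only_b` on real model pairs is the prediction-level analogue of the abstract's observation; the paper itself argues the effect at the level of extracted features, which this toy proxy does not capture.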