Dual Lottery Ticket Hypothesis. (arXiv:2203.04248v1 [cs.LG])
March 9, 2022, 2:11 a.m. | Yue Bai, Huan Wang, Zhiqiang Tao, Kunpeng Li, Yun Fu
cs.LG updates on arXiv.org
Fully exploiting the learning capacity of neural networks requires
overparameterized dense networks. On the other hand, directly training sparse
neural networks typically yields unsatisfactory performance. The Lottery Ticket
Hypothesis (LTH) offers a novel view of sparse network training that maintains
capacity: it claims that a randomly initialized network contains winning
tickets, sparse subnetworks found by iterative magnitude pruning that preserve
promising trainability (i.e., remain in a trainable condition). In this work,
we regard the winning ticket from …
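The iterative magnitude pruning procedure mentioned in the abstract can be sketched as follows. This is a minimal illustrative sketch in NumPy, not the paper's implementation: `magnitude_prune` and the toy "training" update are hypothetical stand-ins, and real LTH experiments train a full network between pruning rounds before rewinding the surviving weights to their initial values.

```python
import numpy as np

def magnitude_prune(weights, mask, prune_frac):
    """One iterative-magnitude-pruning step: among the weights still
    active in `mask`, zero out the `prune_frac` smallest-magnitude ones."""
    active = np.flatnonzero(mask)
    k = int(len(active) * prune_frac)
    # indices of the k smallest-magnitude active weights
    drop = active[np.argsort(np.abs(weights[active]))[:k]]
    new_mask = mask.copy()
    new_mask[drop] = 0
    return new_mask

rng = np.random.default_rng(0)
init = rng.normal(size=100)           # random initialization
mask = np.ones(100, dtype=np.int8)    # all weights start active
weights = init.copy()
for _ in range(3):                    # 3 rounds, pruning 20% each time
    # stand-in for a training run (real IMP trains to convergence here)
    weights = weights + 0.01 * rng.normal(size=100) * mask
    mask = magnitude_prune(weights, mask, 0.2)
    weights = init * mask             # rewind: winning ticket = init ∩ mask
```

After the loop, `weights` holds the candidate winning ticket: the original initialization restricted to the surviving mask, which the hypothesis claims trains to accuracy comparable to the dense network.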