Web: http://arxiv.org/abs/2205.05967

May 13, 2022, 1:10 a.m. | S.H.Shabbeer Basha, Debapriya Tula, Sravan Kumar Vinakota, Shiv Ram Dubey

cs.CV updates on arXiv.org arxiv.org

Transfer learning enables Convolutional Neural Networks (CNNs) to acquire
knowledge from a source domain and transfer it to a target domain, where
collecting large-scale annotated examples is both time-consuming and expensive.
Conventionally, when transferring the knowledge learned from one task to
another, the deeper layers of a pre-trained CNN are fine-tuned over the
target dataset. However, these layers, which were originally designed for the
source task, are over-parameterized for the target task. Thus, fine-tuning these
layers over the target …

