Sept. 28, 2022, 1:15 a.m. | Youngkee Kim, Won Joon Yun, Youn Kyu Lee, Soyi Jung, Joongheon Kim

cs.CV updates on arXiv.org

In many deep neural network (DNN) applications, the difficulty of gathering
high-quality data in industrial settings hinders the practical use of DNNs. Thus,
the concept of transfer learning has emerged, which leverages the pretrained
knowledge of DNNs trained on large-scale datasets. Building on this, the paper
proposes two-stage architectural fine-tuning, inspired by neural architecture
search (NAS). One of the main ideas is mutation, which reduces the search cost
by exploiting the given architectural information. Moreover, early stopping is
considered, which cuts NAS costs by terminating …

