June 17, 2022, 1:13 a.m. | Adrian El Baz, André Carvalho, Hong Chen, Fabio Ferreira, Henry Gouk, Shell Hu, Frank Hutter, Zhengying Liu, Felix Mohr, Jan van Rijn, Xin Wang, et al.

cs.CV updates on arXiv.org

Although deep neural networks are capable of achieving performance superior
to humans on various tasks, they are notorious for requiring large amounts of
data and computing resources, restricting their success to domains where such
resources are available. Meta-learning methods can address this problem by
transferring knowledge from related tasks, thus reducing the amount of data and
computing resources needed to learn new tasks. We organize the MetaDL
competition series, which provides opportunities for research groups all over
the world to …
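To make the few-shot setting behind the challenge concrete, the sketch below shows one common way a pre-trained backbone can be reused on a new task from only a handful of labelled examples: support-set embeddings are averaged into class prototypes, and query images are classified by their nearest prototype. This is a minimal illustration under assumed inputs, not the competition's actual protocol; names such as backbone, support_x, support_y and query_x are hypothetical placeholders.

import torch

def classify_episode(backbone, support_x, support_y, query_x, n_classes):
    """Classify one few-shot episode by nearest class prototype."""
    with torch.no_grad():
        support_emb = backbone(support_x)   # (n_support, d) embeddings of labelled examples
        query_emb = backbone(query_x)       # (n_query, d) embeddings of query examples
    # One prototype per class: the mean embedding of that class's support examples.
    prototypes = torch.stack(
        [support_emb[support_y == c].mean(dim=0) for c in range(n_classes)]
    )                                        # (n_classes, d)
    # Assign each query to the class of its nearest prototype (Euclidean distance).
    dists = torch.cdist(query_emb, prototypes)   # (n_query, n_classes)
    return dists.argmin(dim=1)               # predicted class index per query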

arxiv challenge classification few-shot learning fine-tuning image learning lessons learned lg meta meta-learning neurips neurips 2021
