May 8, 2023, 12:46 a.m. | H M Dipu Kabir

cs.CV updates on arXiv.org

Multitask learning is a popular approach to training high-performing neural
networks with improved generalization. In this paper, we propose a background
class to achieve improved generalization at a lower computational cost than
multitask learning, helping researchers and organizations with limited
computation power. We also present a methodology for selecting background
images and discuss potential future improvements. We apply our approach to
several datasets and achieve improved generalization at a much lower
computational cost. We also investigate class activation mappings (CAMs) of …
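The truncated abstract does not include implementation detail, but the core idea (adding one extra "background" class to a single classifier rather than training additional task heads) can be sketched. Below is a minimal, hypothetical PyTorch sketch, not the authors' code: the dataset paths, ResNet-18 backbone, and hyperparameters are illustrative assumptions.

```python
# A minimal sketch (not the paper's code) of training a classifier with an
# extra "background" class. Paths, backbone, and hyperparameters are
# illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import ConcatDataset, DataLoader
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Foreground classes come from the task dataset; background images are
# task-irrelevant photos collected separately (paths are hypothetical).
fg = datasets.ImageFolder("data/task_classes", transform=tfm)
bg = datasets.ImageFolder("data/background", transform=tfm)
num_fg = len(fg.classes)

# Relabel every background image to one new class index: num_fg.
bg.samples = [(path, num_fg) for path, _ in bg.samples]
bg.targets = [num_fg] * len(bg.samples)

loader = DataLoader(ConcatDataset([fg, bg]), batch_size=32, shuffle=True)

# A single backbone with num_fg + 1 outputs: the only overhead relative to a
# plain classifier is one extra logit, whereas multitask learning would add
# a separate head and loss term per auxiliary task.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, num_fg + 1)

opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # one epoch, for brevity
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()
```

Under these assumptions, generalization pressure comes from forcing the network to separate task classes from arbitrary background imagery, at roughly the cost of single-task training.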
