Reduction of Class Activation Uncertainty with Background Information. (arXiv:2305.03238v1 [cs.CV])
cs.CV updates on arXiv.org
Multitask learning is a popular approach to training high-performing neural
networks with improved generalization. In this paper, we propose a background
class to achieve improved generalization at a lower computational cost than
multitask learning, to help researchers and organizations with limited
computational resources. We also present a methodology for selecting background
images and discuss potential future improvements. We apply our approach to
several datasets and achieve improved generalization with much lower
computation. We also investigate class activation mappings (CAMs) of …
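The core idea of the abstract — augmenting a K-class dataset with an extra "background" class — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; the names (`add_background`, `BACKGROUND`) and the use of file-name strings as stand-ins for images are assumptions for demonstration.

```python
# Hedged sketch: extending a K-class classification dataset with one
# extra "background" class, as the abstract describes at a high level.

NUM_CLASSES = 3            # original foreground classes, indexed 0..2
BACKGROUND = NUM_CLASSES   # the extra class gets the next index (3)

def add_background(samples, background_images):
    """Append background images, each labeled with the background index.

    samples: list of (image, label) pairs for the foreground classes.
    background_images: images selected by some background-selection rule.
    """
    augmented = list(samples)
    augmented += [(img, BACKGROUND) for img in background_images]
    return augmented

# Toy usage: strings stand in for image tensors.
data = [("cat.png", 0), ("dog.png", 1), ("car.png", 2)]
augmented = add_background(data, ["forest.png", "sky.png"])

# The classifier head is then sized for K + 1 outputs.
head_size = NUM_CLASSES + 1
```

The training loop and loss are unchanged; only the label space and the final layer grow by one class, which is why the cost stays far below training a second task head as in multitask learning.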