An Empirical Study Of Self-supervised Learning Approaches For Object Detection With Transformers. (arXiv:2205.05543v1 [cs.CV])
cs.LG updates on arXiv.org
Self-supervised learning (SSL) methods such as masked language modeling have
shown massive performance gains by pretraining transformer models for a variety
of natural language processing tasks. Follow-up research adapted similar
methods, such as masked image modeling, to vision transformers and demonstrated
improvements on image classification tasks. Such simple self-supervised
methods have not been exhaustively studied for object detection transformers (DETR,
Deformable DETR), as their transformer encoder modules take input in the
convolutional neural network (CNN) extracted feature space rather than …
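The distinction the abstract draws can be sketched as follows: a DETR-style encoder consumes tokens flattened from a CNN feature map, so a masked-modeling objective would corrupt feature tokens rather than raw image patches. This is a minimal, hypothetical NumPy sketch (shapes and the mask ratio are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed backbone output: a (channels, height, width) feature map,
# e.g. the last-stage features of a ResNet.
C, H, W = 256, 7, 7
feat = rng.standard_normal((C, H, W))

# Flatten the spatial grid into a token sequence of shape (H*W, C),
# the form a DETR-style transformer encoder receives.
tokens = feat.reshape(C, H * W).T

# Mask a fraction of the feature tokens; an SSL objective would train
# the encoder to reconstruct the original features at masked positions.
mask_ratio = 0.5
n_mask = int(mask_ratio * tokens.shape[0])
masked_idx = rng.choice(tokens.shape[0], size=n_mask, replace=False)

corrupted = tokens.copy()
corrupted[masked_idx] = 0.0  # zero vector stands in for a [MASK] token

# Reconstruction targets are the original features at the masked positions.
target = tokens[masked_idx]
```

The point of the sketch is only that the masking operates on CNN-extracted feature tokens, not on raw pixel patches as in masked image modeling for plain vision transformers.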
arxiv cv detection learning self-supervised learning study supervised learning transformers