Technical Report for ICCV 2021 Challenge SSLAD-Track3B: Transformers Are Better Continual Learners. (arXiv:2201.04924v1 [cs.CV])
Jan. 14, 2022, 2:10 a.m. | Duo Li, Guimei Cao, Yunlu Xu, Zhanzhan Cheng, Yi Niu
cs.CV updates on arXiv.org
For the SSLAD-Track 3B challenge on continual learning, we propose COntinual Learning with Transformer (COLT). We find that transformers suffer less from catastrophic forgetting than convolutional neural networks. The main principle of our method is to equip the transformer-based feature extractor with old-knowledge distillation and head-expanding strategies to combat catastrophic forgetting. In this report, we first introduce the overall framework of continual learning for object detection. Then, we analyse the key elements' effect …
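The abstract only names the ingredients; as a rough illustration of the general idea of old-knowledge distillation (not the paper's released code), the sketch below keeps a frozen copy of the model from previous tasks and penalizes the current model for drifting from its outputs. All names (`new_model`, `old_model`, `task_loss_fn`, the loss weights) are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def distillation_loss(new_logits, old_logits, temperature=2.0):
    """KL divergence between softened outputs of the new and the frozen old model."""
    new_log_probs = F.log_softmax(new_logits / temperature, dim=-1)
    old_probs = F.softmax(old_logits / temperature, dim=-1)
    return F.kl_div(new_log_probs, old_probs, reduction="batchmean") * temperature ** 2

def training_step(new_model, old_model, images, targets, task_loss_fn, alpha=0.5):
    # Outputs of the current model on the new task's batch.
    new_logits = new_model(images)
    # The old model is a frozen snapshot from earlier tasks; no gradients flow into it.
    with torch.no_grad():
        old_logits = old_model(images)
    # Current-task loss plus a distillation term that discourages forgetting
    # of responses learned on previous tasks.
    return task_loss_fn(new_logits, targets) + alpha * distillation_loss(new_logits, old_logits)
```

This is a generic continual-learning distillation pattern; the head-expanding strategy mentioned in the abstract (adding new output heads per task) is not shown here.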