SdAE: Self-distillated Masked Autoencoder. (arXiv:2208.00449v1 [cs.CV])
Aug. 2, 2022, 2:13 a.m. | Yabo Chen, Yuchen Liu, Dongsheng Jiang, Xiaopeng Zhang, Wenrui Dai, Hongkai Xiong, Qi Tian
cs.CV updates on arXiv.org arxiv.org
With the development of generative self-supervised learning (SSL)
approaches such as BeiT and MAE, learning good representations by masking
random patches of the input image and reconstructing the missing information
has attracted growing attention. However, BeiT and PeCo require a
"pre-pretraining" stage to produce discrete codebooks for representing masked
patches. MAE needs no such codebook process, but using raw pixels as
reconstruction targets may introduce an optimization gap between pre-training
and downstream tasks, in that good reconstruction quality may …
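The masking-and-reconstruction setup the abstract describes can be sketched in a few lines. Below is a minimal NumPy illustration of MAE-style random patch masking (not the authors' code; function names and the 75% mask ratio are illustrative assumptions drawn from the MAE line of work):

```python
import numpy as np

def patchify(img: np.ndarray, patch: int) -> np.ndarray:
    """Split an (H, W, C) image into flattened non-overlapping patches."""
    h, w, c = img.shape
    img = img.reshape(h // patch, patch, w // patch, patch, c)
    return img.transpose(0, 2, 1, 3, 4).reshape(-1, patch * patch * c)

def random_patch_mask(num_patches: int, mask_ratio: float, rng=None):
    """Pick patch indices to mask, MAE-style (uniform, without replacement)."""
    rng = rng or np.random.default_rng(0)
    num_masked = int(num_patches * mask_ratio)
    perm = rng.permutation(num_patches)
    masked = np.sort(perm[:num_masked])   # reconstruction targets
    visible = np.sort(perm[num_masked:])  # the only patches the encoder sees
    return visible, masked

# Example: a 32x32 RGB image with 8x8 patches -> 16 patches, 75% masked.
img = np.arange(32 * 32 * 3, dtype=np.float32).reshape(32, 32, 3)
patches = patchify(img, 8)                # shape (16, 192)
visible, masked = random_patch_mask(len(patches), 0.75)
# The decoder regresses patches[masked] against raw pixel values; using raw
# pixels as targets is the design choice the abstract's "optimization gap"
# argument is about.
```

With pixel targets (MAE), the loss is a simple regression on `patches[masked]`; the codebook-based approaches (BeiT, PeCo) instead predict discrete token ids produced by the separate "pre-pretraining" stage.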