Sept. 30, 2022, 1:15 a.m. | Anthony Fuller, Koreen Millard, James R. Green

cs.CV updates on arXiv.org

Although the remote sensing (RS) community has begun to pretrain transformers
(intended to be fine-tuned on RS tasks), it is unclear how these models perform
under distribution shifts. Here, we pretrain a new RS transformer, called
SatViT-V2, on 1.3 million satellite-derived RS images, then fine-tune it (along
with five other models) to investigate how it performs on distributions not
seen during training. We split an expertly labeled land cover dataset into 14
datasets based on source biome. We train each model on …

arxiv remote sensing transfer transfer learning transformers
