Sept. 30, 2022, 1:15 a.m. | Anthony Fuller, Koreen Millard, James R. Green

cs.CV updates on arXiv.org

Although the remote sensing (RS) community has begun to pretrain transformers
(intended to be fine-tuned on RS tasks), it is unclear how these models perform
under distribution shifts. Here, we pretrain a new RS transformer--called
SatViT-V2--on 1.3 million satellite-derived RS images, then fine-tune it (along
with five other models) to investigate how it performs on distributions not
seen during training. We split an expertly labeled land cover dataset into 14
datasets based on source biome. We train each model on …
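The abstract describes splitting an expertly labeled land-cover dataset into 14 datasets keyed on source biome before fine-tuning each model. Below is a minimal sketch of that grouping step only; the record layout and field names (`biome`, `image_path`, `label`) are illustrative assumptions, and the exact train/evaluate pairing across biomes is cut off in the excerpt above.

```python
# Hypothetical sketch: group labeled land-cover samples into one dataset per
# source biome, as the abstract describes. Field names are assumptions.
from collections import defaultdict

def split_by_biome(records):
    """Return a dict mapping each biome name to its list of samples."""
    by_biome = defaultdict(list)
    for rec in records:
        by_biome[rec["biome"]].append(rec)
    return dict(by_biome)

# Toy usage: two biomes stand in for the paper's 14.
records = [
    {"image_path": "a.tif", "label": "forest", "biome": "boreal"},
    {"image_path": "b.tif", "label": "cropland", "biome": "temperate"},
]
for biome, subset in split_by_biome(records).items():
    # Each per-biome subset would then be used for fine-tuning/evaluation;
    # the abstract excerpt is truncated before specifying the pairing.
    print(biome, len(subset))
```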

arxiv, remote sensing, transfer learning, transformers
