Nov. 11, 2022, 2:12 a.m. | C. Donoso-Oliva, I. Becker, P. Protopapas, G. Cabrera-Vives, Vishnu M., Harsh Vardhan

cs.LG updates on arXiv.org

Taking inspiration from natural language embeddings, we present ASTROMER, a
transformer-based model to create representations of light curves. ASTROMER was
pre-trained in a self-supervised manner, requiring no human-labeled data. We
used millions of R-band light sequences to adjust the ASTROMER weights. The
learned representation can be easily adapted to other surveys by re-training
ASTROMER on new sources. The power of ASTROMER lies in using the learned
representation to extract light curve embeddings that can enhance the training
of other models, such …
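To make the described workflow concrete, below is a minimal, illustrative sketch of the two stages the abstract mentions: self-supervised pre-training of a transformer encoder on unlabeled light curves via masked reconstruction, followed by reusing the frozen encoder to extract embeddings for a downstream model. This is not the official ASTROMER code or API; the model sizes, the masking scheme, the synthetic data, and all names (LightCurveEncoder, recon_head, etc.) are assumptions made only for illustration.

```python
# Hedged sketch of the pre-train / extract-embeddings workflow described in the
# abstract. Not the ASTROMER implementation; everything here is illustrative.
import torch
import torch.nn as nn

class LightCurveEncoder(nn.Module):
    """Transformer encoder over (time, magnitude) sequences."""
    def __init__(self, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.input_proj = nn.Linear(2, d_model)          # (time, mag) -> d_model
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, x):                                # x: (batch, seq, 2)
        return self.encoder(self.input_proj(x))          # (batch, seq, d_model)

encoder = LightCurveEncoder()
recon_head = nn.Linear(64, 1)                            # predicts masked magnitudes

# Self-supervised pre-training: mask random observations, reconstruct them.
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(recon_head.parameters()), lr=1e-3
)
for _ in range(50):                                      # toy loop on synthetic data
    x = torch.randn(32, 50, 2)                           # fake (time, mag) sequences
    mask = torch.rand(32, 50) < 0.15                     # mask ~15% of observations
    x_masked = x.clone()
    x_masked[..., 1][mask] = 0.0                         # zero out masked magnitudes
    pred = recon_head(encoder(x_masked)).squeeze(-1)
    loss = ((pred - x[..., 1]) ** 2)[mask].mean()        # loss only on masked points
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Downstream use: frozen embeddings feed a small classifier.
with torch.no_grad():
    emb = encoder(torch.randn(8, 50, 2)).mean(dim=1)     # per-curve embedding (batch, d_model)
classifier = nn.Linear(64, 5)                            # e.g. 5 variable-star classes, assumed
logits = classifier(emb)
```

In this sketch, re-training the encoder on sequences from a new survey (with the same masked-reconstruction loss) corresponds to the adaptation step the abstract describes, and only the small downstream head needs labeled data.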

arxiv astro embedding light representation transformer
