Aug. 10, 2023, 4:43 a.m. | Randy Harsuko, Tariq Alkhalifah

cs.LG updates on arXiv.org

StorSeismic is a recently introduced Transformer-based model that adapts to
various seismic processing tasks through a pretraining and fine-tuning
strategy. The original implementation of StorSeismic used sinusoidal
positional encoding and the conventional self-attention mechanism, both
borrowed from natural language processing (NLP). These components produced
good results for seismic processing, but they also hinted at limitations in
efficiency and expressiveness. We propose modifications to these two key
components by utilizing relative positional encoding and low-rank attention …
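The abstract is truncated before the details of these modifications, but both named components are well-established Transformer variants. Below is a minimal PyTorch sketch of each, assuming a T5-style learned relative position bias and a Linformer-style low-rank key/value projection; all class names, parameters, and dimensions are illustrative assumptions, not StorSeismic's actual implementation.

```python
import torch
import torch.nn as nn

class RelativeBiasAttention(nn.Module):
    """Self-attention with a learned relative positional bias (T5-style)."""

    def __init__(self, d_model=256, n_heads=8, max_len=512):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        # One learned scalar per head for every query-key offset in
        # [-(max_len - 1), max_len - 1]; replaces absolute sinusoidal PE.
        self.rel_bias = nn.Parameter(torch.zeros(n_heads, 2 * max_len - 1))
        self.max_len = max_len

    def forward(self, x):                      # x: (batch, length, d_model)
        B, L, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q, k, v = (t.view(B, L, self.n_heads, self.d_head).transpose(1, 2)
                   for t in (q, k, v))
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5  # (B, h, L, L)
        idx = torch.arange(L, device=x.device)
        offsets = idx[:, None] - idx[None, :] + self.max_len - 1
        scores = scores + self.rel_bias[:, offsets]   # add (h, L, L) bias
        y = (scores.softmax(dim=-1) @ v).transpose(1, 2).reshape(B, L, -1)
        return self.out(y)

class LowRankAttention(nn.Module):
    """Linformer-style attention: keys/values are compressed along the
    sequence axis to a fixed rank r, cutting cost from O(L^2) to O(L*r)."""

    def __init__(self, d_model=256, n_heads=8, max_len=512, rank=64):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        # Learned projections that mix L sequence positions into r slots.
        self.E = nn.Parameter(torch.randn(max_len, rank) / max_len ** 0.5)
        self.F = nn.Parameter(torch.randn(max_len, rank) / max_len ** 0.5)

    def forward(self, x):                      # x: (batch, length, d_model)
        B, L, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q, k, v = (t.view(B, L, self.n_heads, self.d_head).transpose(1, 2)
                   for t in (q, k, v))
        k = torch.einsum("bhld,lr->bhrd", k, self.E[:L])  # (B, h, r, d)
        v = torch.einsum("bhld,lr->bhrd", v, self.F[:L])
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5  # (B, h, L, r)
        y = (scores.softmax(dim=-1) @ v).transpose(1, 2).reshape(B, L, -1)
        return self.out(y)
```

Relative encodings of this kind depend only on trace offsets rather than absolute positions, and the low-rank projection addresses the quadratic cost of full self-attention over long seismic records; whether StorSeismic's actual variants match these specific forms is not stated in the truncated abstract.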
