Nov. 9, 2022, 2:12 a.m. | François Charton

cs.LG updates on arXiv.org

Transformers can learn to perform numerical computations from examples only.
I study nine problems of linear algebra, from basic matrix operations to
eigenvalue decomposition and inversion, and introduce and discuss four encoding
schemes to represent real numbers. On all problems, transformers trained on
sets of random matrices achieve high accuracies (over 90%). The models are
robust to noise, and can generalize out of their training distribution. In
particular, models trained to predict Laplace-distributed eigenvalues
generalize to different classes of matrices: …
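The abstract mentions four encoding schemes for real numbers but does not spell them out here. As a rough illustration only, the sketch below shows one plausible base-10 sign/mantissa/exponent tokenization and a simple row-major matrix flattening. The token names, the three-digit mantissa precision, and the dimension-prefix convention are assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch (not the paper's exact scheme): encode a real number as
# [sign, mantissa digits, exponent] tokens, and a matrix as a flat token sequence.
import math

def encode_float(x: float, mantissa_digits: int = 3) -> list[str]:
    """Encode a real number as sign, k mantissa digits, and a power-of-ten exponent token."""
    if x == 0.0:
        return ["+", *["0"] * mantissa_digits, "E0"]
    sign = "+" if x > 0 else "-"
    exp = math.floor(math.log10(abs(x)))
    # Scale the mantissa to k digits and round.
    mant = round(abs(x) / 10 ** (exp - mantissa_digits + 1))
    if mant >= 10 ** mantissa_digits:   # rounding overflow, e.g. 9.999 -> 10.0
        mant //= 10
        exp += 1
    digits = list(str(mant).zfill(mantissa_digits))
    return [sign, *digits, f"E{exp - mantissa_digits + 1}"]

def encode_matrix(rows: list[list[float]]) -> list[str]:
    """Flatten a matrix into tokens: dimensions first, then entries in row-major order."""
    tokens = [f"V{len(rows)}", f"V{len(rows[0])}"]
    for row in rows:
        for x in row:
            tokens += encode_float(x)
    return tokens

if __name__ == "__main__":
    print(encode_float(3.14159))                      # ['+', '3', '1', '4', 'E-2'] i.e. 314 * 10^-2
    print(encode_matrix([[1.0, -0.5], [0.25, 2.0]]))  # dimension tokens followed by entry tokens
```

A sequence-to-sequence transformer trained on pairs of such token sequences (input matrix, target result) is the setup the abstract describes; the precision of the mantissa trades off vocabulary size and sequence length against numerical accuracy.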

