July 7, 2022, 1:10 a.m. | Samuel Cognolato, Alberto Testolin

cs.LG updates on arXiv.org

Mathematical reasoning is one of the most impressive achievements of human
intellect but remains a formidable challenge for artificial intelligence
systems. In this work we explore whether modern deep learning architectures can
learn to solve a symbolic addition task by discovering effective arithmetic
procedures. Although the problem might seem trivial at first glance,
generalizing arithmetic knowledge to operations involving a higher number of
terms, possibly composed of longer sequences of digits, has proven extremely
challenging for neural networks. Here we …
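The full text is not included here, but the symbolic addition task the abstract describes can be illustrated with a minimal data-generation sketch. The tokenization scheme below (one character per token, `+` as a separator) is an assumption for illustration, not the authors' exact format; it shows how in-distribution examples differ from the out-of-distribution probes with more terms and longer operands that the abstract calls challenging.

```python
import random

def make_example(n_terms, n_digits, rng=random):
    """Generate one symbolic addition example as (input_tokens, target_tokens).

    Each operand is rendered digit by digit, so a model must learn the
    carry procedure rather than memorize whole numbers.
    """
    terms = [rng.randrange(10 ** (n_digits - 1), 10 ** n_digits)
             for _ in range(n_terms)]
    src = "+".join(str(t) for t in terms)   # e.g. "123+456"
    tgt = str(sum(terms))                   # e.g. "579"
    return list(src), list(tgt)

# In-distribution training data: 2 terms of 3 digits each.
train = [make_example(2, 3) for _ in range(1000)]
# Out-of-distribution probes: more terms and longer operands --
# the generalization regime the abstract highlights.
probe = [make_example(4, 6) for _ in range(100)]
```

Length generalization is then measured by training only on the short regime and evaluating exact-match accuracy on the longer probes.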

