Nov. 7, 2022, 2:14 a.m. | Mirco Ramo, Guénolé C.M. Silvestre

cs.CV updates on arXiv.org (arxiv.org)

The Transformer architecture is shown to provide a powerful framework as an
end-to-end model for building expression trees from online handwritten gestures
corresponding to glyph strokes. In particular, the attention mechanism was
successfully used to encode, learn and enforce the underlying syntax of
expressions, creating latent representations that are correctly decoded into the
exact mathematical expression tree, providing robustness to ablated inputs and
unseen glyphs. For the first time, the encoder is fed with spatio-temporal data
tokens potentially forming an …
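To make the described setup concrete, here is a minimal PyTorch sketch of the general idea: an encoder-decoder Transformer whose encoder consumes spatio-temporal point tokens from handwritten strokes and whose decoder emits a serialized expression tree. The feature layout (x, y, t, pen-state), vocabulary size, and all dimensions are illustrative assumptions, not the authors' configuration, and positional encoding is omitted for brevity.

```python
# Sketch only: an encoder-decoder Transformer mapping spatio-temporal stroke
# tokens to a serialized expression tree. All sizes and the token vocabulary
# are assumed for illustration; this is not the paper's implementation.
import torch
import torch.nn as nn


class StrokeToTreeTransformer(nn.Module):
    def __init__(self, point_dim=4, vocab_size=128, d_model=256,
                 nhead=8, num_layers=4):
        super().__init__()
        # Project raw spatio-temporal points (e.g. x, y, t, pen-state) to d_model.
        self.input_proj = nn.Linear(point_dim, d_model)
        # Embed output-tree tokens (operators, operands, structural markers).
        self.tgt_embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, strokes, tgt_tokens):
        # strokes: (batch, num_points, point_dim); tgt_tokens: (batch, tgt_len)
        src = self.input_proj(strokes)
        tgt = self.tgt_embed(tgt_tokens)
        # Causal mask so each output token attends only to earlier ones.
        tgt_mask = self.transformer.generate_square_subsequent_mask(
            tgt_tokens.size(1)).to(strokes.device)
        hidden = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.out(hidden)  # (batch, tgt_len, vocab_size) logits


if __name__ == "__main__":
    model = StrokeToTreeTransformer()
    strokes = torch.randn(2, 120, 4)               # two sequences of 120 points
    tree_tokens = torch.randint(0, 128, (2, 30))   # serialized expression trees
    logits = model(strokes, tree_tokens)
    print(logits.shape)                            # torch.Size([2, 30, 128])
```

In such a setup the decoder would typically be trained with teacher forcing on the serialized tree and run autoregressively at inference time.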

architecture, arxiv, gesture recognition, transformer, transformer architecture
