Web: http://arxiv.org/abs/2206.07587

June 16, 2022, 1:12 a.m. | Pere-Lluís Huguet Cabot, Abelardo Carlos Martínez Lorenzo, Roberto Navigli

cs.CL updates on arXiv.org

With the surge of Transformer models, many have investigated how attention
acts on the learned representations. However, attention remains underexplored
for specific tasks, such as Semantic Parsing. A popular approach to the formal
representation of a sentence's meaning is Abstract Meaning Representation
(AMR). Until now, the alignment between a sentence and its AMR representation
has been explored in different ways, such as through rules or via the
Expectation Maximization (EM) algorithm. In this paper, we investigate the
ability of …
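To make the alignment task concrete: one simple way to derive a sentence-to-AMR alignment from attention is to map each AMR node to the sentence token it attends to most strongly. The sketch below is purely illustrative and is not the paper's method; the attention weights, sentence, and node labels are invented for the example.

```python
import numpy as np

# Hypothetical cross-attention weights from a seq2seq AMR parser's decoder.
# Rows = AMR graph nodes (decoder steps), columns = sentence tokens.
# These numbers are made up for illustration, not taken from the paper.
attention = np.array([
    [0.2, 0.7, 0.1],  # "want-01" attends mostly to "wants"
    [0.8, 0.1, 0.1],  # "boy"     attends mostly to "boy"
    [0.1, 0.2, 0.7],  # "sleep-01" attends mostly to "sleep"
])

sentence = ["boy", "wants", "sleep"]
amr_nodes = ["want-01", "boy", "sleep-01"]

# A simple argmax heuristic: align each AMR node to the sentence token
# receiving the highest attention weight from that node's decoder step.
alignment = {node: sentence[attention[i].argmax()]
             for i, node in enumerate(amr_nodes)}
print(alignment)
```

Rule-based and EM-based aligners instead rely on lexical matching or iterative probability re-estimation; the appeal of attention-based alignment is that it falls out of the parser itself with no separate alignment model.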

