Conditional set generation using Seq2seq models. (arXiv:2205.12485v1 [cs.CL])
May 26, 2022, 1:11 a.m. | Aman Madaan, Dheeraj Rajagopal, Niket Tandon, Yiming Yang, Antoine Bosselut
cs.CL updates on arXiv.org arxiv.org
Conditional set generation learns a mapping from an input sequence of tokens
to a set. Several NLP tasks, such as entity typing and dialogue emotion
tagging, are instances of set generation. Sequence-to-sequence (Seq2seq) models
are a popular choice to model set generation, but they treat a set as a
sequence and do not fully leverage its key properties, namely order-invariance
and cardinality. We propose a novel algorithm for effectively sampling
informative orders over the combinatorial space of label orders. Further, we …
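The two set properties the abstract highlights can be made concrete. Below is a minimal sketch (not the paper's algorithm): an order-invariant evaluation check, and a uniform-sampling baseline that turns one label set into several ordered Seq2seq training targets. The paper proposes sampling *informative* orders; plain random permutations are shown here only to illustrate the setup, and all function names are illustrative assumptions.

```python
import math
import random

def set_match(pred_labels, gold_labels):
    """Order-invariant comparison: a predicted label sequence is correct
    iff it covers exactly the gold set, regardless of generation order."""
    return set(pred_labels) == set(gold_labels)

def sample_label_orders(labels, k, seed=0):
    """Baseline sketch: sample up to k distinct orderings of a label set
    to serve as Seq2seq targets. Uniform sampling stands in for the
    informative-order sampling the paper actually proposes."""
    rng = random.Random(seed)
    limit = min(k, math.factorial(len(labels)))  # can't exceed n! distinct orders
    orders = set()
    while len(orders) < limit:
        orders.add(tuple(rng.sample(labels, len(labels))))
    return [list(o) for o in orders]
```

With dialogue emotion tagging as the running example, `set_match(["joy", "anger"], ["anger", "joy"])` is true because only set membership matters, while the sampled orderings expose the model to several valid target sequences for the same set.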