Conditional set generation using Seq2seq models. (arXiv:2205.12485v2 [cs.CL] UPDATED)
Oct. 25, 2022, 1:19 a.m. | Aman Madaan, Dheeraj Rajagopal, Niket Tandon, Yiming Yang, Antoine Bosselut
cs.CL updates on arXiv.org arxiv.org
Conditional set generation learns a mapping from an input sequence of tokens
to a set. Several NLP tasks, such as entity typing and dialogue emotion
tagging, are instances of set generation. Seq2Seq models, a popular choice for
set generation, treat a set as a sequence and do not fully leverage its key
properties, namely order-invariance and cardinality. We propose a novel
algorithm for effectively sampling informative orders over the combinatorial
space of label orders. We jointly model the set cardinality …
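The order-invariance issue the abstract raises can be made concrete with a toy sketch (not the paper's algorithm; label names are hypothetical): a seq2seq target must serialize a label set in one fixed order, yet every permutation denotes the same set, and the set's cardinality is a property a plain sequence decoder never models directly.

```python
from itertools import permutations

# Hypothetical predicted label set for one utterance in a dialogue
# emotion-tagging task (labels are illustrative, not from the paper).
labels = ["joy", "surprise"]

# A vanilla seq2seq target commits to a single ordering, but every
# permutation below serializes the same underlying set -- the model
# can be penalized for valid orderings it was never trained on.
orderings = [" ".join(p) for p in permutations(labels)]
print(orderings)     # ['joy surprise', 'surprise joy']

# Cardinality is a set-level property; a sequence decoder only learns
# it implicitly via where it emits the end-of-sequence token.
print(len(labels))   # 2
```

The paper's contribution, per the abstract, is sampling informative orders from this combinatorial space and jointly modeling cardinality, rather than fixing one arbitrary order as the target.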