PLOG: Table-to-Logic Pretraining for Logical Table-to-Text Generation. (arXiv:2205.12697v1 [cs.CL])
May 26, 2022, 1:12 a.m. | Ao Liu, Haoyu Dong, Naoaki Okazaki, Shi Han, Dongmei Zhang
cs.CL updates on arXiv.org arxiv.org
Logical table-to-text generation is the task of generating logically
faithful sentences from tables, which requires models to derive logical-level
facts from table records via logical inference. This raises a new challenge
for the logical-level content planning of table-to-text models. However,
directly learning logical inference knowledge from table-text pairs is very
difficult for neural models because of the ambiguity of natural language and
the scarcity of parallel data. Hence, even large-scale pre-trained language
models exhibit low logical fidelity …
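The "logical-level facts" the abstract refers to (e.g. superlatives and counts inferred across rows, rather than values copied from a single cell) can be illustrated with a minimal sketch. The table, fact types, and helper functions below are hypothetical examples, not from the paper:

```python
# Illustrative sketch: deriving logical-level facts from table records.
# A surface-level model copies cell values; logical table-to-text requires
# inference over multiple rows, as in the two fact types below.

table = [
    {"player": "Alice", "points": 30},
    {"player": "Bob", "points": 22},
    {"player": "Cara", "points": 30},
]

def superlative_fact(rows, value_key, name_key):
    # Superlative: compare all rows to find the maximum.
    best = max(rows, key=lambda r: r[value_key])
    return f"{best[name_key]} scored the most {value_key} ({best[value_key]})."

def count_fact(rows, value_key, value):
    # Count: aggregate how many rows satisfy a condition.
    n = sum(1 for r in rows if r[value_key] == value)
    return f"{n} players scored {value} {value_key}."

print(superlative_fact(table, "points", "player"))  # Alice scored the most points (30).
print(count_fact(table, "points", 30))              # 2 players scored 30 points.
```

Neither sentence appears verbatim in any cell, which is why, as the abstract notes, such facts are hard to learn directly from table-text pairs.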