Aug. 29, 2022, 1:13 a.m. | Guanming Xiong, Junwei Bao, Wen Zhao, Youzheng Wu, Xiaodong He

cs.CL updates on arXiv.org

This study investigates the task of knowledge-based question generation
(KBQG). Conventional KBQG work generates questions from fact triples in the
knowledge graph, which cannot express complex operations such as aggregation
and comparison in SPARQL. Moreover, because annotating large-scale
SPARQL-question pairs is costly, KBQG from SPARQL under low-resource scenarios
urgently needs to be explored. Recently, generative pre-trained language
models (PLMs), typically trained in a natural language (NL)-to-NL paradigm,
have proven effective for low-resource generation, e.g., T5 and …
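To illustrate the kind of operation a single fact triple cannot express, consider a hypothetical aggregation query; the `ex:` namespace, entity, and predicate names here are illustrative assumptions, not examples from the paper:

```sparql
PREFIX ex: <http://example.org/>

# "How many books did Alice write?" requires COUNT over a variable,
# which no single (subject, predicate, object) fact triple can encode.
SELECT (COUNT(?book) AS ?numBooks)
WHERE { ex:Alice ex:wrote ?book }
```

A triple-based generator could only verbalize individual facts such as (Alice, wrote, Book1), whereas SPARQL-based KBQG can target questions involving counting or comparison like the one above.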

