Transformer-based Program Synthesis for Low-Data Environments. (arXiv:2205.09246v1 [cs.PL])
May 20, 2022, 1:11 a.m. | Jack Roper
cs.LG updates on arXiv.org arxiv.org
Recent advances in large pre-trained transformer models (GPT2/3, T5) have found use in program synthesis to generate programs that satisfy a set of input/output examples. However, these models perform poorly on long-horizon and low-data tasks, and often fail to capture the semantics of the languages they generate. We investigate an approach that tackles both of these issues by using attributed context-free grammars of programming languages to generate programs, and then analyzing generated programs so that they can be annotated with …
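The grammar-based generation step the abstract describes is easy to illustrate. Below is a minimal Python sketch that samples syntactically valid programs from a toy expression grammar. It is a simplification in two ways: it uses a plain context-free grammar rather than the attributed grammar the paper names, and the grammar itself is a made-up example, so read it as a sketch of the idea rather than the authors' method.

import random

# Toy context-free grammar for a tiny expression language (hypothetical,
# not the grammar used in the paper). Keys are nonterminals; each maps
# to a list of productions over nonterminals and terminal tokens.
GRAMMAR = {
    "<expr>": [["<expr>", "+", "<term>"], ["<term>"]],
    "<term>": [["<term>", "*", "<factor>"], ["<factor>"]],
    "<factor>": [["(", "<expr>", ")"], ["<num>"], ["x"]],
    "<num>": [["0"], ["1"], ["2"]],
}

def sample(symbol="<expr>", depth=0, max_depth=6):
    """Expand a nonterminal into a list of terminal tokens.

    Past max_depth, always take the shortest production so the
    recursion is guaranteed to terminate."""
    if symbol not in GRAMMAR:
        return [symbol]  # terminal token
    if depth >= max_depth:
        production = min(GRAMMAR[symbol], key=len)
    else:
        production = random.choice(GRAMMAR[symbol])
    tokens = []
    for sym in production:
        tokens.extend(sample(sym, depth + 1, max_depth))
    return tokens

if __name__ == "__main__":
    random.seed(0)
    for _ in range(3):
        program = " ".join(sample())
        # Grammar-derived programs are well-formed by construction, so
        # they can be executed to collect a (very crude) semantic
        # annotation: here, the expression's value at x = 2.
        print(program, "=", eval(program, {"x": 2}))

Every string derived this way is syntactically valid, which is the property the abstract leans on: a transformer decoding tokens freely can emit ill-formed programs, while grammar-guided generation cannot, and the resulting programs can then be analyzed and annotated with semantic information.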