Feb. 8, 2024, 5:46 a.m. | Julio C. Rangel, Tarcisio Mendes de Farias, Ana Claudia Sima, Norio Kobayashi

cs.CL updates on arXiv.org

The recent success of Large Language Models (LLMs) in a wide range of Natural Language Processing applications opens the path towards novel Question Answering Systems over Knowledge Graphs that leverage LLMs. However, one of the main obstacles preventing their implementation is the scarcity of training data for the task of translating questions into corresponding SPARQL queries, particularly in the case of domain-specific KGs. To overcome this challenge, in this study, we evaluate several strategies for fine-tuning the OpenLlama LLM for question …
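As a rough illustration of the kind of fine-tuning the abstract describes, the sketch below fine-tunes an OpenLLaMA-style causal LM on question-to-SPARQL pairs with Hugging Face `transformers`. The checkpoint name, prompt template, toy example pair, and hyperparameters are assumptions for illustration only, not the authors' actual setup.

```python
# Minimal sketch: fine-tune a causal LM to translate questions into SPARQL.
# Checkpoint name, prompt format, and data are illustrative assumptions.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

model_name = "openlm-research/open_llama_3b"  # assumed OpenLLaMA checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# LLaMA-family tokenizers often lack a pad token; reuse EOS for padding.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# Toy question/SPARQL pair standing in for a domain-specific KG training set.
pairs = [
    {"question": "Which genes are expressed in the liver?",
     "sparql": "SELECT ?gene WHERE { ?gene :expressedIn :Liver }"},
]

def to_text(example):
    # Single-sequence causal-LM formulation: question followed by target query.
    return {"text": f"### Question:\n{example['question']}\n"
                    f"### SPARQL:\n{example['sparql']}{tokenizer.eos_token}"}

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

ds = (Dataset.from_list(pairs)
      .map(to_text)
      .map(tokenize, batched=True,
           remove_columns=["question", "sparql", "text"]))

collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(output_dir="qa2sparql-openllama",
                         per_device_train_batch_size=1,
                         num_train_epochs=1,
                         learning_rate=2e-5)

Trainer(model=model, args=args,
        train_dataset=ds, data_collator=collator).train()
```

At inference time, the fine-tuned model would be prompted with the same "### Question: … ### SPARQL:" template and the generated continuation taken as the candidate query; the paper evaluates several such fine-tuning strategies rather than this single recipe.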

