April 5, 2024, 4:43 a.m. | Meiling Tao, Xuechen Liang, Tianyu Shi, Lei Yu, Yiting Xie

cs.LG updates on arXiv.org

arXiv:2401.09432v2 Announce Type: replace-cross
Abstract: This study presents RoleCraft-GLM, an innovative framework aimed at enhancing personalized role-playing with Large Language Models (LLMs). RoleCraft-GLM addresses the key issue of lacking personalized interactions in conversational AI, and offers a solution with detailed and emotionally nuanced character portrayals. We contribute a unique conversational dataset that shifts from conventional celebrity-centric characters to diverse, non-celebrity personas, thus enhancing the realism and complexity of language modeling interactions. Additionally, our approach includes meticulous character development, ensuring dialogues …
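The abstract describes persona-grounded, emotionally annotated dialogues but does not show the data format or prompting procedure. Below is a minimal, hypothetical Python sketch of how such a non-celebrity character profile, an emotion-tagged dialogue turn, and a persona-conditioned prompt might be structured; CharacterProfile, DialogueTurn, build_roleplay_prompt, and the example persona are illustrative assumptions, not the authors' released code or dataset schema.

from dataclasses import dataclass, field

@dataclass
class CharacterProfile:
    """Illustrative persona record for a non-celebrity character (assumed schema)."""
    name: str
    background: str                      # everyday-life details rather than celebrity facts
    personality_traits: list[str] = field(default_factory=list)
    emotional_style: str = "neutral"     # e.g. "warm", "anxious", "wry"

@dataclass
class DialogueTurn:
    """One utterance in a persona-grounded conversation, with an emotion label."""
    speaker: str
    text: str
    emotion: str = "neutral"

def build_roleplay_prompt(profile: CharacterProfile,
                          history: list[DialogueTurn],
                          user_msg: str) -> str:
    """Assemble a role-play prompt that conditions an LLM on the character profile
    and the emotionally annotated dialogue history (a simple assumed format)."""
    header = (
        f"You are {profile.name}. Background: {profile.background}. "
        f"Traits: {', '.join(profile.personality_traits)}. "
        f"Speak with a {profile.emotional_style} emotional style."
    )
    turns = "\n".join(f"{t.speaker} [{t.emotion}]: {t.text}" for t in history)
    return f"{header}\n{turns}\nUser: {user_msg}\n{profile.name}:"

# Example usage with a fictional, non-celebrity persona
profile = CharacterProfile(
    name="Lin",
    background="a night-shift nurse who writes poetry on her commute",
    personality_traits=["patient", "wry", "observant"],
    emotional_style="gently wry",
)
history = [
    DialogueTurn(speaker="User", text="Long day?", emotion="curious"),
    DialogueTurn(speaker="Lin", text="Twelve hours, but the sunrise was worth it.", emotion="warm"),
]
print(build_roleplay_prompt(profile, history, "What did you write about today?"))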

