Feb. 6, 2024, 5:53 a.m. | Gourab Dey, Adithya V Ganesan, Yash Kumar Lal, Manal Shah, Shreyashee Sinha, Matthew Matero, Salvatore Giorg

cs.CL updates on arXiv.org

Social science NLP tasks, such as emotion or humor detection, require capturing both the semantics and the implicit pragmatics of text, often with limited amounts of training data. Instruction tuning has been shown to improve many capabilities of large language models (LLMs), such as commonsense reasoning, reading comprehension, and computer programming. However, little is known about the effectiveness of instruction tuning in the social domain, where implicit pragmatic cues often need to be captured. We explore …
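As a concrete illustration of what instruction tuning involves, labeled examples are typically rendered into a single prompt-plus-response string before fine-tuning. The Alpaca-style template and the example fields below are assumptions for illustration, not the paper's actual setup:

```python
# Minimal sketch of formatting a social-science example for
# instruction tuning. The template and field names ("instruction",
# "input", "output") are hypothetical, Alpaca-style conventions.

TEMPLATE = (
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n{output}"
)

def format_example(example: dict) -> str:
    """Render one labeled example into a single training string."""
    return TEMPLATE.format(**example)

example = {
    "instruction": "Classify the emotion expressed in the text.",
    "input": "I can't believe we finally won the game!",
    "output": "joy",
}

print(format_example(example))
```

Strings like this are then fed to a standard supervised fine-tuning loop, so the model learns to produce the response segment given the instruction and input.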

