Emotion-Conditioned Text Generation through Automatic Prompt Optimization. (arXiv:2308.04857v1 [cs.CL])
cs.CL updates on arXiv.org
Conditional natural language generation methods often require either
expensive fine-tuning or training a large language model from scratch. Both are
unlikely to lead to good results without a substantial amount of data and
computational resources. Prompt learning without changing the parameters of a
large language model presents a promising alternative: it is cost-effective
while still achieving competitive results. While this procedure is
now established for zero- and few-shot text classification and structured
prediction, it has received limited attention …
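The core idea the abstract describes, conditioning generation purely through the prompt while the model's parameters stay frozen, can be illustrated with a minimal sketch. This is not the paper's automatic prompt-optimization procedure; `call_llm` is a hypothetical stand-in for any frozen LLM API, and the template is an assumed example.

```python
def call_llm(prompt: str) -> str:
    # Placeholder for a real model call (e.g. an API request to a frozen
    # LLM). It echoes the prompt so the example is self-contained.
    return f"[model output for: {prompt!r}]"

def build_prompt(emotion: str, topic: str) -> str:
    # The emotion condition is injected purely through the prompt text;
    # no model parameters are updated.
    return (f"Write one sentence about {topic} "
            f"that expresses the emotion '{emotion}'.")

def generate(emotion: str, topic: str) -> str:
    return call_llm(build_prompt(emotion, topic))

print(generate("joy", "the weather"))
```

In the paper's setting, the prompt template itself would be searched or optimized automatically against an emotion objective, rather than written by hand as above.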