Text Understanding and Generation Using Transformer Models for Intelligent E-commerce Recommendations
Feb. 27, 2024, 5:49 a.m. | Yafei Xiang, Hanyi Yu, Yulu Gong, Shuning Huo, Mengran Zhu
cs.CL updates on arXiv.org
Abstract: With the rapid development of artificial intelligence technology, pre-trained Transformer models have become an important tool for large language model (LLM) tasks. In the field of e-commerce, these models are especially widely used, from text understanding to recommendation generation, providing powerful technical support for improving user experience and optimizing service processes. This paper reviews the core application scenarios of pre-trained Transformer models in e-commerce text understanding and recommendation generation, including but not …
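The recommendation-generation workflow the abstract describes typically encodes product text into vectors and ranks items by similarity to a query. A minimal, hedged sketch of that idea is below; the `embed` function is a hashed bag-of-words placeholder standing in for a real Transformer encoder (e.g., a BERT sentence embedding), so the example runs without model weights. All names and the toy catalog are illustrative assumptions, not from the paper.

```python
import math


def embed(text, dim=64):
    # Placeholder for a Transformer text encoder: a hashed bag-of-words
    # vector. In practice this would be e.g. a pooled BERT embedding.
    vec = [0.0] * dim
    for tok in text.lower().split():
        vec[hash(tok) % dim] += 1.0
    return vec


def cosine(a, b):
    # Cosine similarity between two dense vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


def recommend(query, catalog, k=2):
    # Rank catalog items by embedding similarity to the query text.
    q = embed(query)
    ranked = sorted(catalog, key=lambda item: cosine(q, embed(item)), reverse=True)
    return ranked[:k]


catalog = [
    "wireless bluetooth headphones with noise cancelling",
    "stainless steel kitchen knife set",
    "bluetooth speaker waterproof portable",
    "cotton t-shirt plain white",
]

top = recommend("noise cancelling bluetooth headphones", catalog, k=2)
```

Swapping `embed` for a genuine pre-trained Transformer encoder leaves the retrieval logic unchanged, which is why this two-stage embed-then-rank design is common in e-commerce recommendation pipelines.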