MemoNet: Memorizing Representations of All Cross Features Efficiently via Multi-Hash Codebook Network for CTR Prediction. (arXiv:2211.01334v2 [cs.IR] UPDATED)
Nov. 4, 2022, 1:13 a.m. | Pengtao Zhang, Junlin Zhang
cs.LG updates on arXiv.org arxiv.org
New findings in natural language processing (NLP) demonstrate that strong memorization capability contributes substantially to the success of large language models. This inspires us to explicitly bring an independent memory mechanism into the CTR ranking model to learn and memorize the representations of all cross features. In this paper, we propose the multi-Hash Codebook NETwork (HCNet) as the memory mechanism for efficiently learning and memorizing representations of all cross features in CTR tasks. HCNet uses a multi-hash codebook as the main memory place, and the whole memory procedure consists …
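To make the idea concrete, here is a minimal sketch of a multi-hash codebook lookup for a cross feature. All names and sizes (`CODEBOOK_SIZE`, `NUM_HASHES`, `EMB_DIM`, the mean pooling) are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

# Illustrative sketch: several hash functions address a shared codebook,
# and the retrieved codewords are pooled into one cross-feature vector.
rng = np.random.default_rng(0)

CODEBOOK_SIZE = 1000   # number of codeword slots (assumed size)
NUM_HASHES = 3         # independent hash functions per cross feature
EMB_DIM = 8            # codeword embedding dimension

codebook = rng.normal(size=(CODEBOOK_SIZE, EMB_DIM))

def hash_indices(feat_a: str, feat_b: str) -> list:
    """Map a cross feature (a pair of raw feature values) to NUM_HASHES slots."""
    key = feat_a + "&" + feat_b
    return [hash((key, seed)) % CODEBOOK_SIZE for seed in range(NUM_HASHES)]

def cross_feature_embedding(feat_a: str, feat_b: str) -> np.ndarray:
    """Fetch the codewords addressed by the multi-hash indices and pool them."""
    idx = hash_indices(feat_a, feat_b)
    return codebook[idx].mean(axis=0)

emb = cross_feature_embedding("user_123", "item_456")
```

Using several hashes per cross feature reduces the collision problem of a single hash table: two cross features that collide under one hash are unlikely to collide under all of them, so each still receives a near-unique pooled representation.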
Jobs in AI, ML, Big Data
Data Architect
@ University of Texas at Austin | Austin, TX
Data ETL Engineer
@ University of Texas at Austin | Austin, TX
Lead GNSS Data Scientist
@ Lurra Systems | Melbourne
Senior Machine Learning Engineer (MLOps)
@ Promaton | Remote, Europe
Software Engineer, Data Tools - Full Stack
@ DoorDash | Pune, India
Senior Data Analyst
@ Artsy | New York City