MemoNet:Memorizing Representations of All Cross Features Efficiently via Multi-Hash Codebook Network for CTR Prediction. (arXiv:2211.01334v1 [cs.IR])
Nov. 3, 2022, 1:12 a.m. | Pengtao Zhang, Junlin Zhang
cs.LG updates on arXiv.org arxiv.org
New findings in natural language processing (NLP) demonstrate that strong memorization capability contributes significantly to the success of large language models. This inspires us to explicitly bring an independent memory mechanism into the CTR ranking model to learn and memorize the representations of all cross features. In this paper, we propose the multi-Hash Codebook NETwork (HCNet) as the memory mechanism for efficiently learning and memorizing representations of all cross features in CTR tasks. HCNet uses a multi-hash codebook as the main memory place, and the whole memory procedure consists …