Using Context-to-Vector with Graph Retrofitting to Improve Word Embeddings. (arXiv:2210.16848v1 [cs.CL])
Nov. 1, 2022, 1:16 a.m. | Jiangbin Zheng, Yile Wang, Ge Wang, Jun Xia, Yufei Huang, Guojiang Zhao, Yue Zhang, Stan Z. Li
cs.CL updates on arXiv.org
Although contextualized embeddings generated from large-scale pre-trained
models perform well in many tasks, traditional static embeddings (e.g.,
Skip-gram, Word2Vec) still play an important role in low-resource and
lightweight settings due to their low computational cost, ease of deployment,
and stability. In this paper, we aim to improve word embeddings by 1)
incorporating more contextual information from existing pre-trained models into
the Skip-gram framework, which we call Context-to-Vec; 2) proposing a
post-processing retrofitting method for static embeddings independent of
training by …
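The retrofitting idea the abstract refers to — adjusting already-trained static embeddings after the fact, using a graph of related words — can be sketched in a few lines. The function below follows the generic retrofitting recipe (iteratively pulling each word's vector toward its graph neighbours while keeping it close to the original); the paper's own graph-retrofitting method and its `alpha`/`beta` weighting are assumptions here, not taken from the abstract.

```python
import numpy as np

def retrofit(embeddings, graph, alpha=1.0, beta=1.0, iters=10):
    """Post-hoc retrofitting sketch: each word's vector is nudged toward the
    mean of its neighbours in a word graph, balanced against staying close
    to its original embedding. No retraining of the base model is needed.

    embeddings: dict word -> np.ndarray (the pre-trained static vectors)
    graph:      dict word -> list of neighbour words (e.g. synonym edges)
    alpha:      weight on the original vector
    beta:       weight on each neighbour vector
    """
    orig = {w: v.copy() for w, v in embeddings.items()}
    new = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iters):
        for w, neighbours in graph.items():
            nbrs = [new[n] for n in neighbours if n in new]
            if not nbrs:
                continue  # isolated words keep their original vector
            # Weighted average of the original vector and current neighbours.
            new[w] = (alpha * orig[w] + beta * sum(nbrs)) / (alpha + beta * len(nbrs))
    return new
```

Because the update only reads the original vectors and the graph, it runs independently of (and after) embedding training, which is the "independent of training" property the abstract highlights: words linked in the graph end up closer together without touching the base model.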