Poolingformer: Long Document Modeling with Pooling Attention. (arXiv:2105.04371v2 [cs.CL] UPDATED)
Oct. 25, 2022, 1:18 a.m. | Hang Zhang, Yeyun Gong, Yelong Shen, Weisheng Li, Jiancheng Lv, Nan Duan, Weizhu Chen
cs.CL updates on arXiv.org
In this paper, we introduce Poolingformer, a two-level attention schema for
long document modeling. Its first level uses a smaller sliding-window pattern
to aggregate information from neighboring tokens. Its second level employs a
larger window with pooling attention, enlarging the receptive field while
reducing both computational cost and memory consumption. We first evaluate
Poolingformer on two long-sequence QA tasks: the monolingual NQ and the
multilingual TyDi QA. Experimental results show that Poolingformer sits atop
three official leaderboards measured by …
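
The abstract describes the two-level design only at a high level. The sketch below illustrates the general idea in NumPy: level one attends over a small window of raw neighbors, and level two attends over a larger window whose keys and values are first compressed by pooling. Single-head attention, block mean pooling, the window sizes, and the simple sum of the two levels' outputs are all illustrative assumptions here, not the authors' implementation.

```python
# Minimal single-head sketch of a two-level attention scheme in the spirit of
# Poolingformer. All names, sizes, and the mean-pooling choice are assumptions.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attend(q, k, v):
    # q: (d,), k/v: (m, d) -> scaled dot-product attention, weighted sum of v.
    scores = k @ q / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def two_level_attention(x, small_win=4, large_win=16, pool=4):
    """x: (seq_len, d). For each position i:
    level 1: attention over raw tokens within +/- small_win;
    level 2: tokens within +/- large_win are mean-pooled in blocks of `pool`,
             then attended to, shrinking that level's cost by a factor of
             `pool`. The two levels are summed here as a simplification.
    """
    n, d = x.shape
    out = np.zeros_like(x)
    for i in range(n):
        # Level 1: sliding-window attention over nearby tokens.
        lo, hi = max(0, i - small_win), min(n, i + small_win + 1)
        level1 = attend(x[i], x[lo:hi], x[lo:hi])

        # Level 2: pooled attention over a larger receptive field.
        lo2, hi2 = max(0, i - large_win), min(n, i + large_win + 1)
        window = x[lo2:hi2]
        m = (len(window) // pool) * pool
        pooled = window[:m].reshape(-1, pool, d).mean(axis=1)  # block mean pool
        level2 = attend(x[i], pooled, pooled) if len(pooled) else 0.0

        out[i] = level1 + level2
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    tokens = rng.standard_normal((64, 32))
    print(two_level_attention(tokens).shape)  # (64, 32)
```

Because only pooled summaries of the larger window enter the second-level attention, its cost grows with the pooled length rather than the raw window length, which is how the schema trades a small amount of resolution for reduced computation and memory.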