Mix-Pooling Strategy for Attention Mechanism. (arXiv:2208.10322v2 [cs.LG] UPDATED)
Oct. 25, 2022, 1:13 a.m. | Shanshan Zhong, Wushao Wen, Jinghui Qin
cs.LG updates on arXiv.org
Recently, many effective attention modules have been proposed to boost model
performance by exploiting the internal information of convolutional neural
networks in computer vision. However, most previous works overlook the design
of the pooling strategy in the attention mechanism, taking global average
pooling for granted, which hinders further improvement of the attention
mechanism's performance. We empirically find and verify that a simple linear
combination of global max-pooling and global …
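The core idea, as far as the truncated abstract shows, is to replace the single global average pooling in a channel-attention module with a learned linear mix of global average- and max-pooling. Below is a minimal PyTorch sketch of that idea; the SE-style bottleneck, the module name MixPoolChannelAttention, and the single learnable mixing coefficient alpha are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn


class MixPoolChannelAttention(nn.Module):
    """SE-style channel attention whose pooled descriptor is a learned
    linear combination of global average- and global max-pooling.
    Names and hyperparameters are illustrative, not the paper's."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)  # global average pooling
        self.max_pool = nn.AdaptiveMaxPool2d(1)  # global max pooling
        # Learnable scalar mixing weight; sigmoid keeps the mix convex.
        self.alpha = nn.Parameter(torch.tensor(0.5))
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.avg_pool(x).view(b, c)
        mx = self.max_pool(x).view(b, c)
        a = torch.sigmoid(self.alpha)
        pooled = a * avg + (1.0 - a) * mx          # linear combination
        weights = self.fc(pooled).view(b, c, 1, 1)
        return x * weights                         # channel reweighting


if __name__ == "__main__":
    attn = MixPoolChannelAttention(channels=64)
    feat = torch.randn(2, 64, 32, 32)
    print(attn(feat).shape)  # torch.Size([2, 64, 32, 32])
```

Making the mixing weight learnable lets the network interpolate between average pooling (smooth, context-heavy) and max pooling (peaked, salience-heavy) per module, rather than committing to one pooling strategy up front.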