Multi-scale Attention Network for Single Image Super-Resolution. (arXiv:2209.14145v2 [eess.IV] UPDATED)
Sept. 30, 2022, 1:16 a.m. | Yan Wang, Yusen Li, Gang Wang, Xiaoguang Liu
cs.CV updates on arXiv.org arxiv.org
By exploiting large kernel decomposition and attention mechanisms,
convolutional neural networks (CNNs) can compete with transformer-based methods
in many high-level computer vision tasks. However, owing to their advantage in
long-range modeling, transformers with self-attention still dominate low-level
vision, including the super-resolution task. In this paper, we propose a
CNN-based multi-scale attention network (MAN), which consists of multi-scale
large kernel attention (MLKA) and a gated spatial attention unit (GSAU), to
improve the performance of convolutional SR networks. Within our …
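The gated spatial attention idea named in the abstract can be illustrated with a minimal NumPy sketch: a spatial map is produced by a small depthwise convolution, squashed through a sigmoid, and used to modulate the input features elementwise. This is a hedged illustration of the general gating pattern, not the paper's exact GSAU design; the `depthwise_conv3x3` helper, the kernel shape, and the sigmoid gate are all assumptions for demonstration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def depthwise_conv3x3(x, kernel):
    """Per-channel 3x3 convolution with zero padding and stride 1.

    x:      feature map of shape (C, H, W)
    kernel: one 3x3 filter per channel, shape (C, 3, 3)
    """
    C, H, W = x.shape
    padded = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros_like(x)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                out[c, i, j] = np.sum(padded[c, i:i + 3, j:j + 3] * kernel[c])
    return out

def gated_spatial_attention(x, kernel):
    """Sketch of a gated spatial attention unit (hypothetical form):
    a sigmoid-bounded spatial gate rescales the input elementwise."""
    gate = sigmoid(depthwise_conv3x3(x, kernel))  # values in (0, 1)
    return x * gate

# Usage: the gate preserves the feature-map shape and only attenuates values.
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 8, 8))
filt = 0.1 * rng.standard_normal((4, 3, 3))
gated = gated_spatial_attention(feats, filt)
```

Because the gate is bounded in (0, 1), the output keeps the input's shape while selectively suppressing spatial positions, which is the basic behavior such a unit contributes inside an SR network.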