all AI news
Edge-enhanced Feature Distillation Network for Efficient Super-Resolution. (arXiv:2204.08759v2 [cs.CV] UPDATED)
Web: http://arxiv.org/abs/2204.08759
June 23, 2022, 1:13 a.m. | Yan Wang
cs.CV updates on arXiv.org
With the recent rapid development of convolutional neural networks,
numerous lightweight CNN-based image super-resolution methods have been
proposed for practical deployment on edge devices. However, most existing
methods focus on a single aspect, such as network or loss design, which makes
it difficult to minimize model size. To address this issue, we jointly consider
block design, architecture search, and loss design to obtain a more
efficient SR structure. In this paper, we propose an edge-enhanced feature
distillation network, named EFDN, to …
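The excerpt above does not describe the EFDN block itself, so as a rough illustration, here is a minimal PyTorch sketch of a generic feature-distillation block in the style of earlier lightweight SR networks (e.g., IMDN/RFDN) that this line of work builds on. The channel counts, layer choices, and the `FeatureDistillationBlock` name are assumptions for illustration, not the authors' actual design.

```python
# Illustrative sketch only: a generic feature-distillation block for lightweight SR.
# At each step a cheap 1x1 conv "distills" a slice of features while a 3x3 conv
# refines the remainder; all distilled slices are concatenated and fused at the end.
import torch
import torch.nn as nn


class FeatureDistillationBlock(nn.Module):
    def __init__(self, channels: int = 48, distill_ratio: float = 0.5):
        super().__init__()
        self.dc = int(channels * distill_ratio)  # distilled channels kept at each step
        self.distill1 = nn.Conv2d(channels, self.dc, 1)
        self.refine1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.distill2 = nn.Conv2d(channels, self.dc, 1)
        self.refine2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.distill3 = nn.Conv2d(channels, self.dc, 3, padding=1)
        self.fuse = nn.Conv2d(self.dc * 3, channels, 1)  # aggregate distilled features
        self.act = nn.LeakyReLU(0.05, inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        d1 = self.act(self.distill1(x))
        r1 = self.act(self.refine1(x))
        d2 = self.act(self.distill2(r1))
        r2 = self.act(self.refine2(r1))
        d3 = self.act(self.distill3(r2))
        out = self.fuse(torch.cat([d1, d2, d3], dim=1))
        return out + x  # residual connection keeps the block easy to stack


if __name__ == "__main__":
    block = FeatureDistillationBlock(channels=48)
    y = block(torch.randn(1, 48, 32, 32))
    print(y.shape)  # torch.Size([1, 48, 32, 32])
```

Blocks like this are typically stacked a handful of times and followed by a pixel-shuffle upsampler; the paper's edge-enhancement and architecture-search components are not reflected in this sketch.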
Latest AI/ML/Big Data Jobs
Machine Learning Researcher - Saalfeld Lab
@ Howard Hughes Medical Institute - Chevy Chase, MD | Ashburn, Virginia
Project Director, Machine Learning in US Health
@ ideas42.org | Remote, US
Data Science Intern
@ NannyML | Remote
Machine Learning Engineer NLP/Speech
@ Play.ht | Remote
Research Scientist, 3D Reconstruction
@ Yembo | Remote, US
Clinical Assistant or Associate Professor of Management Science and Systems
@ University at Buffalo | Buffalo, NY