Web: http://arxiv.org/abs/2204.08759

June 23, 2022, 1:13 a.m. | Yan Wang

cs.CV updates on arXiv.org

With the recent rapid development of convolutional neural networks, numerous
lightweight CNN-based image super-resolution methods have been proposed for
practical deployment on edge devices. However, most existing methods focus on
one specific aspect, network or loss design, which makes it difficult to
minimize the model size. To address this issue, we combine block design,
architecture search, and loss design to obtain a more efficient SR structure.
In this paper, we propose an edge-enhanced feature distillation network, named
EFDN, to …
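The truncated abstract names an edge-enhanced feature distillation design for lightweight SR but does not spell it out. As a rough illustration only, here is a minimal PyTorch sketch of what such a block could look like; the RFDN-style channel distillation, the layer widths, and the fixed Laplacian edge branch are assumptions made for this sketch, not the paper's actual EFDN architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class EdgeEnhancedDistillationBlock(nn.Module):
    """Hypothetical feature distillation block with an edge branch.

    The channel split, depth, and fixed Laplacian edge filter are
    illustrative assumptions, not the EFDN definition from the paper.
    """

    def __init__(self, channels: int = 48, distilled: int = 24):
        super().__init__()
        # Each stage keeps a "distilled" slice via a 1x1 conv and refines
        # the full feature map with a 3x3 conv for the next stage.
        self.d1 = nn.Conv2d(channels, distilled, 1)
        self.r1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.d2 = nn.Conv2d(channels, distilled, 1)
        self.r2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.d3 = nn.Conv2d(channels, distilled, 1)
        self.r3 = nn.Conv2d(channels, channels, 3, padding=1)
        self.fuse = nn.Conv2d(3 * distilled, channels, 1)
        self.act = nn.LeakyReLU(0.05, inplace=True)

        # Fixed Laplacian kernel, applied per channel to emphasize edges.
        lap = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]])
        self.register_buffer("edge_kernel",
                             lap.expand(channels, 1, 3, 3).clone())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        d1 = self.act(self.d1(x))            # distilled features, stage 1
        f1 = self.act(self.r1(x))            # refined features for stage 2
        d2 = self.act(self.d2(f1))
        f2 = self.act(self.r2(f1))
        d3 = self.act(self.d3(f2))
        fused = self.fuse(torch.cat([d1, d2, d3], dim=1))
        # Edge-enhanced residual: depthwise Laplacian response of the input.
        edges = F.conv2d(x, self.edge_kernel, padding=1, groups=x.shape[1])
        return x + fused + edges


if __name__ == "__main__":
    block = EdgeEnhancedDistillationBlock(48)
    out = block(torch.randn(1, 48, 32, 32))
    print(out.shape)  # torch.Size([1, 48, 32, 32])
```

The distillation split keeps the per-block parameter count low (1x1 convs on narrow slices), which is the usual motivation for this family of lightweight SR blocks; the edge branch here simply adds a cheap high-frequency prior to the residual.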

arxiv cv distillation edge feature network
