Nov. 14, 2022, 2:14 a.m. | Chengpeng Chen, Zichao Guo, Haien Zeng, Pengfei Xiong, Jian Dong

cs.CV updates on arXiv.org

Feature reuse has been a key technique in lightweight convolutional neural
network (CNN) design. Current methods usually employ a concatenation operator
to cheaply maintain large channel numbers (and thus large network capacity) by
reusing feature maps from other layers. Although concatenation is free of
parameters and FLOPs, its computational cost on hardware devices is
non-negligible. To address this, the paper offers a new perspective on feature
reuse: realizing it via the structural re-parameterization technique. A novel
hardware-efficient RepGhost module is proposed for implicit feature reuse …
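To make the contrast concrete, below is a minimal sketch of the general structural re-parameterization idea the abstract refers to: a branching structure used at training time is folded into a single convolution at inference time, so the "reused" features cost nothing extra on hardware. This is not the authors' RepGhost code; the module name `ReparamBlock`, the choice of a 3x3 conv plus identity branch, and the fusion rule are illustrative assumptions.

```python
# A minimal sketch of structural re-parameterization (assumed illustration,
# not the RepGhost implementation from the paper).
import torch
import torch.nn as nn


class ReparamBlock(nn.Module):
    """Training time: 3x3 conv branch + identity branch, merged by addition.
    Inference time: a single 3x3 conv with the two branches fused."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1, bias=True)
        self.fused = None  # populated by reparameterize()

    def forward(self, x):
        if self.fused is not None:
            return self.fused(x)
        # Addition plays the role of feature reuse here: the input feature
        # map is reused alongside the convolved one, with no concatenation.
        return self.conv(x) + x

    @torch.no_grad()
    def reparameterize(self):
        # Fold the identity branch into the conv kernel: identity acts like a
        # 3x3 kernel that is 1 at the center of its own channel, 0 elsewhere.
        c = self.conv.out_channels
        identity_kernel = torch.zeros_like(self.conv.weight)
        for i in range(c):
            identity_kernel[i, i, 1, 1] = 1.0
        fused = nn.Conv2d(c, c, 3, padding=1, bias=True)
        fused.weight.copy_(self.conv.weight + identity_kernel)
        fused.bias.copy_(self.conv.bias)
        self.fused = fused


if __name__ == "__main__":
    x = torch.randn(1, 8, 16, 16)
    block = ReparamBlock(8)
    y_train = block(x)
    block.reparameterize()
    y_infer = block(x)
    # Outputs match: the branched and fused forms are mathematically equal.
    print(torch.allclose(y_train, y_infer, atol=1e-6))
```

The point of the sketch is only that the branched structure exists on paper, not on the device: after fusion, inference runs a single convolution, which is the kind of hardware efficiency the RepGhost module targets.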

arxiv ghost hardware
