SIRe-Networks: Convolutional Neural Networks Architectural Extension for Information Preservation via Skip/Residual Connections and Interlaced Auto-Encoders. (arXiv:2110.02776v2 [cs.CV] UPDATED)
cs.CV updates on arXiv.org
Improving existing neural network architectures can involve several design
choices such as manipulating the loss functions, employing a diverse learning
strategy, exploiting gradient evolution at training time, optimizing the
network hyper-parameters, or increasing the architecture depth. The latter
approach is a straightforward solution, since it directly enhances the
representation capabilities of a network; however, the increased depth
generally incurs the well-known vanishing gradient problem. In this paper,
borrowing from different methods addressing this issue, we introduce an
interlaced multi-task …
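The trade-off the abstract describes — identity shortcuts let gradients reach early layers that a plain deep stack would starve — can be illustrated with a minimal residual block. This is a generic sketch of a skip/residual connection, not the SIRe-Networks architecture itself (the abstract is truncated before those details); the function and weight names are placeholders.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    """Generic residual block computing relu(x + F(x)).

    The identity shortcut (adding x back to F(x)) gives gradients a
    direct path to earlier layers, which is the standard mitigation
    for the vanishing-gradient problem mentioned in the abstract.
    Illustrative only; not the paper's interlaced architecture.
    """
    h = relu(x @ w1)      # first transformation
    f = h @ w2            # second transformation, no activation yet
    return relu(x + f)    # skip connection: add the input back

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))          # batch of 4, feature width 8
w1 = rng.standard_normal((8, 8)) * 0.1   # small init keeps F(x) a perturbation
w2 = rng.standard_normal((8, 8)) * 0.1
y = residual_block(x, w1, w2)
```

Note the output keeps the input's shape, which is what allows such blocks to be stacked to arbitrary depth without reshaping.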