Do deep neural networks utilize the weight space efficiently? (arXiv:2401.16438v1 [cs.LG])
cs.LG updates on arXiv.org
Deep learning models like Transformers and Convolutional Neural Networks
(CNNs) have revolutionized various domains, but their parameter-intensive
nature hampers deployment in resource-constrained settings. In this paper, we
introduce a novel concept that utilizes the column space and row space of weight
matrices, which allows for a substantial reduction in model parameters without
compromising performance. Leveraging this paradigm, we achieve
parameter-efficient deep learning models. Our approach applies to both
Bottleneck and Attention layers, effectively halving the parameters while
incurring only minor performance degradation. …
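The abstract does not spell out the mechanism, but one plausible reading of "utilizing the column space and row space" is weight sharing: using a single matrix W for the down-projection and its transpose W.T for the up-projection in a bottleneck, which halves that layer's parameters versus two separate matrices. A minimal numpy sketch of that hypothetical scheme (not the paper's confirmed method; `bottleneck_shared` and its shapes are illustrative assumptions):

```python
import numpy as np

def bottleneck_shared(x, W):
    # Hypothetical shared-weight bottleneck: project into W's row space,
    # then reconstruct via its column space using the same matrix.
    # Sharing W for both directions halves parameters relative to
    # distinct down- and up-projection matrices.
    h = x @ W          # (n, d) @ (d, k) -> (n, k): down-projection
    return h @ W.T     # (n, k) @ (k, d) -> (n, d): up-projection

rng = np.random.default_rng(0)
d, k = 64, 16                                # illustrative dimensions
W = rng.standard_normal((d, k)) / np.sqrt(d)
x = rng.standard_normal((4, d))

y = bottleneck_shared(x, W)
shared_params = W.size        # one matrix: d * k = 1024
separate_params = 2 * d * k   # two distinct matrices: 2048
print(y.shape, shared_params, separate_params)
```

With these shapes the shared variant stores 1024 parameters where independent down/up matrices would store 2048, matching the roughly 2x reduction the abstract claims for Bottleneck layers; the same transpose-sharing idea could analogously pair query/key or up/down matrices in Attention blocks.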