Jan. 31, 2024, 4:45 p.m. | Onur Can Koyun, Behçet Uğur Töreyin

cs.LG updates on arXiv.org

Deep learning models like Transformers and Convolutional Neural Networks
(CNNs) have revolutionized various domains, but their parameter-intensive
nature hampers deployment in resource-constrained settings. In this paper, we
introduce a novel concept that utilizes the column space and row space of
weight matrices, allowing a substantial reduction in model parameters without
compromising performance. Leveraging this paradigm, we achieve
parameter-efficient deep learning models. Our approach applies to both
Bottleneck and Attention layers, effectively halving the parameters while
incurring only minor performance degradation. …
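The abstract does not spell out the construction, but one reading of "column space and row space of weight matrices" is a factored parameterization W = BC, where B spans the column space and C the row space, with a rank below the full dimension. The sketch below illustrates that idea in PyTorch; the class name `SubspaceLinear`, the rank choice, and the initialization are assumptions for illustration, not the authors' exact method.

```python
import torch
import torch.nn as nn

class SubspaceLinear(nn.Module):
    """Linear layer whose weight W = B @ C is parameterized by a
    column-space factor B and a row-space factor C (illustrative sketch)."""

    def __init__(self, in_features: int, out_features: int, rank: int):
        super().__init__()
        # B: (out_features, rank) -- basis for the column space of W.
        self.B = nn.Parameter(torch.randn(out_features, rank) * 0.02)
        # C: (rank, in_features) -- basis for the row space of W.
        self.C = nn.Parameter(torch.randn(rank, in_features) * 0.02)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Equivalent to x @ (B @ C).T + bias, computed without forming W.
        return (x @ self.C.T) @ self.B.T + self.bias

# Parameter count: rank * (in + out) instead of in * out.
# A 1024x1024 layer at rank 256 stores ~0.52M instead of ~1.05M weights,
# consistent with the roughly halved parameter count the abstract reports.
layer = SubspaceLinear(1024, 1024, rank=256)
y = layer(torch.randn(8, 1024))  # shape (8, 1024)
```

The same factorization applies directly to the projection matrices inside Attention layers and to the 1x1 convolutions of Bottleneck blocks, since both reduce to dense matrix multiplies.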

