Graph Expansion in Pruned Recurrent Neural Network Layers Preserve Performance
March 19, 2024, 4:41 a.m. | Suryam Arnav Kalra, Arindam Biswas, Pabitra Mitra, Biswajit Basu
cs.LG updates on arXiv.org
Abstract: The expansion property of a graph refers to its strong connectivity combined with sparseness. It has been reported that deep neural networks can be pruned to a high degree of sparsity while maintaining their performance. Such pruning is essential for performing real-time sequence-learning tasks with recurrent neural networks on resource-constrained platforms. We prune recurrent networks such as RNNs and LSTMs, maintaining a large spectral gap of the underlying graphs and ensuring their …
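The abstract's central idea, pruning a recurrent weight matrix while keeping the connectivity graph a good expander (large spectral gap), can be illustrated with a minimal sketch. This is not the authors' method; it assumes a simple expander construction (a union of random permutation matrices, a standard way to sample near-expander sparse digraphs) and measures the gap directly from the adjacency eigenvalues:

```python
import numpy as np

def spectral_gap(adj):
    """Difference between the two largest eigenvalue magnitudes of the
    adjacency matrix. A large gap indicates good expansion: the graph
    stays strongly connected despite being sparse."""
    mags = np.sort(np.abs(np.linalg.eigvals(adj)))[::-1]
    return mags[0] - mags[1]

def random_sparse_mask(n, d, rng):
    """0/1 mask built as a union of d random permutation matrices,
    giving each unit roughly d incoming connections."""
    mask = np.zeros((n, n))
    for _ in range(d):
        mask[np.arange(n), rng.permutation(n)] = 1.0
    return mask

rng = np.random.default_rng(0)
n, d = 64, 4                       # hidden size, target in-degree
mask = random_sparse_mask(n, d, rng)

# Prune a dense recurrent weight matrix down to ~d/n density
# by zeroing every weight outside the expander mask.
W = rng.standard_normal((n, n))
W_pruned = W * mask

print(f"density: {mask.mean():.3f}, spectral gap: {spectral_gap(mask):.3f}")
```

In a real training setup the mask would be applied to the recurrent kernel of an RNN/LSTM cell at every step; the sketch only shows how sparsity and the spectral gap of the mask's graph can be monitored together.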