Aug. 10, 2023, 4:43 a.m. | Gokulprasath R

cs.LG updates on arXiv.org

Deep learning has revolutionized fields such as computer vision, natural
language processing, and speech recognition. However, backpropagation, the
standard method for training deep neural networks, faces challenges such as
computational overhead and vanishing gradients. In this paper, we propose a
novel instant parameter update methodology that eliminates the need for
computing gradients at each layer. Our approach accelerates learning, avoids
the vanishing gradient problem, and outperforms state-of-the-art methods on
benchmark datasets. This research presents a promising direction for efficient
and …
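The vanishing-gradient problem the abstract refers to can be illustrated with a short chain-rule calculation. This is a standard textbook sketch, not the paper's proposed update rule: in a deep chain of sigmoid layers, the gradient of the output with respect to the input is a product of per-layer derivative factors, each at most 0.25, so it shrinks roughly exponentially with depth.

```python
import math

def sigmoid(x):
    """Logistic sigmoid activation."""
    return 1.0 / (1.0 + math.exp(-x))

def chain_gradient(depth, x=0.5, w=1.0):
    """Gradient of y = sigmoid(w * sigmoid(w * ... sigmoid(w * x))) w.r.t. x,
    computed by the chain rule as a running product of per-layer factors.
    Each factor is w * sigmoid'(z), and sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z))
    is at most 0.25, so the product decays with depth."""
    grad = 1.0
    a = x
    for _ in range(depth):
        z = w * a
        a = sigmoid(z)
        grad *= w * a * (1.0 - a)  # chain-rule factor for this layer
    return grad

if __name__ == "__main__":
    for d in (1, 10, 50):
        print(f"depth {d:3d}: gradient magnitude {chain_gradient(d):.3e}")
```

Running this shows the input-to-output gradient collapsing toward zero as depth grows, which is why methods that avoid per-layer gradient computation are of interest.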

