Optimizing Performance of Feedforward and Convolutional Neural Networks through Dynamic Activation Functions
Feb. 20, 2024, 5:44 a.m. | Chinmay Rane, Kanishka Tyagi, Michael Manry
cs.LG updates on arXiv.org arxiv.org
Abstract: Deep learning training algorithms have achieved great success in recent years across many fields, including speech, text, image, and video. Ever deeper architectures have been proposed with great success, with ResNet structures reaching around 152 layers. Shallow convolutional neural networks (CNNs) remain an active research area, where some phenomena are still unexplained. The activation functions used in a network are of utmost importance, as they provide its non-linearity. ReLUs are the most commonly used …
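The truncated abstract does not show the paper's specific dynamic activation functions, but the idea of an activation with trainable parameters can be illustrated with a well-known example, the parametric ReLU (PReLU), whose negative-side slope is learned during training rather than fixed. A minimal NumPy sketch (the function names and the choice of PReLU are illustrative assumptions, not the authors' method):

```python
import numpy as np

def relu(x):
    # Standard ReLU: max(0, x), a fixed (static) activation
    return np.maximum(0.0, x)

def prelu(x, alpha):
    # Parametric ReLU: the negative-side slope `alpha` is a trainable
    # parameter, making the activation adaptable during training
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # [0.   0.   0.   1.5]
print(prelu(x, 0.1))  # [-0.2  -0.05  0.    1.5]
```

With `alpha = 0`, PReLU reduces to the plain ReLU; letting the optimizer adjust `alpha` per layer (or per channel) is one simple way a network's non-linearity can itself be optimized.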