Stability and Performance Analysis of Discrete-Time ReLU Recurrent Neural Networks
May 9, 2024, 4:42 a.m. | Sahel Vahedi Noori, Bin Hu, Geir Dullerud, Peter Seiler
cs.LG updates on arXiv.org
Abstract: This paper presents sufficient conditions for the stability and $\ell_2$-gain performance of recurrent neural networks (RNNs) with ReLU activation functions. These conditions are derived by combining Lyapunov/dissipativity theory with Quadratic Constraints (QCs) satisfied by repeated ReLUs. We write a general class of QCs for repeated ReLUs using known properties of the scalar ReLU. Our stability and performance condition uses these QCs along with a "lifted" representation of the ReLU RNN. We show that the positive …
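As background (a minimal sketch of the standard QC framework, not reproduced from the paper): QCs for repeated ReLUs are typically built from elementary properties of the scalar ReLU $\phi(x) = \max(0, x)$, which hold for all $x \in \mathbb{R}$:

$$\phi(x) \ge 0, \qquad \phi(x) \ge x, \qquad \phi(x)\bigl(\phi(x) - x\bigr) = 0, \qquad \phi(\alpha x) = \alpha\,\phi(x) \ \text{for all } \alpha \ge 0.$$

When $\phi$ is applied elementwise to $w \in \mathbb{R}^n$ with output $v = \phi(w)$, products of these relations, weighted by nonnegative multipliers collected in $\lambda$, combine into quadratic constraints of the form

$$\begin{bmatrix} w \\ v \end{bmatrix}^{\top} M(\lambda) \begin{bmatrix} w \\ v \end{bmatrix} \ge 0,$$

and any such $M(\lambda)$ can be folded into a Lyapunov/dissipativity condition in place of the nonlinearity. The general QC class and the "lifted" RNN representation referenced in the abstract are developed in the paper itself.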