Analysis of the rate of convergence of an over-parametrized convolutional neural network image classifier learned by gradient descent
May 14, 2024, 4:43 a.m. | Michael Kohler, Adam Krzyzak, Benjamin Walter
cs.LG updates on arXiv.org
Abstract: Image classification based on over-parametrized convolutional neural networks with a global average-pooling layer is considered. The weights of the network are learned by gradient descent. A bound on the rate of convergence of the difference between the misclassification risk of the newly introduced convolutional neural network estimate and the minimal possible value is derived.
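The architecture in the abstract combines convolutional feature extraction, a global average-pooling layer, and weights fitted by gradient descent. As a rough illustration of the two latter ingredients (not the paper's estimator, and omitting the convolutional layers and the over-parametrized regime entirely), the following sketch pools each channel's feature map to its mean activation and then trains a linear classifier on the pooled features by plain gradient descent on the logistic loss. All data and dimensions here are invented toy values.

```python
import math
import random

def global_average_pool(feature_maps):
    """Reduce each channel's 2-D feature map to its mean activation."""
    return [sum(sum(row) for row in fm) / (len(fm) * len(fm[0]))
            for fm in feature_maps]

def train(samples, labels, lr=0.5, steps=500):
    """Full-batch gradient descent on the logistic loss."""
    d = len(samples[0])
    w, b = [0.0] * d, 0.0
    n = len(samples)
    for _ in range(steps):
        grad_w, grad_b = [0.0] * d, 0.0
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            err = 1.0 / (1.0 + math.exp(-z)) - y   # sigmoid(z) - label
            for i in range(d):
                grad_w[i] += err * x[i] / n
            grad_b += err / n
        w = [wi - lr * gi for wi, gi in zip(w, grad_w)]
        b -= lr * grad_b
    return w, b

# Toy data: class 1 has bright 2-channel 4x4 feature maps, class 0 dark.
random.seed(0)
def sample(bright):
    base = 0.8 if bright else 0.2
    return [[[base + random.uniform(-0.1, 0.1) for _ in range(4)]
             for _ in range(4)] for _ in range(2)]

maps = [sample(i % 2 == 1) for i in range(20)]
X = [global_average_pool(m) for m in maps]
y = [i % 2 for i in range(20)]

w, b = train(X, y)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0 for x in X]
acc = sum(p == t for p, t in zip(preds, y)) / len(y)
print(f"training accuracy: {acc:.2f}")
```

Because the pooled features of the two toy classes are well separated, the classifier reaches high training accuracy; the paper's contribution is a bound on how fast the *misclassification risk* of the full over-parametrized convolutional estimate approaches the minimal possible value, which no such toy run can exhibit.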