Neural Networks with A La Carte Selection of Activation Functions. (arXiv:2206.12166v1 [cs.NE])
June 27, 2022, 1:10 a.m. | Moshe Sipper
cs.LG updates on arXiv.org arxiv.org
Activation functions (AFs), which are pivotal to the success (or failure) of
a neural network, have received increased attention in recent years, with
researchers seeking to design novel AFs that improve some aspect of network
performance. In this paper we take another direction, wherein we combine a slew
of known AFs into successful architectures, proposing three methods to do so
beneficially: 1) generate AF architectures at random, 2) use Optuna, an
automatic hyper-parameter optimization software framework, with a
Tree-structured Parzen …
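The first of the three methods — generating an activation-function architecture at random from a pool of known AFs — can be sketched in plain Python. This is an illustrative reconstruction, not the paper's code: the AF pool, function names, and `random_af_architecture` helper below are assumptions chosen for the example.

```python
import math
import random

# A small illustrative pool of known activation functions (the paper
# draws on a larger "slew" of AFs; these four are stand-ins).
ACTIVATIONS = {
    "relu": lambda x: max(0.0, x),
    "tanh": math.tanh,
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),
    "leaky_relu": lambda x: x if x > 0 else 0.01 * x,
}

def random_af_architecture(n_layers, rng=random):
    """Assign one activation function name per layer, chosen uniformly
    at random from the pool (method 1 in the abstract)."""
    return [rng.choice(list(ACTIVATIONS)) for _ in range(n_layers)]

arch = random_af_architecture(4)
print(arch)  # e.g. ['tanh', 'relu', 'sigmoid', 'leaky_relu']
```

Method 2 would replace the uniform `rng.choice` with a guided search — e.g. Optuna's `trial.suggest_categorical` under a Tree-structured Parzen Estimator sampler — so that per-layer choices are informed by validation performance rather than drawn blindly.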