APTx: better activation function than MISH, SWISH, and ReLU's variants used in deep learning. (arXiv:2209.06119v3 [cs.LG] UPDATED)
Sept. 26, 2022, 1:14 a.m. | Ravin Kumar
cs.CV updates on arXiv.org arxiv.org
Activation functions introduce non-linearity into deep neural networks.
This non-linearity helps the network learn faster and more efficiently from
the dataset. In deep learning, many activation functions have been developed and
are used depending on the type of problem. ReLU's variants, SWISH, and MISH are
go-to activation functions. The MISH function is considered to have similar or even
better performance than SWISH, and much better performance than ReLU. In this paper, we
propose an activation function named APTx, which behaves similarly to MISH, …
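
The excerpt is truncated before the APTx definition. As a hedged sketch for comparison only, the snippet below implements the standard SWISH and MISH formulas alongside APTx in the parametric form (alpha + tanh(beta * x)) * gamma * x with defaults alpha = 1, beta = 1, gamma = 1/2, which is how the paper (arXiv:2209.06119) describes it; those parameter names and defaults come from the paper rather than this excerpt and should be checked against the full text.

import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)).
    return np.logaddexp(0.0, x)

def swish(x, beta=1.0):
    # SWISH: x * sigmoid(beta * x).
    return x / (1.0 + np.exp(-beta * x))

def mish(x):
    # MISH: x * tanh(softplus(x)).
    return x * np.tanh(softplus(x))

def aptx(x, alpha=1.0, beta=1.0, gamma=0.5):
    # APTx as described in arXiv:2209.06119: (alpha + tanh(beta * x)) * gamma * x.
    # Parameter names and defaults are assumptions taken from the paper, not from the truncated abstract above.
    return (alpha + np.tanh(beta * x)) * gamma * x

if __name__ == "__main__":
    xs = np.linspace(-4, 4, 9)
    for x in xs:
        print(f"x={x:+.1f}  swish={swish(x):+.4f}  mish={mish(x):+.4f}  aptx={aptx(x):+.4f}")

Note that MISH evaluates a tanh of a softplus per input, while the APTx form above needs only a single tanh, which is consistent with the abstract's claim of MISH-like behaviour at lower computational cost.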
Jobs in AI, ML, Big Data
Lead GNSS Data Scientist
@ Lurra Systems | Melbourne
Senior Machine Learning Engineer (MLOps)
@ Promaton | Remote, Europe
Senior Engineer - Data Science Operations
@ causaLens | London - Hybrid, England, United Kingdom
F0138 - LLM Developer (AI NLP)
@ Ubiquiti Inc. | Taipei
Staff Engineer, Database
@ Nagarro | Gurugram, India
Artificial Intelligence Assurance Analyst
@ Booz Allen Hamilton | USA, VA, McLean (8251 Greensboro Dr)