Sept. 26, 2022, 1:12 a.m. | Ravin Kumar

cs.LG updates on arXiv.org

Activation functions introduce non-linearity into deep neural networks.
This non-linearity helps the networks learn faster and more efficiently from
the dataset. In deep learning, many activation functions have been developed
and used depending on the type of problem statement. ReLU's variants, SWISH,
and MISH are go-to activation functions. The MISH function is considered to
have performance similar to or even better than SWISH, and much better than
ReLU. In this paper, we propose an activation function named APTx which
behaves similarly to MISH, …
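Below is a minimal NumPy sketch of the activation functions the abstract compares. The SWISH and MISH definitions are standard; the APTx form and its default parameters (alpha, beta, gamma) are assumptions made for illustration, since the abstract is truncated here and does not state the formula.

```python
import numpy as np

def swish(x, beta=1.0):
    # SWISH(x) = x * sigmoid(beta * x)
    return x / (1.0 + np.exp(-beta * x))

def mish(x):
    # MISH(x) = x * tanh(softplus(x)), where softplus(x) = ln(1 + e^x)
    return x * np.tanh(np.log1p(np.exp(x)))

def aptx(x, alpha=1.0, beta=1.0, gamma=0.5):
    # Assumed form: APTx(x) = (alpha + tanh(beta * x)) * gamma * x
    # With these illustrative defaults the curve closely tracks MISH
    # while using a single tanh instead of tanh(softplus(x)).
    return (alpha + np.tanh(beta * x)) * gamma * x

if __name__ == "__main__":
    x = np.linspace(-5.0, 5.0, 11)
    print("mish :", np.round(mish(x), 3))
    print("aptx :", np.round(aptx(x), 3))
```

Printing both curves over the same grid gives a quick visual check that the assumed APTx parameters produce a MISH-like shape.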

arxiv deep learning function relu swish variants
