Building Activation Functions for Deep Networks
Towards AI - Medium pub.towardsai.net
A basic component of a deep neural network is the activation function (AF) — a non-linear function that shapes the final output of a node (“neuron”) in the network. Common activation functions include sigmoid, hyperbolic tangent (tanh), and rectified linear unit (ReLU).
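As a hedged sketch (not code from the article), the three activation functions named above can be written as simple scalar Python functions:

```python
import math

def sigmoid(x: float) -> float:
    # Maps any real number into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x: float) -> float:
    # Hyperbolic tangent: maps input into (-1, 1).
    return math.tanh(x)

def relu(x: float) -> float:
    # Rectified linear unit: identity for positives, zero otherwise.
    return max(0.0, x)

print(sigmoid(0.0))  # 0.5
print(tanh(0.0))     # 0.0
print(relu(-3.0))    # 0.0
print(relu(3.0))     # 3.0
```

In a network, one of these is applied element-wise to a neuron's weighted sum of inputs; the non-linearity is what lets stacked layers represent more than a single linear map.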
More often than not, network builders devise new learning algorithms, architectures, and the like, while continuing to rely on standard AFs.
In two recent works I focused on the AFs, asking what …