July 22, 2023, 12:12 a.m. | Moshe Sipper, Ph.D.

Towards AI - Medium pub.towardsai.net

AI-generated image (craiyon)

A basic component of a deep neural network is the activation function (AF) — a non-linear function that shapes the final output of a node (“neuron”) in the network. Common activation functions include sigmoid, hyperbolic tangent (tanh), and rectified linear unit (ReLU).
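As a quick illustration (not taken from the article itself), the three AFs named above can be sketched in a few lines of NumPy, each applied element-wise to a node's pre-activation value:

```python
import numpy as np

# Minimal sketch of the common activation functions mentioned above.

def sigmoid(x):
    # Squashes inputs into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs into the range (-1, 1)
    return np.tanh(x)

def relu(x):
    # Zeroes out negative inputs, passes positive inputs unchanged
    return np.maximum(0.0, x)

z = np.array([-2.0, -0.5, 0.0, 1.5])  # example pre-activations
print(sigmoid(z))
print(tanh(z))
print(relu(z))
```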

More often than not, network builders devise new learning algorithms, architectures, and so on, while continuing to rely on standard AFs.

In two recent works I focused on AFs, asking what …
