Aug. 29, 2023, 8:45 p.m. | /u/win10240

Neural Networks, Deep Learning and Machine Learning www.reddit.com

Surely something like this has been tried, but here's the setup in my head. Tell me if it's crazy, or just what you think.

Given an input vector X, compute a hidden layer as usual, but instead of applying an activation function, pair up neighboring dimensions of the hidden layer's output vector and rotate each pair about the origin in 2D. Surely this would give some kind of nonlinearity? The amount each pair is rotated can be set by a trainable variable. Of course this requires your …
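For concreteness, here's a minimal PyTorch sketch of one way to read the idea: pair adjacent coordinates of the hidden layer's output and rotate each pair by its own trainable angle in place of an activation. The layer name, the zero initialization of the angles, and the toy sizes are just illustrative choices, not anything from the original post.

```python
import torch
import torch.nn as nn

class PairRotation(nn.Module):
    """Pairs up neighboring dimensions of its input and rotates each 2-D pair
    about the origin by its own trainable angle, used here in place of an
    activation function."""

    def __init__(self, dim):
        super().__init__()
        assert dim % 2 == 0, "need an even number of dimensions to pair up"
        # one trainable rotation angle per pair of neighboring dimensions
        self.theta = nn.Parameter(torch.zeros(dim // 2))

    def forward(self, h):
        # (batch, dim) -> (batch, dim//2, 2), pairing dims (0,1), (2,3), ...
        x, y = h.view(h.shape[0], -1, 2).unbind(dim=-1)
        c, s = torch.cos(self.theta), torch.sin(self.theta)
        # standard 2-D rotation of each pair by its angle
        xr = c * x - s * y
        yr = s * x + c * y
        return torch.stack((xr, yr), dim=-1).flatten(start_dim=1)

# toy hidden layer with the rotation where the activation would normally go
layer = nn.Sequential(nn.Linear(8, 16), PairRotation(16))
out = layer(torch.randn(4, 8))   # shape (4, 16)
```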


Software Engineer for AI Training Data (School Specific)

@ G2i Inc | Remote

Software Engineer for AI Training Data (Python)

@ G2i Inc | Remote

Software Engineer for AI Training Data (Tier 2)

@ G2i Inc | Remote

Data Engineer

@ Lemon.io | Remote: Europe, LATAM, Canada, UK, Asia, Oceania

Artificial Intelligence – Bioinformatic Expert

@ University of Texas Medical Branch | Galveston, TX

Lead Developer (AI)

@ Cere Network | San Francisco, US