Aug. 29, 2023, 8:45 p.m. | /u/win10240

Neural Networks, Deep Learning and Machine Learning www.reddit.com

Surely something like this has been tried, but here’s the setup in my head. Tell me if it’s crazy or what you think.

Given an input vector X, compute a hidden layer, but instead of applying an activation function, pair up neighboring dimensions of the resulting hidden vector and rotate each pair about the origin in 2D. Surely this would give some kind of nonlinearity? The amount each pair is rotated can be a trainable parameter. Of course this requires your …
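The pairing-and-rotation described above can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the poster's code: `rotation_activation` and its signature are assumptions, the hidden width is assumed even, and `theta` stands in for the trainable per-pair angles.

```python
import numpy as np

def rotation_activation(h, theta):
    """Pair up neighboring dimensions of h and rotate each pair
    about the origin in 2D by angle theta[i] (trainable in practice).

    h:     array of shape (..., 2k), the hidden-layer output
    theta: array of shape (k,), one rotation angle per pair
    """
    x = h[..., 0::2]          # first element of each pair
    y = h[..., 1::2]          # second element of each pair
    c, s = np.cos(theta), np.sin(theta)
    xr = c * x - s * y        # standard 2D rotation of each pair
    yr = s * x + c * y
    out = np.empty_like(h)
    out[..., 0::2] = xr
    out[..., 1::2] = yr
    return out
```

One thing worth noting: for a fixed theta this operation is itself linear (a block-diagonal rotation matrix), so any nonlinearity would have to come from making the angle depend on the input rather than treating it as a free weight.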

