Aug. 29, 2023, 8:45 p.m. | /u/win10240

Neural Networks, Deep Learning and Machine Learning

Surely something like this has been tried, but here's the setup in my head. Tell me if it's crazy, or just what you think.

Given input vector X, compute a hidden layer as usual, but instead of applying an activation function, pair up neighboring dimensions of the hidden-layer vector and rotate each pair about the origin in 2D. Surely this would give some kind of nonlinearity? The amount each pair is rotated can be set by a trainable variable. Of course this requires your …
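A minimal sketch of the pairwise-rotation idea, using NumPy (function name `pairwise_rotate` and the example values are mine, not from the post). One caveat worth noting: if the angles don't depend on the input, each 2-D rotation is itself a linear map, so on its own this wouldn't add nonlinearity; making the angles a function of the input would.

```python
import numpy as np

def pairwise_rotate(h, theta):
    """Rotate neighboring pairs of dimensions of h about the origin in 2D.

    h:     hidden-layer output, shape (d,) with d even
    theta: one angle per pair, shape (d // 2,) -- the trainable variables
    """
    pairs = h.reshape(-1, 2)                        # (d/2, 2): neighboring dims
    c, s = np.cos(theta), np.sin(theta)
    out = np.empty_like(pairs)
    out[:, 0] = c * pairs[:, 0] - s * pairs[:, 1]   # standard 2-D rotation
    out[:, 1] = s * pairs[:, 0] + c * pairs[:, 1]
    return out.reshape(h.shape)

# Rotating the pair (1, 0) by 90 degrees gives (0, 1); the second pair is
# left unrotated. Note each pair's norm is preserved.
h = np.array([1.0, 0.0, 3.0, 4.0])
theta = np.array([np.pi / 2, 0.0])
print(pairwise_rotate(h, theta))   # → approx [0., 1., 3., 4.]
```

Because rotations preserve the norm of each pair, this layer can never saturate or kill activations the way ReLU or tanh can, which is an interesting property in its own right.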

