Feb. 25, 2024, 8:46 a.m. | /u/Flankierengeschichte

Machine Learning | www.reddit.com

Deep neural networks operate under the manifold hypothesis: the training data lie on a lower-dimensional manifold of the input space. In particular, the last layer is a linear layer, so for the network to work properly, the target manifold, as represented at that final layer, must have roughly constant curvature, since a linear subspace can only approximate a manifold of consistent curvature (in the case of regression) or separate it with a hyperplane (in the case of classification). That is, the sequence of manifolds generated by gradient …
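As a minimal sketch of why curvature matters for the final linear layer (the dataset, the fixed feature map phi, and all names below are illustrative choices, not from the post): two concentric circles are a pair of curved 1-D manifolds in R², so no hyperplane separates them in the raw coordinates; after lifting with phi(x) = (x₁, x₂, x₁² + x₂²), a stand-in for what the learned hidden layers would do, each class lies on a flat slice and a single linear layer suffices.

```python
import numpy as np

# Two concentric circles: curved 1-D manifolds embedded in R^2.
rng = np.random.default_rng(0)
n = 500
r = np.concatenate([np.full(n, 1.0), np.full(n, 3.0)])  # class radii
theta = rng.uniform(0, 2 * np.pi, 2 * n)
X = np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)
X += rng.normal(scale=0.1, size=X.shape)
y = np.concatenate([np.zeros(n), np.ones(n)])

# Hypothetical feature map standing in for the learned hidden layers:
# it "flattens" the circles onto two parallel slices of R^3.
Phi = np.column_stack([X, (X ** 2).sum(axis=1)])

def linear_accuracy(features, labels):
    """Least-squares linear classifier: a quick stand-in for a final linear layer."""
    A = np.column_stack([features, np.ones(len(features))])  # append bias column
    w, *_ = np.linalg.lstsq(A, 2 * labels - 1, rcond=None)   # targets in {-1, +1}
    return ((A @ w > 0) == labels.astype(bool)).mean()

print("raw coordinates:   ", linear_accuracy(X, y))    # ~0.5, chance level
print("lifted coordinates:", linear_accuracy(Phi, y))  # ~1.0, linearly separable
```

In the raw coordinates the linear classifier sits at chance, because the curved class manifolds wrap around every candidate hyperplane; after the lift, the radius-squared coordinate alone separates the classes, which is the flattening role the post attributes to the layers before the final linear one.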
