July 27, 2022, 1:11 a.m. | Minh Nguyen, Nicolas Wu

cs.LG updates on arXiv.org

Neural networks are typically represented as data structures that are
traversed either through iteration or by manual chaining of method calls.
However, a deeper analysis reveals that structured recursion can be used
instead, so that traversal is directed by the structure of the network itself.
This paper shows how such an approach can be realised in Haskell by encoding
neural networks as recursive data types and expressing their training as
recursion scheme patterns. In turn, we promote a coherent implementation …
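The paper's own code is not included in this excerpt. As a rough, hypothetical sketch of the general idea (all names below are illustrative, not taken from the paper), a feed-forward network can be encoded as the fixed point of a base functor, with the forward pass written as a catamorphism so that traversal is driven by the network's structure rather than by explicit iteration:

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- One fully connected layer: weight matrix (one row per output unit),
-- biases, and an activation function.
data Layer = Layer
  { weights    :: [[Double]]
  , biases     :: [Double]
  , activation :: Double -> Double
  }

-- Base functor of the network type: either the output end,
-- or a layer followed by the rest of the network.
data NetF r
  = OutputF
  | ConsF Layer r
  deriving Functor

-- Fixed point, tying the recursive knot.
newtype Fix f = Fix (f (Fix f))

type Net = Fix NetF

-- Catamorphism: the generic structured-recursion (fold) pattern.
cata :: Functor f => (f a -> a) -> Fix f -> a
cata alg (Fix x) = alg (fmap (cata alg) x)

-- Forward computation of a single layer.
runLayer :: Layer -> [Double] -> [Double]
runLayer (Layer w b act) xs =
  zipWith (\row bias -> act (bias + sum (zipWith (*) row xs))) w b

-- Forward pass as a fold over the network's structure: the algebra
-- says what to do at each constructor, and cata does the traversal.
forward :: Net -> [Double] -> [Double]
forward = cata alg
  where
    alg OutputF        = id
    alg (ConsF l rest) = rest . runLayer l

-- Example: a two-layer network with identity activations.
-- forward example [2,3] evaluates to [5.0].
example :: Net
example = Fix (ConsF l1 (Fix (ConsF l2 (Fix OutputF))))
  where
    l1 = Layer [[1,0],[0,1]] [0,0] id
    l2 = Layer [[1,1]]       [0]   id
```

This sketch covers only the forward pass as one instance of structure-directed traversal; the paper additionally expresses training with recursion scheme patterns, which is not reproduced here.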

Tags: arxiv, networks, neural networks, pl
