April 11, 2024, 4:42 a.m. | Dedi Wang, Yihang Wang, Luke Evans, Pratyush Tiwary

cs.LG updates on arXiv.org

arXiv:2209.00905v4 Announce Type: replace
Abstract: While representation learning has been central to the rise of machine learning and artificial intelligence, a key problem remains in making the learned representations meaningful. For this, the typical approach is to regularize the learned representation through prior probability distributions. However, such priors are usually unavailable or are ad hoc. To deal with this, recent efforts have shifted towards leveraging the insights from physical principles to guide the learning process. In this spirit, we propose …
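As context for the abstract's point about regularizing learned representations through prior probability distributions, here is a minimal sketch of that standard approach (not the method proposed in the paper): a VAE-style encoder whose Gaussian representation is penalized by its KL divergence from a standard normal prior. The network sizes and names are illustrative assumptions.

    # Minimal sketch: prior-based regularization of a learned representation.
    # The representation q(z|x) is pulled toward a standard normal prior via a KL term.
    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        """Maps inputs to the mean and log-variance of a diagonal Gaussian representation."""
        def __init__(self, in_dim=32, latent_dim=2):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU())
            self.mu = nn.Linear(64, latent_dim)
            self.logvar = nn.Linear(64, latent_dim)

        def forward(self, x):
            h = self.net(x)
            return self.mu(h), self.logvar(h)

    def kl_to_standard_normal(mu, logvar):
        """KL(q(z|x) || N(0, I)) for a diagonal Gaussian q, averaged over the batch."""
        return 0.5 * torch.mean(torch.sum(logvar.exp() + mu**2 - 1.0 - logvar, dim=-1))

    # Usage: the KL term is added to the task/reconstruction loss as the regularizer.
    enc = Encoder()
    x = torch.randn(8, 32)
    mu, logvar = enc(x)
    reg = kl_to_standard_normal(mu, logvar)

The choice of a standard normal prior here is exactly the kind of ad hoc assumption the abstract criticizes; the paper's contribution is to motivate the regularization from physical principles instead.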
