May 15, 2023, 12:42 a.m. | Marianne Defresne, Sophie Barbe, Thomas Schiex

stat.ML updates on arXiv.org

In the ongoing quest for hybridizing discrete reasoning with neural nets,
there is an increasing interest in neural architectures that can learn how to
solve discrete reasoning or optimization problems from natural inputs. In this
paper, we introduce a scalable neural architecture and loss function dedicated
to learning the constraints and criteria of NP-hard reasoning problems
expressed as discrete Graphical Models. Our loss function solves one of the
main limitations of Besag's pseudo-loglikelihood, enabling learning of high
energies. We empirically …
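The pseudo-loglikelihood the abstract refers to replaces the intractable joint likelihood of a graphical model with a sum of per-variable conditional log-probabilities, each of which is cheap to normalize. The sketch below illustrates this on a hypothetical toy pairwise model (the variable names, cost tables, and `energy`/`pseudo_loglik` helpers are illustrative, not from the paper):

```python
import math

# Hypothetical toy pairwise graphical model over 3 binary variables.
# Each table gives the energy (cost) of a pair of values; lower energy
# means a more likely assignment.
pairwise = {
    (0, 1): [[0.0, 1.5], [1.5, 0.0]],  # variables 0 and 1 prefer to agree
    (1, 2): [[0.0, 0.8], [0.8, 0.0]],  # variables 1 and 2 prefer to agree
}
domain = [0, 1]

def energy(x):
    """Total energy of assignment x: sum of all pairwise costs."""
    return sum(t[x[i]][x[j]] for (i, j), t in pairwise.items())

def pseudo_loglik(x):
    """Besag's pseudo-log-likelihood: sum_i log P(x_i | x_{-i}).

    Each conditional only requires normalizing over one variable's
    domain, avoiding the exponential-size partition function.
    """
    pll = 0.0
    for i in range(len(x)):
        # Unnormalized log-score of each candidate value for variable i,
        # holding all other variables fixed.
        scores = [-energy(x[:i] + (v,) + x[i + 1:]) for v in domain]
        log_z = math.log(sum(math.exp(s) for s in scores))
        pll += scores[x[i]] - log_z
    return pll

# An all-agreeing assignment scores higher than a disagreeing one.
print(pseudo_loglik((0, 0, 0)))  # higher (less negative)
print(pseudo_loglik((0, 1, 0)))  # lower
```

The limitation mentioned in the abstract arises because each conditional is normalized locally: assignments that should receive very high (even infinite) energies, such as hard constraint violations, contribute vanishing gradients, which is what the proposed loss function addresses.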
