June 27, 2022, 1:11 a.m. | Vrishabh Patil, Yonatan Mintz

cs.LG updates on arXiv.org arxiv.org

Artificial Neural Networks (ANNs) are prevalent machine learning models that
are applied across various real-world classification tasks. However, training
ANNs is time-consuming and the resulting models require substantial memory to
deploy. To train more parsimonious ANNs, we propose a novel mixed-integer
programming (MIP) formulation for training fully-connected ANNs. Our
formulations can account for both binary and rectified linear unit (ReLU)
activations, and for the use of a log-likelihood loss. We present numerical
experiments comparing our MIP-based …
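The abstract's mention of ReLU activations inside a MIP suggests the standard big-M linearization. As a sketch only (the authors' exact training formulation may differ), a ReLU output $y = \max(0, x)$ can be encoded with a binary indicator $z$ and a sufficiently large constant $M \ge |x|$:

```latex
% Big-M encoding of y = max(0, x), with binary indicator z:
y \ge x, \qquad y \ge 0, \qquad
y \le x + M(1 - z), \qquad y \le M z, \qquad z \in \{0, 1\}
```

When $z = 1$ the last two constraints force $y \le x$ and the first forces $y \ge x$, so $y = x$ (the active case); when $z = 0$ they force $y \le 0$, so $y = 0$. This is the usual way ReLU units enter MIP models, though formulations tailored to training (rather than verification) may replace or tighten these constraints.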

