Feb. 22, 2024, 5:42 a.m. | Joshua F. Cooper, Seung Jin Choi, I. Esra Buyuktahtakin

cs.LG updates on arXiv.org

arXiv:2402.13380v1 Announce Type: cross
Abstract: In this study, we introduce a deep learning framework that employs a transformer model to address the challenges of mixed-integer programs, focusing on the Capacitated Lot Sizing Problem (CLSP). Our approach is, to our knowledge, the first to use transformers to predict the binary variables of a mixed-integer programming (MIP) problem. In particular, it harnesses the encoder-decoder transformer's ability to process sequential data, making it well-suited for predicting binary variables indicating production …
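For context, the binary variables the abstract refers to are the setup indicators in the standard single-item CLSP formulation (this formulation is not given in the excerpt; it is the textbook version, with setup cost s_t, unit production cost c_t, holding cost h_t, demand d_t, and capacity C_t):

```latex
\begin{aligned}
\min \quad & \sum_{t=1}^{T} \left( s_t y_t + c_t x_t + h_t I_t \right) \\
\text{s.t.} \quad & I_{t-1} + x_t - I_t = d_t, && t = 1,\dots,T \\
& x_t \le C_t\, y_t, && t = 1,\dots,T \\
& x_t \ge 0,\; I_t \ge 0,\; y_t \in \{0,1\}, && t = 1,\dots,T
\end{aligned}
```

Once the binary setup decisions y_t are predicted (here, by the transformer), the remaining problem in x_t and I_t is a linear program, which is far cheaper to solve than the original MIP.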

