Toward TransfORmers: Revolutionizing the Solution of Mixed Integer Programs with Transformers
Feb. 22, 2024, 5:42 a.m. | Joshua F. Cooper, Seung Jin Choi, I. Esra Buyuktahtakin
cs.LG updates on arXiv.org arxiv.org
Abstract: In this study, we introduce a deep learning framework that employs a transformer model to address the challenges of mixed-integer programs, focusing on the Capacitated Lot Sizing Problem (CLSP). Our approach, to our knowledge, is the first to use transformers to predict the binary variables of a mixed-integer programming (MIP) problem. It harnesses the encoder-decoder transformer's ability to process sequential data, making it well-suited for predicting binary variables indicating production …
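To make the problem setting concrete, here is a minimal sketch of a toy CLSP instance. The binary setup variables `y_t` derived below are the kind of variables the paper's transformer is trained to predict; all data values are hypothetical, and the brute-force enumeration merely stands in for a MIP solver.

```python
from itertools import product

# Toy CLSP instance (hypothetical data, for illustration only).
demand   = [4, 6, 3]   # d_t: demand in each period
capacity = 8           # C:   per-period production capacity
setup    = 5.0         # s:   fixed cost incurred whenever y_t = 1
hold     = 1.0         # h:   holding cost per unit carried to next period

def evaluate(x):
    """Return the cost of production plan x, or None if infeasible."""
    inv, cost = 0, 0.0
    for xt, dt in zip(x, demand):
        inv += xt - dt
        if inv < 0:                      # demand must be met on time
            return None
        cost += setup * (xt > 0) + hold * inv
    return cost

# Enumerate all integer production plans within capacity.
best_cost, best_x = min(
    (evaluate(x), x)
    for x in product(range(capacity + 1), repeat=len(demand))
    if evaluate(x) is not None
)
best_y = [int(xt > 0) for xt in best_x]  # binary setup variables y_t
print(best_cost, best_x, best_y)
```

In a MIP formulation, the continuous quantities `x_t` and binary setups `y_t` are linked by `x_t <= C * y_t`; fixing the `y_t` (here, the transformer's prediction) reduces the remaining problem to an easy continuous one, which is the appeal of predicting the binaries.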