March 25, 2024, 4:41 a.m. | Dejan Grubisic, Chris Cummins, Volker Seeker, Hugh Leather

cs.LG updates on arXiv.org

arXiv:2403.14714v1 Announce Type: cross
Abstract: We introduce a novel paradigm in compiler optimization: Large Language Models with compiler feedback, used to optimize the code size of LLVM assembly. The model takes unoptimized LLVM IR as input and produces optimized IR, the best optimization passes, and instruction counts for both the unoptimized and optimized IR. We then compile the input with the generated optimization passes and evaluate whether the predicted instruction count is correct, the generated IR is compilable, and whether it corresponds to the compiled …
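The compile-and-verify loop the abstract describes can be sketched against LLVM's `opt` driver. The snippet below is a minimal illustration, not the authors' implementation: the function names, the model-output fields, and the textual instruction-count proxy are all assumptions; only the `opt -passes=... -S` invocation is standard LLVM tooling.

```python
# A minimal sketch of the compile-and-verify feedback step, assuming the model
# returns a pass pipeline string, optimized IR text, and a predicted instruction
# count. Only the `opt` invocation is standard LLVM; everything else here is
# hypothetical illustration, not the paper's code.
import os
import subprocess
import tempfile

def count_instructions(ir_text: str) -> int:
    """Crude textual proxy: count non-label, non-comment lines inside
    function bodies of textual LLVM IR."""
    count, in_func = 0, False
    for line in ir_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("define"):
            in_func = True  # body lines follow until the closing brace
        elif in_func:
            if stripped == "}":
                in_func = False
            elif stripped and not stripped.endswith(":") and not stripped.startswith(";"):
                count += 1
    return count

def verify_model_output(input_ir: str, passes: str,
                        predicted_ir: str, predicted_count: int) -> dict:
    """Recompile the input with the model's pass list and check its three
    claims: the passes are valid, the instruction count, and the IR itself."""
    with tempfile.NamedTemporaryFile("w", suffix=".ll", delete=False) as f:
        f.write(input_ir)
        src = f.name
    # Apply the generated pass pipeline with opt (new pass manager syntax),
    # emitting textual IR to stdout.
    result = subprocess.run(
        ["opt", f"-passes={passes}", "-S", src, "-o", "-"],
        capture_output=True, text=True)
    os.remove(src)
    compiled_ir = result.stdout if result.returncode == 0 else ""
    return {
        "passes_valid": result.returncode == 0,
        "count_correct": count_instructions(compiled_ir) == predicted_count,
        "ir_matches_compiled": predicted_ir.strip() == compiled_ir.strip(),
    }
```

For exact counts one could run LLVM's instcount pass under -stats instead of the textual proxy, and the string comparison of IR is likewise a stand-in for a proper structural equivalence check.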

