April 2, 2024, 7:42 p.m. | Wei Cui, Wei Yu

cs.LG updates on arXiv.org

arXiv:2404.00505v1 Announce Type: new
Abstract: In most applications of utilizing neural networks for mathematical optimization, a dedicated model is trained for each specific optimization objective. However, in many scenarios, several distinct yet correlated objectives or tasks often need to be optimized on the same set of problem inputs. Instead of independently training a different neural network for each problem separately, it would be more efficient to exploit the correlations between these objectives and to train multiple neural network models with …
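The abstract describes training multiple models for correlated objectives over the same inputs. One common way to exploit such correlations (a hypothetical illustration, not necessarily the authors' method) is hard parameter sharing: a shared trunk produces a common representation, and lightweight task-specific heads are trained jointly, so gradients from both objectives shape the shared weights. A minimal NumPy sketch, with all names and the toy data invented for illustration:

```python
import numpy as np

# Hypothetical sketch of hard parameter sharing across two correlated
# regression tasks; NOT the method of arXiv:2404.00505, just the general idea.
rng = np.random.default_rng(0)

# Toy problem: shared inputs X, two correlated linear targets.
X = rng.normal(size=(64, 4))
y1 = X @ np.array([1.0, -1.0, 0.5, 0.0])   # task-1 target
y2 = X @ np.array([0.9, -1.1, 0.4, 0.1])   # task-2 target, correlated with task 1

# Shared trunk plus two task-specific heads (all linear for simplicity).
W_shared = rng.normal(scale=0.1, size=(4, 8))
w1 = rng.normal(scale=0.1, size=8)
w2 = rng.normal(scale=0.1, size=8)

def losses():
    h = X @ W_shared                       # shared representation
    return (np.mean((h @ w1 - y1) ** 2),
            np.mean((h @ w2 - y2) ** 2))

l1_init, l2_init = losses()

lr, n = 0.01, len(X)
for _ in range(500):
    h = X @ W_shared
    e1 = h @ w1 - y1                       # per-task residuals
    e2 = h @ w2 - y2
    g1 = 2 * h.T @ e1 / n                  # gradient for task-1 head only
    g2 = 2 * h.T @ e2 / n                  # gradient for task-2 head only
    # Shared-trunk gradient accumulates contributions from BOTH tasks:
    gW = 2 * (X.T @ (np.outer(e1, w1) + np.outer(e2, w2))) / n
    w1 -= lr * g1
    w2 -= lr * g2
    W_shared -= lr * gW

l1_final, l2_final = losses()
```

Because the trunk's gradient sums both task losses, improving the representation for one task can also benefit the correlated task, which is the efficiency argument the abstract makes for not training each network independently.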

