April 23, 2024, 4:44 a.m. | Moritz Imfeld (ETH Zurich), Jacopo Graldi (ETH Zurich), Marco Giordano (ETH Zurich), Thomas Hofmann (ETH Zurich), Sotiris Anagnostidis (ETH Zurich), S

cs.LG updates on arXiv.org

arXiv:2310.05719v3 Announce Type: replace
Abstract: Fusion is a technique for merging multiple independently trained neural networks to combine their capabilities. Past attempts have been restricted to fully-connected, convolutional, and residual networks. This paper presents a systematic approach for fusing two or more transformer-based networks by exploiting Optimal Transport to (soft-)align their architectural components. We flesh out an abstraction for layer alignment that can, in principle, generalize to arbitrary architectures, and we apply this to …
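To make the idea of OT-based soft alignment concrete, here is a minimal sketch for a single fully-connected layer: compute a cost between the neurons of two models, solve entropy-regularised OT (Sinkhorn), softly map one model's neurons onto the other's, and average. The function names (`sinkhorn`, `fuse_layer`), the Euclidean cost, and the single-layer scope are illustrative assumptions, not the paper's implementation, which additionally handles transformer components (attention heads, residual streams) and propagates the transport plan across layers.

```python
import numpy as np

def sinkhorn(cost, reg=0.1, n_iter=200):
    """Entropy-regularised OT with uniform marginals via basic Sinkhorn iterations."""
    n, m = cost.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-cost / reg)
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iter):
        u = a / (K @ v)
        v = b / (K.T @ u)
    # Transport plan T: rows sum to a, columns sum to b.
    return u[:, None] * K * v[None, :]

def fuse_layer(w_a, w_b, weight=0.5):
    """Soft-align model B's neurons (rows of w_b) to model A's, then average.

    Simplified single-layer sketch: the cost is the pairwise Euclidean distance
    between neuron weight vectors; a full fusion pipeline would also carry the
    transport plan forward to align the next layer's input dimension.
    """
    cost = np.linalg.norm(w_a[:, None, :] - w_b[None, :, :], axis=-1)
    T = sinkhorn(cost)
    # Barycentric projection: each A-neuron receives a convex combination of B-neurons.
    w_b_aligned = (T / T.sum(axis=1, keepdims=True)) @ w_b
    return weight * w_a + (1.0 - weight) * w_b_aligned

# Toy usage: fuse two randomly initialised 8x4 layers.
rng = np.random.default_rng(0)
fused = fuse_layer(rng.normal(size=(8, 4)), rng.normal(size=(8, 4)))
print(fused.shape)  # (8, 4)
```

The soft (barycentric) mapping is what distinguishes OT fusion from hard permutation matching: each target neuron can blend several source neurons rather than being paired with exactly one.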

arxiv cs.lg fusion stat.ml transformer transport type
