April 23, 2024, 4:44 a.m. | Daniel Coquelin, Katharina Flügel, Marie Weiel, Nicholas Kiefer, Charlotte Debus, Achim Streit, Markus Götz

cs.LG updates on arXiv.org

arXiv:2401.08505v2 Announce Type: replace
Abstract: This study explores the learning dynamics of neural networks by analyzing the singular value decomposition (SVD) of their weights throughout training. Our investigation reveals that an orthogonal basis within each multidimensional weight's SVD representation stabilizes during training. Building upon this, we introduce Orthogonality-Informed Adaptive Low-Rank (OIALR) training, a novel training method exploiting the intrinsic orthogonality of neural networks. OIALR seamlessly integrates into existing training workflows with minimal accuracy loss, as demonstrated by benchmarking on various …

