Feb. 29, 2024, 5:41 a.m. | Ha Min Son, Moon Hyun Kim, Tai-Myoung Chung, Chao Huang, Xin Liu

cs.LG updates on arXiv.org

arXiv:2402.18372v1 Announce Type: new
Abstract: Federated learning is a promising framework for training neural networks with widely distributed data. However, performance degrades heavily with heterogeneously distributed data. Recent work has shown that this is because the final layer of the network is the most prone to local bias, with some works finding success in freezing the final layer as an orthogonal classifier. We investigate the training dynamics of the classifier by applying SVD to its weights, motivated by the observation that freezing weights results …
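The abstract names two concrete techniques: inspecting the classifier's training dynamics via SVD of its weight matrix, and freezing the final layer as a fixed orthogonal classifier. Below is a minimal PyTorch sketch of both ideas; the network shape, dimensions, and the QR-based orthogonal initialization are illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn as nn

# Hypothetical small network standing in for a federated client model.
model = nn.Sequential(
    nn.Linear(32, 64),
    nn.ReLU(),
    nn.Linear(64, 10),  # final classifier layer, the one most prone to local bias
)

# Inspect the classifier's weights via SVD: the spread of singular values
# indicates how anisotropic (locally biased) the classifier has become.
W = model[-1].weight.detach()                      # shape (10, 64)
U, S, Vh = torch.linalg.svd(W, full_matrices=False)
print("singular values:", S)

# Freeze the final layer as a fixed orthogonal classifier: replace its
# weights with random orthonormal rows and disable gradient updates.
with torch.no_grad():
    Q, _ = torch.linalg.qr(torch.randn(64, 10))    # columns of Q are orthonormal
    model[-1].weight.copy_(Q.T)                    # rows of the weight matrix are orthonormal
    model[-1].bias.zero_()
model[-1].weight.requires_grad_(False)
model[-1].bias.requires_grad_(False)
```

With the classifier frozen, only the feature extractor (the earlier layers) receives gradients during local training, which is the usual motivation for fixed-classifier schemes in heterogeneous federated settings.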

