Feb. 9, 2024, 5:44 a.m. | Shuo Wang Keke Gai Jing Yu Liehuang Zhu Kim-Kwang Raymond Choo Bin Xiao

cs.LG updates on arXiv.org

Vertical federated learning (VFL) has garnered significant attention because it allows clients to train machine learning models collaboratively without sharing local data, thereby protecting each client's private data. However, existing VFL methods face challenges when dealing with heterogeneous local models among participants, which affects optimization convergence and generalization. To address this challenge, this paper proposes a novel approach called Vertical federated learning for training multiple Heterogeneous models (VFedMH). VFedMH focuses on aggregating the local embeddings of each participant's knowledge during …
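To make the setting concrete, here is a minimal sketch (not the authors' implementation) of vertical federated learning with heterogeneous local models: each passive party holds its own vertical slice of the features and its own encoder architecture, and only local embeddings are sent to an active party for aggregation. The summation aggregation, party classes, and dimensions below are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of embedding aggregation in VFL with heterogeneous local models.
# Assumptions: each passive party owns a vertical feature slice and its own encoder;
# the active party sums the embeddings before its prediction head (paper's rule may differ).
import torch
import torch.nn as nn


class PassiveParty(nn.Module):
    """Holds a feature slice and an arbitrary (heterogeneous) local encoder."""

    def __init__(self, encoder: nn.Module):
        super().__init__()
        self.encoder = encoder

    def forward(self, local_features: torch.Tensor) -> torch.Tensor:
        # Only the embedding leaves the party, never the raw features.
        return self.encoder(local_features)


class ActiveParty(nn.Module):
    """Aggregates participants' embeddings and produces the prediction."""

    def __init__(self, embed_dim: int, num_classes: int):
        super().__init__()
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, embeddings: list) -> torch.Tensor:
        # Summation is one simple aggregation choice for illustration.
        aggregated = torch.stack(embeddings, dim=0).sum(dim=0)
        return self.head(aggregated)


# Toy usage: two parties with deliberately different architectures,
# each seeing a different 8-feature slice of the same 32 samples.
embed_dim = 16
party_a = PassiveParty(nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, embed_dim)))
party_b = PassiveParty(nn.Linear(8, embed_dim))  # shallower model than party_a
active = ActiveParty(embed_dim, num_classes=2)

x_a, x_b = torch.randn(32, 8), torch.randn(32, 8)
logits = active([party_a(x_a), party_b(x_b)])
print(logits.shape)  # torch.Size([32, 2])
```

Because only fixed-size embeddings cross party boundaries, the parties' local architectures can differ freely, which is the heterogeneity the abstract highlights.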

