Web: http://arxiv.org/abs/2209.07124

Sept. 16, 2022, 1:11 a.m. | Elia Guerra, Francesc Wilhelmi, Marco Miozzo, Paolo Dini

cs.LG updates on arXiv.org

Federated learning (FL) is one of the most appealing alternatives to the
standard centralized learning paradigm, allowing a heterogeneous set of devices
to train a machine learning model without sharing their raw data. However, FL
requires a central server to coordinate the learning process, thus introducing
potential scalability and security issues. In the literature, server-less FL
approaches like gossip federated learning (GFL) and blockchain-enabled
federated learning (BFL) have been proposed to mitigate these issues. In this
work, we propose a complete …
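To make the server-less idea concrete, here is a minimal sketch of gossip-style model averaging. This is a hypothetical illustration, not the protocol from the paper: nodes hold local model parameters and repeatedly average them with a randomly chosen peer, with no central server coordinating the exchange.

```python
import random

def gossip_round(models, rng):
    # Pick two distinct nodes and replace both of their parameter
    # vectors with the pairwise average; this is the peer-to-peer
    # exchange that replaces the central aggregation server.
    i, j = rng.sample(range(len(models)), 2)
    avg = [(a + b) / 2 for a, b in zip(models[i], models[j])]
    models[i] = list(avg)
    models[j] = list(avg)

rng = random.Random(0)
# Three nodes start from different locally trained models
# (here, toy 2-parameter vectors).
models = [[1.0, 2.0], [3.0, 4.0], [5.0, 0.0]]
for _ in range(50):
    gossip_round(models, rng)
# Pairwise averaging preserves the parameter sum, so repeated gossip
# rounds drive every node toward the global mean model [3.0, 2.0].
```

In a real GFL system each node would also run local training steps between gossip exchanges; this sketch isolates only the decentralized averaging that removes the coordinating server.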

