Jan. 20, 2022, 2:11 a.m. | Huanle Zhang, Mi Zhang, Xin Liu, Prasant Mohapatra, Michael DeLucia

cs.LG updates on arXiv.org

Federated learning (FL) hyper-parameters significantly affect training
overheads in terms of computation time, transmission time, computation load,
and transmission load. However, the current practice of manually selecting FL
hyper-parameters places a heavy burden on FL practitioners, since different
applications have different training preferences. In this paper, we propose
FedTune, an automatic FL hyper-parameter tuning algorithm tailored to
applications' diverse system requirements for FL training. FedTune is
lightweight and flexible, achieving a 4.18%-22.48% improvement across different
datasets compared to fixed FL …
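The core idea of preference-driven tuning can be sketched as follows. This is a hypothetical illustration, not FedTune's actual algorithm: the `simulate_round` cost model, the candidate hyper-parameter (clients per round), and the weight values are all assumptions made for the example; only the four overhead dimensions (computation time, transmission time, computation load, transmission load) come from the abstract.

```python
# Hypothetical sketch: pick the FL hyper-parameter value that minimizes a
# preference-weighted sum of the four system overheads named in the abstract.

def simulate_round(num_clients: int) -> dict:
    """Toy cost model (an assumption, not from the paper): more clients per
    round shortens wall-clock computation via parallelism but increases
    transmission time and total device/network load."""
    return {
        "comp_time": 100.0 / num_clients,   # parallelism shortens computation time
        "trans_time": 10.0 + num_clients,   # more uploads lengthen transmission time
        "comp_load": 5.0 * num_clients,     # total device computation grows
        "trans_load": 2.0 * num_clients,    # total bytes transmitted grow
    }

def preference_score(costs: dict, weights: dict) -> float:
    """Lower is better; the weights encode the application's training preferences."""
    return sum(weights[k] * costs[k] for k in weights)

def tune(candidates, weights):
    """Return the candidate client count with the lowest weighted overhead."""
    return min(candidates, key=lambda n: preference_score(simulate_round(n), weights))

# A time-sensitive application weights the time terms heavily...
time_first = {"comp_time": 1.0, "trans_time": 1.0, "comp_load": 0.01, "trans_load": 0.01}
# ...while a load-sensitive one (e.g. battery-constrained devices) weights load.
load_first = {"comp_time": 0.01, "trans_time": 0.01, "comp_load": 1.0, "trans_load": 1.0}

candidates = [2, 5, 10, 20]
print(tune(candidates, time_first))  # → 10 (favors parallelism to cut time)
print(tune(candidates, load_first))  # → 2  (favors few clients to cut load)
```

Different weight vectors select different hyper-parameter values from the same candidates, which is the sense in which tuning can be "tailored to applications' diverse system requirements".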
