Decentralized EM to Learn Gaussian Mixtures from Datasets Distributed by Features. (arXiv:2201.09965v1 [cs.LG])
Web: http://arxiv.org/abs/2201.09965
Jan. 26, 2022, 2:10 a.m. | Pedro Valdeira, Cláudia Soares, João Xavier
cs.LG updates on arXiv.org
Expectation Maximization (EM) is the standard method for learning Gaussian
mixtures. Yet its classic, centralized form is often infeasible due to privacy
concerns and computational and communication bottlenecks. Prior work dealt with
data distributed by examples (horizontal partitioning), but we lack a
counterpart for data scattered by features, an increasingly common scheme
(e.g., user profiling with data from multiple entities). To fill this gap, we
provide an EM-based algorithm to fit Gaussian mixtures to Vertically
Partitioned data (VP-EM). In federated …
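For reference, the classic centralized EM baseline the abstract contrasts against can be sketched as follows. This is a minimal standard EM fit for a Gaussian mixture, not the paper's VP-EM algorithm; the function name `em_gmm`, the farthest-point initialization, and the fixed iteration count are illustrative choices, not from the paper.

```python
import numpy as np

def em_gmm(X, k, n_iter=50, seed=0):
    """Classic centralized EM for a k-component Gaussian mixture.

    Illustrative sketch of the standard algorithm; not the paper's VP-EM.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Farthest-point initialization of the means (simple, deterministic
    # given the first pick); uniform weights, full-data covariance.
    mu = [X[rng.integers(n)]]
    for _ in range(1, k):
        dist = np.min([((X - m) ** 2).sum(axis=1) for m in mu], axis=0)
        mu.append(X[np.argmax(dist)])
    mu = np.array(mu)
    sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(k)])
    pi = np.full(k, 1.0 / k)

    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # computed in log space for numerical stability.
        log_r = np.empty((n, k))
        for j in range(k):
            diff = X - mu[j]
            inv = np.linalg.inv(sigma[j])
            _, logdet = np.linalg.slogdet(sigma[j])
            maha = np.einsum('ni,ij,nj->n', diff, inv, diff)
            log_r[:, j] = (np.log(pi[j])
                           - 0.5 * (maha + logdet + d * np.log(2 * np.pi)))
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: update weights, means, and covariances.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - mu[j]
            sigma[j] = ((r[:, j, None] * diff).T @ diff / nk[j]
                        + 1e-6 * np.eye(d))
    return pi, mu, sigma
```

In the centralized version, the E-step needs every feature of every example in one place; the point of VP-EM is to carry out this computation when each party holds only a subset of the feature columns.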