Web: http://arxiv.org/abs/2105.05231

Jan. 28, 2022, 2:10 a.m. | Animesh Sakorikar, Lele Wang

stat.ML updates on arXiv.org

Gradient coding is a coding-theoretic framework that provides robustness against
slow or unresponsive machines, known as stragglers, in distributed machine
learning applications. Recently, Kadhe et al. proposed a gradient code based on
a combinatorial design, called balanced incomplete block design (BIBD), which
is shown to outperform many existing gradient codes in worst-case adversarial
straggling scenarios. However, parameters for which such BIBD constructions
exist are very limited. In this paper, we aim to overcome such limitations and
construct gradient codes …
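To illustrate the straggler-tolerance idea the abstract describes, here is a minimal sketch of a *fractional-repetition* gradient code in the style of Tandon et al., a simpler precursor to the BIBD-based construction of Kadhe et al. (which is not reproduced here). Each block of s+1 data partitions is replicated across a group of s+1 workers, so the master can recover the full gradient sum from any n - s workers. All function names are illustrative, and partial gradients are modeled as scalars for brevity.

```python
# Sketch of a fractional-repetition gradient code (after Tandon et al.);
# NOT the BIBD construction of Kadhe et al. All names are illustrative.

def assign_partitions(n_workers, s):
    """Replicate each block of s+1 data partitions on a group of s+1 workers."""
    assert n_workers % (s + 1) == 0, "scheme assumes (s+1) divides n"
    return [list(range((w // (s + 1)) * (s + 1), (w // (s + 1)) * (s + 1) + s + 1))
            for w in range(n_workers)]

def encode(gradients, parts):
    """Each worker sends the sum of the partial gradients it was assigned."""
    return sum(gradients[p] for p in parts)

def decode(results, n_workers, s):
    """Recover the full gradient sum from the coded results of any n - s workers."""
    total = 0.0
    for g in range(n_workers // (s + 1)):
        group = range(g * (s + 1), (g + 1) * (s + 1))
        alive = [w for w in group if w in results]
        assert alive, "more than s stragglers hit one replication group"
        total += results[alive[0]]  # any surviving replica of the group suffices
    return total

# n = 4 workers, tolerating s = 1 straggler; four scalar partial gradients.
gradients = [1.0, 2.0, 3.0, 4.0]
assignments = assign_partitions(4, s=1)
# Workers 1 and 3 straggle; the master hears only from workers 0 and 2.
results = {w: encode(gradients, assignments[w]) for w in (0, 2)}
print(decode(results, 4, s=1))  # → 10.0, the full gradient sum
```

Because every partition is replicated s+1 times, any s stragglers leave at least one live worker per replication group, which is what the decoder relies on. The BIBD approach in the paper pursues the same goal with more balanced assignments for adversarial straggling patterns.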

