Nov. 14, 2022, 2:12 a.m. | Vincent Jeanselme, Maria De-Arteaga, Zhe Zhang, Jessica Barrett, Brian Tom

cs.LG updates on arXiv.org

Biases have marked medical history, leading to unequal care affecting
marginalised groups. The patterns of missingness in observational data often
reflect these group discrepancies, but the algorithmic fairness implications of
group-specific missingness are not well understood. Despite its potential
impact, imputation is too often an overlooked preprocessing step. When
explicitly considered, attention is placed on overall performance, ignoring how
this preprocessing can reinforce group-specific inequities. Our work questions
this choice by studying how imputation affects downstream algorithmic fairness.
First, we …

algorithmic fairness, arXiv, fairness, impact, imputation strategies
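
Since only the opening of the abstract is shown, the sketch below is purely illustrative and not the authors' method. On synthetic data with group-specific missingness, it compares two common imputation choices, a pooled-mean imputer and a group-conditional mean imputer, and reports per-group accuracy and false-negative rate of a downstream classifier. The data-generating process, missingness rates, and variable names are all assumptions made for the example.

```python
# Minimal sketch (assumptions, not the paper's experiments): how an imputation
# choice can interact with group-specific missingness and shift per-group
# error rates of a downstream model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 4000
group = rng.integers(0, 2, size=n)           # 0 = majority, 1 = marginalised group
x = rng.normal(loc=group * 1.5, scale=1.0)   # feature distribution differs by group
y = (x + rng.normal(scale=0.5, size=n) > 0.75).astype(int)

# Group-specific missingness: the feature is missing far more often for group 1.
miss = rng.random(n) < np.where(group == 1, 0.6, 0.1)
x_obs = np.where(miss, np.nan, x)

def impute_pooled(x_obs):
    """Replace missing values with the overall observed mean."""
    return np.where(np.isnan(x_obs), np.nanmean(x_obs), x_obs)

def impute_by_group(x_obs, group):
    """Replace missing values with the mean observed within each group."""
    out = x_obs.copy()
    for g in (0, 1):
        m = group == g
        out[m & np.isnan(x_obs)] = np.nanmean(x_obs[m])
    return out

for name, x_imp in [("pooled mean", impute_pooled(x_obs)),
                    ("group mean", impute_by_group(x_obs, group))]:
    clf = LogisticRegression().fit(np.c_[x_imp], y)
    pred = clf.predict(np.c_[x_imp])
    for g in (0, 1):
        m = group == g
        acc = np.mean(pred[m] == y[m])
        fnr = np.mean(pred[m & (y == 1)] == 0)
        print(f"{name:11s} | group {g} | acc={acc:.2f} | fnr={fnr:.2f}")
```

In a setup like this, overall performance can look comparable across imputation strategies while the group facing heavier missingness sees a different false-negative rate, which is the kind of downstream disparity the abstract points to.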
