The Impact of Differential Feature Under-reporting on Algorithmic Fairness
May 6, 2024, 4:43 a.m. | Nil-Jana Akpinar, Zachary C. Lipton, Alexandra Chouldechova
cs.LG updates on arXiv.org
Abstract: Predictive risk models in the public sector are commonly developed from administrative data that is more complete for subpopulations that rely more heavily on public services. In the United States, for instance, information on health care utilization is routinely available to government agencies for individuals covered by Medicaid and Medicare, but not for the privately insured. Critiques of public sector algorithms have identified such differential feature under-reporting as a driver of disparities in algorithmic decision-making. …
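The mechanism the abstract describes can be sketched with a small, purely illustrative simulation (all names and parameters below are hypothetical, not from the paper): two groups share the same true utilization distribution, but the feature is only recorded for the group that relies on public services, so a naive risk score built on the recorded feature flags them far more often.

```python
# Hypothetical illustration of differential feature under-reporting.
# All distributions and thresholds are assumptions for the sketch,
# not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Group 0: supported by public services (utilization is recorded).
# Group 1: privately insured (utilization is under-reported as 0).
group = rng.integers(0, 2, n)
true_utilization = rng.poisson(3, n)  # identical distribution for both groups

# Administrative data only captures utilization for group 0.
observed = np.where(group == 0, true_utilization, 0)

# Naive risk score: proportional to the *observed* feature.
risk = observed.astype(float)

# Flag the top quartile as "high risk".
threshold = np.quantile(risk, 0.75)
flag = risk > threshold

rate_group0 = flag[group == 0].mean()  # substantial flagging rate
rate_group1 = flag[group == 1].mean()  # essentially never flagged
```

Even though both groups are identical in the underlying signal, the under-reported group is almost never flagged, so the disparity is an artifact of data collection rather than of behavior.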