April 22, 2024, 4:42 a.m. | Edward A. Small, Kacper Sokol, Daniel Manning, Flora D. Salim, Jeffrey Chan

cs.LG updates on arXiv.org

arXiv:2304.09779v3 Announce Type: replace
Abstract: Group fairness is achieved by equalising prediction distributions between protected sub-populations; individual fairness requires treating similar individuals alike. These two objectives, however, are incompatible when a scoring model is calibrated through discontinuous probability functions, where individuals can be randomly assigned an outcome determined by a fixed probability. This procedure may give two similar individuals from the same protected group markedly different classification odds, a clear violation of individual fairness. Assigning unique …

Tags: arXiv, cs.LG, cs.CY, math.OC, math.PR, fairness, calibration, post-processing
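To make the incompatibility concrete, below is a minimal, hypothetical sketch (not the paper's method) of a randomised post-processing rule with a discontinuous acceptance-probability function; the function name acceptance_probability, the threshold, and the mixing_rate value are illustrative assumptions only.

```python
def acceptance_probability(score: float,
                           threshold: float = 0.5,
                           mixing_rate: float = 0.3) -> float:
    """Hypothetical randomised post-processing rule.

    Scores at or above the threshold are always accepted; scores below it
    are accepted with a fixed probability (the mixing rate), e.g. chosen so
    that one group's acceptance rate matches another's. The resulting
    probability function is discontinuous at the threshold.
    """
    return 1.0 if score >= threshold else mixing_rate


# Two near-identical individuals from the same protected group.
low, high = 0.4999, 0.5001
print(acceptance_probability(low))   # 0.3 -> accepted only ~30% of the time
print(acceptance_probability(high))  # 1.0 -> always accepted
# A negligible score difference yields very different classification odds,
# which is the individual-fairness violation described in the abstract.
```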
