Aug. 19, 2022, 1:11 a.m. | Jonathan Vasquez, Xavier Gitiaux, Huzefa Rangwala

cs.LG updates on arXiv.org

Data owners face increasing liability for how the use of their data could
harm under-privileged communities. Stakeholders would like to identify the
characteristics of data that lead algorithms to be biased against particular
demographic groups, defined for example by race, gender, age, and/or
religion. Specifically, we are interested in identifying subsets of the
feature space where the ground-truth response function from features to
observed outcomes differs across demographic groups. To this end, we propose
FORESEE, a FORESt …
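
To make the stated goal concrete, here is a minimal, hedged sketch of the general idea of locating feature-space subsets where observed outcome rates differ across groups, using the leaves of a shallow decision tree as candidate subsets. This is not the paper's FORESEE method; the data, threshold, and tree settings below are illustrative assumptions only.

```python
# Illustrative sketch only: flag regions of feature space where observed
# outcome rates differ across two demographic groups, using the leaves of a
# decision tree as candidate subsets. NOT the FORESEE algorithm from the
# paper; all names, data, and thresholds here are assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic data: two features, a binary group label, and an outcome whose
# relationship to the features diverges across groups in one region.
n = 5000
X = rng.uniform(-1, 1, size=(n, 2))
group = rng.integers(0, 2, size=n)            # stand-in protected attribute
base = (X[:, 0] + X[:, 1] > 0).astype(float)  # shared response function
bias = ((X[:, 0] > 0.5) & (group == 1))       # region where groups diverge
p = np.clip(base - 0.6 * bias, 0, 1)
y = rng.binomial(1, 0.1 + 0.8 * p)

# Partition the feature space with a shallow tree fit on the outcome alone.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=200, random_state=0)
tree.fit(X, y)
leaf_ids = tree.apply(X)

# For each leaf (a hyper-rectangular subset of feature space), compare the
# empirical outcome rate between groups and flag large gaps.
for leaf in np.unique(leaf_ids):
    in_leaf = leaf_ids == leaf
    rate_0 = y[in_leaf & (group == 0)].mean()
    rate_1 = y[in_leaf & (group == 1)].mean()
    if abs(rate_0 - rate_1) > 0.15:           # arbitrary illustrative threshold
        print(f"leaf {leaf}: n={in_leaf.sum()}, "
              f"group-0 rate={rate_0:.2f}, group-1 rate={rate_1:.2f}")
```

Running this prints the leaves whose between-group outcome gap exceeds the chosen threshold; the flagged leaves should correspond to the synthetic region where the response function was made to differ by group.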

arxiv dataset discrimination lg
