Nov. 21, 2022, 2:15 a.m. | Jinrui Yang, Sheilla Njoto, Marc Cheong, Leah Ruppanner, Lea Frermann

cs.CL updates on arXiv.org

Gender discrimination in hiring is a pertinent and persistent bias in
society, and a common motivating example for exploring bias in NLP. However,
the manifestation of gendered language in application materials has received
limited attention. This paper investigates the framing of skills and background
in CVs of self-identified men and women. We introduce a data set of 1.8K
authentic, English-language CVs from the US, covering 16 occupations, allowing
us to partially control for the confound of occupation-specific gender base rates.
We …
