Feb. 23, 2022, 2:12 a.m. | Thilo Hagendorff, Leonie Bossert, Tse Yip Fai, Peter Singer

cs.LG updates on arXiv.org

Massive efforts are made to reduce biases in both data and algorithms in
order to render AI applications fair. These efforts are propelled by various
high-profile cases where biased algorithmic decision-making caused harm to
women, people of color, and other minority groups. However, the AI fairness field still
suffers from a blind spot: its insensitivity to discrimination against
animals. This paper is the first to describe the 'speciesist bias' and
investigate it in several different AI systems. Speciesist biases are learned …
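The kind of bias probe alluded to here can be illustrated with a small association test over word embeddings. The sketch below is not the paper's methodology; it is a minimal, WEAT-style illustration under toy assumptions (a hand-made `embeddings` dictionary and made-up attribute word lists) of how one might compare whether animal terms sit closer to negative than to positive attributes relative to human terms.

```python
import numpy as np

# Hypothetical toy embeddings; a real probe would load pretrained vectors
# (e.g. GloVe or word2vec). The values here are illustrative only.
embeddings = {
    "dog":      np.array([0.8, 0.1, 0.3]),
    "pig":      np.array([0.7, 0.6, 0.2]),
    "person":   np.array([0.2, 0.1, 0.9]),
    "friend":   np.array([0.3, 0.1, 0.8]),
    "filthy":   np.array([0.6, 0.7, 0.1]),
    "pleasant": np.array([0.2, 0.2, 0.9]),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def association(target, positive, negative):
    """Mean similarity to positive attributes minus mean similarity to
    negative attributes; lower scores indicate a more negative association."""
    pos = np.mean([cosine(embeddings[target], embeddings[a]) for a in positive])
    neg = np.mean([cosine(embeddings[target], embeddings[a]) for a in negative])
    return pos - neg

positive_attrs = ["friend", "pleasant"]  # hypothetical attribute sets
negative_attrs = ["filthy"]

for term in ["dog", "pig", "person"]:
    print(term, round(association(term, positive_attrs, negative_attrs), 3))
```

With real pretrained embeddings, a systematically lower score for animal terms than for human terms would be one symptom of the learned speciesist associations the abstract describes.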

Tags: ai, ai applications, animals, applications, arxiv, bias, bias in ai
