Feb. 14, 2024, 5:43 a.m. | Andrew Hundt, Julia Schuller, Severin Kacianka

cs.LG updates on arXiv.org

Machine Learning (ML) and 'Artificial Intelligence' ('AI') methods tend to replicate and amplify existing biases and prejudices, as do robots that incorporate AI. For example, robots with facial recognition have failed to identify Black women as human, while others have categorized people, such as Black men, as criminals based on appearance alone. A 'culture of modularity' means harms are perceived as 'out of scope', or someone else's responsibility, across roles throughout the 'AI supply chain'. Incidents are routine enough (incidentdatabase.ai …

