March 22, 2024, 2:54 p.m.

Mozilla Foundation Blog foundation.mozilla.org

Artificial intelligence tools can contain many of the same biases that humans do, whether in search engines, dating apps, or even job-hiring software. The problem also extends to systems with far more dire consequences, most notably the criminal justice system.

Facial recognition software is far from perfect, and we’ve seen that it performs worse for dark-skinned individuals. Combine this with law enforcement’s increasing use of face detection software, and it creates a gruesome …

