March 22, 2024, 2:54 p.m. |

Mozilla Foundation Blog foundation.mozilla.org

Artificial intelligence tools can carry many of the same biases that humans do, whether in search engines, dating apps, or even job-hiring software. The problem also extends to systems with far graver consequences, most notably the criminal justice system.

Facial recognition software is far from perfect, and it has been shown to perform worse on dark-skinned individuals. Combine this with law enforcement’s increasing use of the technology, and it creates a gruesome …


Software Engineer for AI Training Data (School Specific)

@ G2i Inc | Remote

Software Engineer for AI Training Data (Python)

@ G2i Inc | Remote

Software Engineer for AI Training Data (Tier 2)

@ G2i Inc | Remote

Data Engineer

@ Lemon.io | Remote: Europe, LATAM, Canada, UK, Asia, Oceania

Artificial Intelligence – Bioinformatic Expert

@ University of Texas Medical Branch | Galveston, TX

Lead Developer (AI)

@ Cere Network | San Francisco, US