Oct. 24, 2023, 6:57 p.m. | David Crewe

PetaPixel petapixel.com

A new tool called Nightshade, created by a team at the University of Chicago, makes imperceptible pixel-level changes to images to help creatives protect their work from AI image generators by effectively poisoning (corrupting) the generators' training data.
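The article describes the idea only at a high level. As a loose illustration (not Nightshade's actual algorithm, which carefully optimizes its perturbations rather than using random noise), the sketch below shows what a bounded, visually imperceptible pixel change to an image might look like in Python; the file names and the `epsilon` bound are hypothetical.

```python
# Illustrative sketch only: NOT the Nightshade method, just the general idea of a
# pixel-level perturbation small enough that a human viewer cannot notice it.
import numpy as np
from PIL import Image


def add_imperceptible_perturbation(path_in: str, path_out: str, epsilon: int = 2) -> None:
    """Shift each RGB channel value by at most `epsilon` intensity levels."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    # Random noise stands in here for whatever perturbation a real tool would compute.
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)


if __name__ == "__main__":
    # Hypothetical file names for the original and the perturbed copy.
    add_imperceptible_perturbation("artwork.png", "artwork_shaded.png")
```

A tool like Nightshade aims for the same visual property (changes a person cannot see) but chooses the perturbation adversarially so that models trained on the altered images learn corrupted associations.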


