Oct. 25, 2023, 2:28 p.m. | Emilia David

The Verge | www.theverge.com


Image: OpenAI


The fight against the scraping of creative work to train AI models has become more poisonous.


A new tool called Nightshade lets artists apply it to their creative work; it then corrupts, or poisons, any training data built from that art. Over time, it could damage future models behind AI art platforms like DALL-E, Stable Diffusion, and Midjourney, crippling their ability to generate images.


Nightshade adds invisible changes to pixels in a piece of digital art. When the work is ingested …
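The excerpt breaks off here, but the mechanism it names, small pixel-level changes too subtle for a viewer to notice, is easy to illustrate. The sketch below is not Nightshade's actual algorithm (the excerpt does not describe how its perturbation is computed); it only demonstrates a tightly bounded, near-invisible pixel edit. The function name, file paths, and the epsilon bound are all illustrative assumptions.

# Illustrative sketch only. Nightshade's real perturbation is optimized
# against a text-to-image model; this excerpt does not detail that
# algorithm. This demo merely shows the general idea of "invisible
# changes to pixels": an edit bounded so tightly that a viewer will not
# notice it. Function name, file paths, and epsilon are assumptions.
import numpy as np
from PIL import Image


def add_invisible_perturbation(path_in: str, path_out: str,
                               epsilon: int = 2, seed: int = 0) -> None:
    """Apply a bounded random pixel shift (|delta| <= epsilon per channel)."""
    rng = np.random.default_rng(seed)
    # Work in a wider integer type so the addition cannot wrap around.
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    delta = rng.integers(-epsilon, epsilon + 1, size=img.shape)
    poisoned = np.clip(img + delta, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)


if __name__ == "__main__":
    # Hypothetical paths: the output should look identical to the input.
    add_invisible_perturbation("artwork.png", "artwork_shaded.png")

A real poisoning tool would replace the random delta with a perturbation optimized against the target model, so the image looks unchanged to humans but reads as a different concept to the model ingesting it as training data.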
