Dec. 18, 2023, 6:50 p.m. | Dr. Tony Hoang

The Artificial Intelligence Podcast

In response to unauthorised image scraping by text-to-image generators, researchers have developed "Nightshade", a tool that helps artists subtly alter the pixels of their images to disrupt AI visual systems. By "poisoning" the training data, these perturbations cause AI models to misclassify images, yielding unpredictable results. The technique aims to discourage indiscriminate data harvesting and foster greater respect for copyright, while potentially disrupting AI-powered services.
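The core idea, adding a perturbation small enough to be invisible to humans but large enough to mislead a model trained on the image, can be sketched in a few lines. This is a toy illustration of pixel-level poisoning only, not Nightshade's actual algorithm (which computes optimized, concept-shifting perturbations rather than random noise); the function name and the `epsilon` bound are illustrative choices.

```python
import numpy as np

def poison_image(image: np.ndarray, epsilon: int = 4) -> np.ndarray:
    """Return a copy of `image` with a small random perturbation.

    Toy sketch of pixel-level data poisoning -- NOT the real Nightshade
    method, which optimizes the perturbation to shift the concept a
    model learns. `epsilon` caps the per-channel change so the edit
    stays visually imperceptible.
    """
    rng = np.random.default_rng(0)
    # Random offset in [-epsilon, +epsilon] for every pixel channel.
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    # Work in a wider integer type, then clip back to the valid 0-255 range.
    poisoned = np.clip(image.astype(np.int16) + noise, 0, 255)
    return poisoned.astype(np.uint8)

# Usage: the poisoned copy differs from the original by at most epsilon
# per channel, so it looks identical to a human viewer.
img = np.full((64, 64, 3), 128, dtype=np.uint8)
out = poison_image(img)
```

A real attack replaces the random noise with an optimization step that pushes the image's features toward a different concept, so models trained on many such images learn corrupted associations.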



---

Send in a voice message: https://podcasters.spotify.com/pod/show/tonyphoang/message
Support this podcast: https://podcasters.spotify.com/pod/show/tonyphoang/support

