Dec. 18, 2023, 6:50 p.m. | Dr. Tony Hoang

The Artificial Intelligence Podcast linktr.ee

Researchers have developed "Nightshade", a tool that lets artists subtly alter image pixels to disrupt AI visual systems, in response to unauthorised image scraping by text-to-image generators. By "poisoning" the training data, these alterations cause AI models to misclassify images, yielding unpredictable outputs. The technique aims to discourage data-harvesting practices and foster greater respect for copyright, while potentially disrupting AI-powered services.
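The core idea of data poisoning can be illustrated with a small sketch. Note that Nightshade's actual method uses a targeted optimisation to mislead specific concepts during training; the hypothetical `perturb_image` function below only shows the general principle of a small, bounded pixel perturbation that is nearly invisible to a human but changes the data a model would ingest:

```python
# Illustrative sketch only -- NOT Nightshade's algorithm. It demonstrates
# a bounded pixel perturbation: each channel value shifts by at most
# `epsilon`, keeping the change imperceptible to the human eye.
import numpy as np

def perturb_image(image: np.ndarray, epsilon: float = 4.0, seed: int = 0) -> np.ndarray:
    """Add a small bounded random perturbation to an 8-bit RGB image."""
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    # Clip back to the valid 8-bit range so the result is still a plain image
    return np.clip(image.astype(np.float64) + noise, 0, 255).astype(np.uint8)

# Example: a uniform grey 4x4 RGB image
img = np.full((4, 4, 3), 128, dtype=np.uint8)
poisoned = perturb_image(img)
# Every pixel stays within epsilon of its original value
assert np.abs(poisoned.astype(int) - img.astype(int)).max() <= 4
```

A random perturbation like this would not fool a trained model on its own; poisoning tools craft the perturbation so that, when many such images are scraped into a training set, the model learns a wrong association for the depicted concept.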



---

Send in a voice message: https://podcasters.spotify.com/pod/show/tonyphoang/message
Support this podcast: https://podcasters.spotify.com/pod/show/tonyphoang/support

