Nov. 3, 2023, 5:08 p.m. | Dorian Drost

Towards Data Science - Medium towardsdatascience.com

Confusing image-generating AI with poisoned data

Just like the high walls of a castle, Nightshade can be a way to defend intellectual properties against illegitimate use. Photo by Nabih El Boustani on Unsplash

The recent emergence of Nightshade, an algorithm for creating poisoned data that confuses image-generating AI models, has given new life to the discussion of adversarial attacks on such models. This discussion is also shaped by ethical and social considerations, as such attacks may provide …
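The core idea behind data poisoning is to pair a subtly perturbed image with a mismatched caption, so that a model trained on the sample learns a wrong image-text association. The sketch below illustrates only the bounded-perturbation concept with random noise; the actual Nightshade attack optimizes the perturbation against a target model's feature extractor, which is not shown here.

```python
import numpy as np

def poison_image(image, epsilon=8.0 / 255.0, rng=None):
    """Add a small, bounded perturbation to an image with values in [0, 1].

    NOTE: random noise is just a placeholder for illustration; Nightshade
    itself computes an optimized perturbation, not random noise.
    """
    rng = rng or np.random.default_rng(0)
    perturbation = rng.uniform(-epsilon, epsilon, size=image.shape)
    return np.clip(image + perturbation, 0.0, 1.0)

# A poisoned training sample: the perturbed image looks unchanged to a
# human but is paired with a deliberately wrong caption.
clean = np.full((4, 4, 3), 0.5)  # stand-in for a "dog" photo
poisoned = poison_image(clean)
sample = {"image": poisoned, "caption": "a photo of a cat"}  # mismatched label
```

The perturbation budget `epsilon` keeps the change visually imperceptible, which is what lets poisoned images pass unnoticed into scraped training sets.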

adversarial-attack ai models algorithm data data poisoning emergence generative-ai image life nightshade photo thoughts-and-theory
