Jan. 22, 2024, 9:29 a.m. | Leigh Mc Gowran

Silicon Republic | www.siliconrepublic.com

Nightshade is designed as an offensive tool that poisons images: an AI model that ingests enough poisoned images as training data begins to behave in unpredictable ways.
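For illustration only, here is a minimal sketch of the general idea behind image data poisoning, not Nightshade's actual algorithm (which the article does not describe). The function name `poison_image` and the `epsilon` budget are hypothetical, and the random noise below merely stands in for the carefully optimized perturbation a real attack would compute: the key point is that the pixels change subtly while the caption stays the same, so a model trained on many such pairs learns a skewed association.

```python
# Toy illustration of image data poisoning, NOT Nightshade's algorithm.
# Assumption: a poisoned sample pairs subtly perturbed pixels with an
# unchanged (now misleading) caption.
import numpy as np

def poison_image(pixels: np.ndarray, epsilon: float = 8.0,
                 seed: int = 0) -> np.ndarray:
    """Add a small, bounded perturbation to an image array.

    pixels:  HxWx3 uint8 array
    epsilon: max per-channel change, kept small so the edit is hard
             to see (a hypothetical budget; real attacks optimize the
             perturbation instead of sampling it at random)
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=pixels.shape)
    poisoned = np.clip(pixels.astype(np.float64) + noise, 0, 255)
    return poisoned.astype(np.uint8)

# A clean training pair next to a poisoned one: the caption is
# unchanged, so a model trained on enough poisoned pairs associates
# that caption with shifted pixel statistics.
image = np.zeros((64, 64, 3), dtype=np.uint8)  # stand-in artwork
clean_pair = (image, "a painting of a dog")
poisoned_pair = (poison_image(image), "a painting of a dog")
```

In a real tool the perturbation would be optimized against a target model rather than drawn at random, which is what makes the poisoning effective at scale; this sketch only shows where such a perturbation sits in the training pipeline.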


Read more: AI beware: Artists get Nightshade tool to protect their work
