Jan. 22, 2024, 9:29 a.m. | Leigh Mc Gowran

Silicon Republic | www.siliconrepublic.com

Nightshade is designed as an offensive tool: it subtly "poisons" images so that AI models begin to behave in unpredictable ways once enough poisoned images end up in their training data.
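To make the general idea concrete, here is a minimal, hypothetical sketch of this style of data poisoning: an image is perturbed within a small pixel budget so that its features (under a stand-in encoder) drift toward an unrelated "target" concept. This is not Nightshade's actual algorithm, and the random-weight encoder below is only a placeholder for a real feature extractor.

```python
# Hypothetical sketch of feature-space data poisoning, NOT Nightshade's
# actual method. The encoder is a random-weight placeholder.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder feature extractor (a real attack would target the
# image encoder used by the victim model family).
encoder = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
encoder.requires_grad_(False)

image = torch.rand(1, 3, 64, 64)       # the artwork to protect
target = torch.rand(1, 3, 64, 64)      # an image of an unrelated concept
target_feat = encoder(target)

delta = torch.zeros_like(image, requires_grad=True)
opt = torch.optim.Adam([delta], lr=0.01)
eps = 8 / 255                          # keep per-pixel changes imperceptible

for step in range(200):
    poisoned = (image + delta).clamp(0, 1)
    # Pull the poisoned image's features toward the target concept,
    # so a model trained on it learns a misleading association.
    loss = nn.functional.mse_loss(encoder(poisoned), target_feat)
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-eps, eps)        # enforce the perturbation budget

poisoned = (image + delta).detach().clamp(0, 1)
print(f"final feature loss: {loss.item():.4f}")
print(f"max pixel change:   {delta.abs().max().item():.4f}")
```

The key point the sketch illustrates is that the visible image barely changes, while its learned representation is pushed somewhere else entirely; a model that ingests many such images can absorb the mismatch.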


Read more: AI beware: Artists get Nightshade tool to protect their work

