Oct. 23, 2023, 8:33 p.m. | /u/NuseAI

Artificial Intelligence www.reddit.com

- Nightshade is a new data poisoning tool that allows artists to fight back against generative AI models.

- By adding invisible changes to the pixels in their art, artists can cause chaotic, unpredictable results in AI models that use their work without permission (see the sketch after this summary for what a pixel-level perturbation looks like).

- The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without the creators’ permission.

- Using it to “poison” this training …
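
Purely to illustrate what “invisible changes to the pixels” means, below is a minimal Python sketch that adds a small, bounded random offset to every pixel of an image before it is published. This is not Nightshade’s actual method, which reportedly optimizes perturbations against the feature space of generative models rather than using random noise; the file names and the `epsilon` bound here are assumptions for illustration only.

```python
# Conceptual sketch of an imperceptible pixel-level perturbation.
# NOT Nightshade's algorithm: random noise alone will not poison a model,
# it only shows the kind of sub-perceptual change being described.
import numpy as np
from PIL import Image


def perturb_image(path_in: str, path_out: str, epsilon: int = 2, seed: int = 0) -> None:
    """Add a tiny offset (at most `epsilon` per channel) to every pixel."""
    rng = np.random.default_rng(seed)
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    noise = rng.integers(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)


if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    perturb_image("artwork.png", "artwork_published.png")
```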

