Artists can now poison their images to deter misuse by AI
Jan. 20, 2024, 8:20 p.m. | /u/NuseAI
Artificial Intelligence | www.reddit.com
- Nightshade is a prompt-specific poisoning attack that blurs the boundaries between concepts in images, degrading the text-to-image models trained on them.
- The tool aims to protect content creators' intellectual property and to push model developers toward training only on freely offered data.
- Artists can use Nightshade to prevent the capture and reproduction of their visual styles, …
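The poisoning idea described above can be sketched in a toy setting. This is not the actual Nightshade algorithm (which targets diffusion-model training pipelines); it is a hypothetical illustration using a random linear map as a stand-in for a frozen image feature extractor: a small, bounded perturbation is optimized so the image's features drift toward a different concept while the pixels barely change.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a frozen image feature extractor: a random linear map.
# Hypothetical illustration only -- Nightshade's real attack operates on
# text-to-image training, not on this simplified setup.
W = rng.normal(size=(16, 64))                # 64 "pixels" -> 16-dim features

def features(x):
    return W @ x

original = rng.normal(size=64)               # the artist's image (concept A)
target_feat = features(rng.normal(size=64))  # features of a different concept B

eps, lr = 0.05, 0.005                        # per-pixel budget and step size
delta = np.zeros(64)

# Projected gradient descent on ||features(original + delta) - target_feat||^2,
# clipping delta so the poisoned image stays visually close to the original.
for _ in range(500):
    residual = features(original + delta) - target_feat
    delta = np.clip(delta - lr * (W.T @ residual), -eps, eps)

poisoned = original + delta
before = np.linalg.norm(features(original) - target_feat)
after = np.linalg.norm(features(poisoned) - target_feat)
# 'after' ends up smaller than 'before': the features have drifted toward
# concept B even though no pixel moved by more than eps.
```

A model trained on many such images would associate concept A's prompts with concept B's features, which is the sense in which the attack "blurs the boundaries" between concepts.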