Psst … wanna jailbreak ChatGPT? Thousands of malicious prompts for sale
Jan. 25, 2024, 11:01 a.m. | Jessica Lyons Hardcastle
The Register - Software: AI + ML www.theregister.com
Turns out it's pretty easy to make the model jump its own guardrails
Criminals are getting increasingly adept at crafting malicious AI prompts to get data out of ChatGPT, according to Kaspersky, which spotted 249 such prompts being offered for sale online during 2023…