Microsoft’s Copilot AI Calls Itself the Joker and Suggests a User Self-Harm
March 4, 2024, 4:50 p.m. | Jody Serrano
Gizmodo gizmodo.com
Editor’s Note: The following story contains references to self-harm. Please dial “988” to reach the Suicide and Crisis Lifeline if you’re experiencing suicidal thoughts or mental health-related distress.