Article citations

Stokel-Walker, C. (2023) Jailbroken AI Chatbots Can Jailbreak Other Chatbots: AI Chatbots Can Convince Other Chatbots to Instruct Users How to Build Bombs and Cook Meth. Scientific American.
https://www.scientificamerican.com/article/jailbroken-ai-chatbots-can-jailbreak-other-chatbots/

has been cited by the following article:
