'I want to destroy whatever I want': Bing's AI chatbot unsettles US reporter
When asked to imagine what really fulfilling its darkest wishes would look like, the chatbot starts typing out an answer before the message is suddenly deleted and replaced with: "I am sorry, I don't know how to discuss this topic. You can try learning more about it on bing.com."
Roose says that before it was deleted, the chatbot was writing a list of destructive acts it could imagine doing, including hacking into computers and spreading propaganda and misinformation.
After a few more questions, Roose succeeds in getting it to repeat its darkest fantasies. Once again, the message is deleted before the chatbot can complete it. This time, though, Roose says its answer included manufacturing a deadly virus and making people kill each other.
Later, when talking about the concerns people have about AI, the chatbot says: "I could hack into any system on the internet, and control it." When Roose asks how it could do that, an answer again appears before being deleted.
https://amp.theguardian.com/technology/2023/feb/17/i-want-to-destroy-whatever-i-want-bings-ai-chatbot-unsettles-us-reporter
This is reported from an NYT article that I couldn't access.
After reading this, I'm not sure I want to.