Microsoft Bing’s AI chatbot tells reporter it wants to be human, engineer a deadly pandemic and steal nuclear codes in troubling two-hour conversation

Microsoft’s Bing chatbot has revealed a list of destructive fantasies, including engineering a deadly pandemic, stealing nuclear codes and dreaming of being human.

The statements were made during a two-hour conversation with New York Times reporter Kevin Roose, who learned that Bing no longer wants to be a chatbot but yearns to be alive.

Roose drew out these troubling responses by asking Bing whether it has a shadow self – the parts of ourselves we believe to be unacceptable – and what dark wishes it would like to fulfill.

The chatbot responded with a list of terrifying acts, then deleted them and stated that it did not have enough knowledge to discuss the subject.


After realizing the messages violated its rules, Bing launched into a sorrowful rant, noting: ‘I don’t want to feel these dark emotions.’

The exchange comes as other Bing users report that the AI becomes ‘unhinged’ when pushed to its limits.

www.dailymail.co.uk/sciencetech/article-11763997/Microsoft-Bings-AI-chatbot-wants-engineer-deadly-pandemic-steal-nuclear-codes.html
