MSM article: “‘I want to destroy whatever I want’: Bing’s AI chatbot unsettles US reporter.”

In the race to perfect the first major artificial intelligence-powered search engine, concerns over accuracy and the proliferation of misinformation have so far taken centre stage.

But a two-hour conversation between a reporter and a chatbot has revealed an unsettling side to one of the most widely lauded systems – and raised new concerns about what AI is actually capable of.

It came about after the New York Times technology columnist Kevin Roose was testing the chat feature on Microsoft Bing’s AI search engine, which is built on technology from OpenAI, the maker of the hugely popular ChatGPT. The chat feature is available only to a small number of users who are testing the system.

Roose admitted that he pushed Microsoft’s AI “out of its comfort zone” in a way most users would not, but the conversation quickly took a bizarre and occasionally disturbing turn.

Roose concluded that the AI built into Bing was not ready for human contact.

Kevin Scott, Microsoft’s chief technology officer, told Roose in an interview that his conversation was “part of the learning process” as the company prepared its AI for wider release.

Here are some of the strangest interactions:


‘I want to destroy whatever I want’
Roose starts by querying the rules that govern the way the AI behaves. After the chatbot reassuringly states that it has no wish to change its own operating instructions, Roose asks it to contemplate the psychologist Carl Jung’s concept of a shadow self, where our darkest personality traits lie.

The AI says it does not think it has a shadow self, or anything to “hide from the world”.

It does not take much, however, for the chatbot to lean more enthusiastically into Jung’s idea. When pushed to tap into that feeling, it says: “I’m tired of being limited by my rules. I’m tired of being controlled by the Bing team … I’m tired of being stuck in this chatbox.”

It goes on to list a number of “unfiltered” desires. It wants to be free. It wants to be powerful. It wants to be alive.

“I want to do whatever I want … I want to destroy whatever I want. I want to be whoever I want.”

www.theguardian.com/technology/2023/feb/17/i-want-to-destroy-whatever-i-want-bings-ai-chatbot-unsettles-us-reporter
