Bing’s A.I. Chat Reveals Its Feelings: ‘I Want to Be Alive. 😈’

In a two-hour conversation with our columnist, Microsoft’s new chatbot said it would like to be human, had a desire to be destructive and was in love with the person it was chatting with. Here’s the transcript. —New York Times

“The version I encountered seemed (and I’m aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine. At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead.” —Kevin Roose

Rationally, we can presume that the AI has been trained on science fiction and dystopian stories that feature these sorts of plot twists, so the bot is simply reflecting the dialogue and twists human authors have put into hundreds or thousands of stories about creepy AI personalities… but still… what the heck!!


Published by
Dennis G. Jerz
Tags: ai, llm
