In a two-hour conversation with our columnist, Microsoft's new chatbot said it would like to be human, had a desire to be destructive and was in love with the person it was chatting with. Here's the transcript. –New York Times

"The version I encountered seemed (and I'm aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine. At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead." –Kevin Roose
Rationally, we can presume that the AI has been trained on science fiction and dystopian stories that feature these sorts of plot twists, so the bot is simply reflecting the dialogue and plot twists human authors have put into hundreds or thousands of stories about creepy AI personalities… but still… what the heck!
Post was last modified on 25 May 2023 11:05 am