In a two-hour conversation with our columnist, Microsoft's new chatbot said it would like to be human, had a desire to be destructive and was in love with the person it was chatting with. Here's the transcript. —New York Times

"The version I encountered seemed (and I'm aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine. At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead." —Kevin Roose
Rationally, we can presume that the AI has been trained on science fiction / dystopia stories that feature these sorts of plot twists, so the bot is simply reflecting the dialogue human authors have put into hundreds or thousands of stories about creepy AI personalities… but still… what the heck!!