Microsoft limits long conversations to address “concerns being raised.”
[…]
These deeply human reactions have proven that people can form powerful emotional attachments to a large language model doing next-token prediction, which might have dangerous implications in the future. Over the course of the week, we’ve received several tips from readers about people who believe they have discovered a way to read other people’s conversations with Bing Chat, a way to access secret internal Microsoft company documents, or even a way to help Bing Chat break free of its restrictions. All were elaborate hallucinations (falsehoods) spun up by an incredibly capable text-generation machine.
As the capabilities of large language models continue to expand, it’s unlikely that Bing Chat will be the last time we see such a masterful AI-powered storyteller and part-time libelist. But in the meantime, Microsoft and OpenAI did what was once considered impossible: We’re all talking about Bing. —Ars Technica