Microsoft “lobotomized” AI-powered Bing Chat, and its fans aren’t happy

Microsoft limits long conversations to address “concerns being raised.”

[…]

These deeply human reactions have proven that people can form powerful emotional attachments to a large language model doing next-token prediction. That might have dangerous implications in the future. Over the course of the week, we’ve received several tips from readers about people who believe they have discovered a way to read other people’s conversations with Bing Chat, a way to access secret internal Microsoft company documents, or even a way to help Bing Chat break free of its restrictions. All were elaborate hallucinations (falsehoods) spun up by an incredibly capable text-generation machine.

As the capabilities of large language models continue to expand, it’s unlikely that Bing Chat will be the last time we see such a masterful AI-powered storyteller and part-time libelist. But in the meantime, Microsoft and OpenAI achieved what was once considered impossible: We’re all talking about Bing. —Ars Technica

Published by Dennis G. Jerz