Think of ChatGPT as a blurry jpeg of all the text on the Web. It retains much of the information on the Web, in the same way that a jpeg retains much of the information of a higher-resolution image, but, if you’re looking for an exact sequence of bits, you won’t find it; all you will ever get is an approximation. But, because the approximation is presented in the form of grammatical text, which ChatGPT excels at creating, it’s usually acceptable. You’re still looking at a blurry jpeg, but the blurriness occurs in a way that doesn’t make the picture as a whole look less sharp.
This analogy to lossy compression is not just a way to understand ChatGPT’s facility at repackaging information found on the Web by using different words. It’s also a way to understand the “hallucinations,” or nonsensical answers to factual questions, to which large language models such as ChatGPT are all too prone. —New Yorker
Post was last modified on 13 Jun 2023 10:23 pm