The top suggestion for a Holocaust search no longer points to a denier’s website, but Google’s search algorithm still makes some truly awful suggestions. (I’ve turned off the suggested search feature on my browser.)
The top result for “Black lives matter is a hate group,” for instance, leads to a page from the Southern Poverty Law Center that explains why it does not consider Black Lives Matter a hate group. That’s not always the case, however. “Hitler is my hero” dredges up headlines like “10 Reasons Why Hitler Was One of the Good Guys,” one of many pages Cadwalladr pointed out more than a year ago.

These autocomplete suggestions aren’t hard-coded by Google. They’re the result of Google’s algorithmic scans of the entire world of content on the internet and its assessment of what, specifically, people want to know when they search for a generic term. “We offer suggestions based on what other users have searched for,” Gingras said at Thursday’s hearing. “It’s a live and vibrant corpus that changes every day.” Often, apparently, for the worse. —Wired
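The quote gets at the core of the problem: nothing here is hand-curated, and suggestions are ranked from whatever other people have been searching for. As a rough illustration only (not Google’s actual system, and with an invented query log), a minimal frequency-ranked prefix lookup works something like this:

```python
from collections import Counter

# Toy query log standing in for the "live and vibrant corpus";
# in reality this is an enormous, constantly updating stream.
query_log = [
    "weather today",
    "weather tomorrow",
    "weather today",
    "west end shows",
]

counts = Counter(query_log)

def suggest(prefix: str, k: int = 3) -> list[str]:
    """Return the k most frequently searched queries that start with prefix."""
    matches = [(q, n) for q, n in counts.items() if q.startswith(prefix)]
    matches.sort(key=lambda item: item[1], reverse=True)
    return [q for q, _ in matches[:k]]

print(suggest("we"))  # ['weather today', 'weather tomorrow', 'west end shows']
```

The point of the sketch is only that whatever dominates the log dominates the suggestions, which is why a corpus that “changes every day” can just as easily change for the worse.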