Bing’s AI chatbot helped me solve a technical problem, then showed me manatees, then denied showing me manatees

I was searching for how to change the default font in the WordPress editor, but the answers I found were overwhelmingly about changing what readers see in published posts, which I already know how to do.

Bing’s response was right on target. It probably helped that I knew the specific name of WordPress’s internal editor (TinyMCE). I’m still not sure why Bing also showed me some pictures of manatees, which it promptly denied doing.
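For readers curious what that kind of answer looks like: this is a minimal sketch of the standard WordPress approach, not Bing’s exact output. It uses WordPress’s real `add_editor_style()` function to load a stylesheet inside the TinyMCE editing window; the function name `mytheme_editor_font` and the filename `editor-style.css` are my own placeholder choices.

```php
<?php
// In your theme's functions.php — a sketch, not Bing's generated code.
// add_editor_style() loads a stylesheet inside the TinyMCE editor iframe,
// so it changes only what authors see while writing, not published posts.
function mytheme_editor_font() {
    add_editor_style( 'editor-style.css' );
}
add_action( 'after_setup_theme', 'mytheme_editor_font' );
```

The companion `editor-style.css` file in the theme directory then sets the font, e.g. `body#tinymce { font-family: Georgia, serif; }`. Because the stylesheet is scoped to the editor, the published site’s typography is untouched.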

It turns out the code Bing generated for me did let me change the default font in the editor window. But students whose goal is to get points for submitting “the correct answer” won’t be able to check Bing’s responses so easily. When I ask students to do research, I am not asking them to spit back a “correct answer” someone else has already published; I am asking them to make original connections between diverse sources in order to defend their own ideas.

The more students rely uncritically on bots to frame and develop their thinking, the less intellectually equipped they will be to spot a worthless response — with or without unexplained manatees.
