“To begin our hoax scenario, we intended to build participants’ trust in the machine by pretending that it could decode their preferences and attitudes,” the study authors wrote. “The system included a sham MRI scanner and an EEG system that supposedly used neural decoding driven by artificial intelligence (AI).”
[…]
In other words, participants were made to believe that, using advanced neuroscience, the machine could tell them what they thought, not just how they thought. And according to the study, participants ate the results right up, convinced that the machine knew them better than they knew themselves.
“As the machine seemingly inferred participants’ preferences and attitudes, many expressed amazement by laughing or calling it ‘cool’ or ‘interesting.’ For example, one asked, ‘Whoa, the machine inferred this? … Oh my god how do you do that? Can we do more of these?’
“None of the participants,” they continued, “voiced any suspicion about the mind-reading abilities of the machine throughout the study.” —Futurism.com
People Thought an AI Was Brilliantly Analyzing Their Personalities, But It Was Actually Giving Out Feedback Randomly
This is what the techbros are excited about? Really?