“To begin our hoax scenario, we intended to build participants’ trust in the machine by pretending that it could decode their preferences and attitudes,” the study authors wrote. “The system included a sham MRI scanner and an EEG system that supposedly used neural decoding driven by artificial intelligence (AI).”
[…]
In other words, participants were made to believe that using advanced neuroscience, the machine could tell them what they thought, not just how they thought. And according to the study, participants ate the results right up, convinced that the machine knew them better than they knew themselves.
“As the machine seemingly inferred participants’ preferences and attitudes, many expressed amazement by laughing or calling it ‘cool’ or ‘interesting.’ For example, one asked, ‘Whoa, the machine inferred this? … Oh my god how do you do that? Can we do more of these?’
“None of the participants,” they continued, “voiced any suspicion about the mind-reading abilities of the machine throughout the study.” —Futurism.com
People Thought an AI Was Brilliantly Analyzing Their Personalities, But It Was Actually Giving Out Feedback Randomly