In real life, people express a given emotion with tremendous variability. In anger, for example, people in urban cultures scowl (or make some of the facial movements for a scowl) only about 35 percent of the time, according to meta-analyses of studies measuring facial movement during emotion. Scowls are also not specific to anger because people scowl for other reasons, such as when they are concentrating or when they have gas. The same tremendous variation occurs for every emotion studied—and for every other measure that purportedly tells us about someone’s emotional state, whether it’s their physiology, voice or brain activity. Emotion AI systems, therefore, do not detect emotions. They detect physical signals, such as facial muscle movements, not the psychological meaning of those signals. The conflation of movement and meaning is deeply embedded in Western culture and in science. —Scientific American