In real life, people express a given emotion with tremendous variability. In anger, for example, people in urban cultures scowl (or make some of the facial movements for a scowl) only about 35 percent of the time, according to meta-analyses of studies measuring facial movement during emotion. Scowls are also not specific to anger because people scowl for other reasons, such as when they are concentrating or when they have gas. The same tremendous variation occurs for every emotion studied—and for every other measure that purportedly tells us about someone’s emotional state, whether it’s their physiology, voice or brain activity. Emotion AI systems, therefore, do not detect emotions. They detect physical signals, such as facial muscle movements, not the psychological meaning of those signals. The conflation of movement and meaning is deeply embedded in Western culture and in science. —Scientific American
This is manageable. Far better than some semesters.
Creating textures for background buildings in a medieval theater simulation project. I can always improve…
Nothing in this stack is pressing, but it does include rough drafts of final papers,…
Here’s the underlying problem. We have an operating image of thought, an understanding of what…
Representing the Humanities at Accepted Students Day.
The daughter opens another show. This weekend only.