Being out of sorts isn’t really the same thing as feeling discombobulated, or as the emotion of ilinx. Ilinx is a specific feeling of exhilarated disorientation that comes from being spun around and around until all sense of normal balance is lost. Discombobulation is the emotion of being figuratively bent out of shape. Feeling out of sorts, on the other hand, is an uneasy sensation that ordinary categories have lost their relevance.
When people feel out of sorts, it seems to them as if things that had been comfortably sorted, each given its proper place, have begun to flow out of their assigned roles. Perhaps parts of their bodies aren’t providing the feelings they are supposed to, but it could just as easily be people in their lives who aren’t following the neat and easy scripts they’re expected to follow. Even more fundamentally, the concepts according to which people and things had been sorted may have begun to dissolve into incoherence.
Digital engineers may feel out of sorts when they encounter the ambiguous realm of emotion, in which things cannot be easily sorted into binary categories of on and off, good and bad, yes and no. Perhaps that’s why they’ve tried to develop Emotion AI – a kind of machine learning technology that attempts to detect and predict human emotion. Emotion AI doesn’t actually work, and the outdated scientific theories upon which it is founded have been thoroughly debunked.
Even as evidence that Emotion AI is junk science piles up, Silicon Valley hucksters keep on coming up with new Emotion AI schemes, and tech reporters eagerly write stories about the unproven devices. The reason for this persistence has more to do with ideology than honest psychology. Tech culture is predicated upon faith that absolutely everything can be operationalized into an algorithm and then automated for optimum efficiency – never mind that many of the most important experiences in life have nothing at all to do with predictability or efficiency.
There are two ways to react to the cognitive dissonance that makes us feel out of sorts:

1) We can develop new ways of sorting out our place in the world, and learn to sort some things into a category of things that can’t be definitively sorted, or

2) We can reject reality, insisting that our systems of categorization are more real than the things they’re supposed to describe, and double down, developing new tricks for making it seem as if things actually match our theories of how the world ought to be.
The architects of Emotion AI appear to have committed themselves to the second option. They may feel that they have no choice. Sors (in its accusative form, sortem), the Latin word from which the concept of sorting derives, referred not only to categorization but also to a kind of prophecy, a belief in a pre-ordained destiny for the world.
If believers in the Emotion AI ideology were to admit that their technologies fail to accurately and adequately describe the experience of human emotion, they would have to abandon their faith in the prophecy that soon everything will be digitized, and the world will encounter an artificial intelligence End Times they call the Singularity.