
Certainty

Emotion is a tricky thing. It could be said that if you’re talking about something that’s certain, it isn’t an emotion. But then uncertainty has its revenge, because certainty itself is an emotion.

Certainty doesn’t feel like an emotion. It feels like a rational decision, a logical conclusion. If we were living in a simple world where physics and chemistry were all that mattered, certainty might be acceptable as a merely logical condition of absolutely firm knowledge… but then, there would be no one in such a stark world to have knowledge.

Certainty is the emotional condition of feeling that one has done enough rigorous thinking to overcome ambiguity. Certainty isn’t just a calculation. It feels good to those who have it.

To have certainty feels like standing on solid ground. It feels secure and absolute.

The trouble is that certainty is often false. It’s an emotion we feel because we are desperate for it. Ironically, we feel certainty in an attempt to cope with the reality that we lack the grounds to be truly sure about what we’re doing.

So it is that people who feel a particularly strong need for certainty are disturbed by the inherent ambiguity of emotion. As a result, they attempt to engineer systems of analysis through which emotions are transformed from fluid feelings into certain objects of study.

Their hearts are in the right place. They really believe that they’re helping people “deal with” emotions by cramming subjective human experience down into a framework into which it doesn’t really fit. Ultimately, though, these projects to make emotion certain miss the mark and lead people down a prickly path of false certainty.

The many Emotion AI services sold by Silicon Valley companies fall into this category. So does the Periodic Table of Emotions project by Aidan Moesby.

Moesby’s aim is an admirable one: To promote emotional granularity, the awareness of a large number of distinct emotions. The problem is with the form that Moesby’s project takes: A periodic table that begins with Happy, assigned an emotional number of 1, and reaches Euphoric, which has an emotional number of 133.

Why are the emotions given these numbers? Do they correspond to a difference in emotional weight, so that Happy is lightweight compared to the heavy Euphoric feeling?

Moesby’s table puts each emotion into its own box, declaring it to be an absolute thing, fundamental and distinct from every other emotion. That’s not how emotions are actually experienced. They’ve got fuzzy edges. They bleed into each other. Happiness and euphoria share some of the same characteristics, even though they’re not exactly the same. Emotions are blurry.

Despite what companies of false certainty like Affectiva would have us believe, emotions can’t simply be added to one another to form predictable emotional compounds. They can’t be reduced to mathematical formulas of heart rate + skin conductivity + tone of voice + the shape of the mouth.

Anyone who tries to convince you that they can measure and predict emotion with mathematical certainty has an emotional aversion to uncertainty. They might as well be trying to sell you sunshine in a bottle. They’re sure they saw it glimmer within the glass just a minute ago.

These people are hoping that you’ll fall for something known as the McNamara fallacy: The presumption that because something can be measured, it must be more important than the things that cannot be measured. This fallacy is named after Robert McNamara, the US Secretary of Defense who believed that he could win the Vietnam War by carefully measuring and managing metrics such as soldier enlistment and enemy casualties.

Faced with the terrifying unpredictability of war, McNamara clung to the feeling of certainty for relief. He measured his way to defeat.