The Plight of the Intelligently Stupid
Dude, where's my car?
When I was younger, my mother used to switch off the car when we stopped at red lights. Why would she do that? It seemed like a waste. I was told to hush and not worry. Google wasn’t nearly as ubiquitous in my childhood, and I couldn’t point out that repeatedly restarting the engine guzzles a comparable amount of fuel while also increasing wear and tear, because I honestly didn’t know that myself.
This informed laziness was present everywhere: accepting information as fact without double-checking. WhatsApp forwards, the advent of fake news. People would rather take the path of least resistance, and skimming headlines seemed to be enough to form half-baked opinions for the day’s water-cooler talk. These were smart people I learned from, looked up to, and emulated. News moved on fast enough that challenging deliberate disinformation today wouldn’t even matter tomorrow. Was I unnecessarily spending time on something that didn’t matter in the grand scheme of things? Was it more economical to just not care? It took years before I cultivated the inquisitiveness (or asked the right questions), and a decade before I pushed back (without coming off as pretentious).
In this unnecessarily long writeup, I’m going to attempt to tackle the factors that influence ignorance in decision-making through multiple lenses, and why overthinking and over-analyzing only works 60% of the time, every time.
The Piano Dilemma ⚓
You’re out shopping for a piano and enter the first Steinway showroom, hoping for a discount. You scoff at the price and walk to the nearby Yamaha store, which has significantly cheaper stock, and end up purchasing one outright. Interestingly enough, neither company has historically offered sales on its products. Still, the pricier option acted as an anchor for the upper limit, which erroneously made you believe you were ‘saving money.’ No comparisons were made, no research was done; just a sense of unwarranted accomplishment for picking the ‘better choice.’
‘People have a tendency to rely on the first piece of information they receive.’
This haphazard analogy illustrates the psychological foundation of Anchoring Bias. But is it even possible to make all these comparisons? Playing devil’s advocate, we can defend the shortcut by adding the constraints that explain our rationale:
rationality is limited,
by the tractability of the decision problem,
the cognitive limitations of the mind,
and the time available to make the decision.
Wikipedia’s definition of Bounded Rationality aside, ‘decision-makers, in this view, act as satisficers, seeking a satisfactory solution rather than an optimal one.’ We are incapable of absorbing all the details present (information overload), and the solution that is easier to control and influence will, of course, pull you towards it.
Humans are inherently limited in the number of permutations and combinations they can solve for. In the case of the piano purchase, beyond perhaps weighing the grade, size, cost, and tone of the sound, adding even one more feature becomes taxing before you mentally break down. And doing all of this in finite time? Good luck.
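A back-of-the-envelope sketch makes the blow-up concrete. The feature names and option counts below are hypothetical, chosen only to show how independent features multiply into an option space no shopper can exhaustively rank:

```python
# Each independent feature multiplies the number of candidate pianos;
# ranking them all would mean comparing every pair.
from math import comb

features = {
    "grade": 3,      # hypothetical: entry / mid / concert
    "size": 4,       # spinet, console, baby grand, grand
    "cost_band": 5,
    "tone": 3,
}

configurations = 1
for count in features.values():
    configurations *= count        # 3 * 4 * 5 * 3 = 180

pairwise = comb(configurations, 2)  # comparisons to rank them all

print(configurations, pairwise)     # 180 configurations, 16110 comparisons
```

Four modest features already demand over sixteen thousand pairwise comparisons; a fifth feature with even three options would triple that. Satisficing starts to look less like laziness and more like arithmetic.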
Defaulting to comfort will always garner more sympathy than a rational, well-thought-out approach, which invites scrutiny and demands effort. Unfortunately, this is the very epigenetic source code that limits us as a species. Where did it all go wrong?
vṛṣabha / كيوثاء / 🐮
I was five when my grandmother mentioned off-hand that I should be cognizant of my actions because Earth (yes, our Earth) rests on top of a bull’s horn, and any disturbance will throw my life off-course. I revisited this statement, and her usual read — The Vedas — didn’t mention this anywhere. The closest reference to this bull theory derives from traditional Islamic cosmology, where various animals stand on each other and the breath of the animal controls the tides. How could I possibly go about correcting my seventy-two-year-old grandmother without getting cut off?
Another unfortunate byproduct of this cognitive bias is a person’s tendency to regard their own choice as the right one. Combined with decades’ worth of misinformation from elders and sparse access to education, human beings side with metaphysical and ethereal answers to life’s questions. Even with these heuristic crutches, people can combat the comfort of predictability by looking inwards. We have a dutiful tendency to take our elders’ word as fact — as my grandmother respectfully did for hers. But as you reach adolescence, you create memories and pull from your own experiences, which add new facets to your outlook.
‘The more I learn, the more I realize how much I don't know.’ ― Albert Einstein
Those ideologies that you so quickly adopted when you were younger begin to seem less reliable. Universally accepted truths from the past turned out to be not so true.
This silver lining — the ‘maybe that’s true, I simply don’t know’ approach — replaces the knee-jerk, know-it-all reactions to everything. Or at least replaces intelligent ignorance with a more systematic, careful approach. Is this also pervasive in finance?
Stock Market & Nazis
During the financial crisis, bankers knew homeowners couldn’t afford multiple sub-prime mortgages and that they would most probably (read: definitely) lose their cushy jobs once this bubble burst, but why didn’t they stop? Two factors:
The lucrative nature of the business,
The market itself providing price signals, to which they responded accordingly.
If your colleague was pulling in double-digit returns with a guaranteed strategy, would you ever take the road less traveled?
‘ “Rational Irrationality” asks us to ignore the repercussions of our behaviors. We can rationalize short term gains at the expense of long term losses because we need to obtain quarterly profits regardless.’ — Barry Ritholtz, via John Cassidy, New Yorker.
Putting a pin in economics, we turn to Hannah Arendt im Gespräch mit Joachim Fest (‘Hannah Arendt in Conversation with Joachim Fest,’ 1964), one of the last interviews given by the controversial philosopher. In the conversation, Arendt recounts how the highly decorated German soldier Ernst Jünger came across a farmer who had taken in Russian POWs straight from the camps; naturally, they were utterly starving. The farmer remarked:
‘Well, they’re subhuman, just like cattle — look how they devour food like cattle.’
Why wouldn’t they be eating in haste? It may very well have been their last meal. Jünger — previously regarded as a radical nationalist but in reality a conservative — comments on this story: ‘It’s sometimes as if the German people were being possessed by the Devil.’ There was simply a reluctance ever to imagine what the prisoners were experiencing. Arendt developed the idea that:
‘ “evil” is often the result of lack of thought, meaning that people usually don't want to do evil, and certainly don't think of themselves as evildoers. But they also tend to follow the general opinion without critical analysis, and indeed they are often convinced that they are doing a good job.’
People today might have more mundane and less destructive examples in their lives, but the differences stop right there. Both are invested in an ideology, employing whatever intellectual dishonesty is necessary to protect their self-images. How dangerous can this complacency get?
In Politics & Healthcare
There are three types of people: the uninformed, the informed, and those who believe they are ‘informed’ because they read one partisan news source and refuse to engage with the other side. Refusing to engage with any information that challenges one’s preconceived notions is an atrocity in and of itself.
I saw this first-hand with Trump Tracker — a neutral site I built that tracks the 45th President’s verbal and written promises and attempts to hold him accountable. I analyzed the 1,100+ tweets that shared the link and followed the breadcrumbs.
Of those 1,100+ tweets, over 91% came from Democratic, left-leaning users leveraging the broken promises as a way to neg the President.
I peeked at Google Analytics and noticed another discrepancy: a high bounce rate, which is atypical for multi-page apps (MPAs). Visitors were barely reading the promises, let alone engaging in conversation or challenging the norm; they were unabashedly posting the link straight to social media.
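The tally behind a figure like that is simple. Here is a minimal sketch, not the actual Trump Tracker pipeline: the data structure, labels, and stand-in numbers are my assumptions, illustrating only the shape of the computation:

```python
# Hypothetical sketch: given tweets labeled with an inferred political
# leaning, compute what share came from left-leaning accounts.
from collections import Counter

# Stand-in data; the real dataset had 1,100+ tweets.
tweets = [{"leaning": "left"}] * 10 + [{"leaning": "right"}] * 1

counts = Counter(t["leaning"] for t in tweets)
left_share = counts["left"] / len(tweets)

print(f"{left_share:.0%} of tweets came from left-leaning users")  # 91%
```

The interesting work, of course, is in labeling each account’s leaning in the first place; the arithmetic afterwards is a one-liner.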
There is clear empirical evidence that the US healthcare system costs more and produces worse outcomes than the government-funded systems of other developed nations. Yet conservatives refuse to entertain any bill that lets the government intervene in healthcare at scale, despite generally supporting more accessible and equitable healthcare outcomes.
Now, of course, there are many areas where the left-leaning are too naively optimistic, but when it comes to economic issues, conservative ideology prevents many from examining contrary evidence without bias. They start with the solution (the free market fixes everything) and immediately discount any other proposed solution.
On One’s Opinion
Bias in one’s opinion comes in many flavors and forms, ranging from cognitive dissonance to the complete disregard of other people’s opinions based on your belief system. In their paper When in Doubt, Shout!: Paradoxical Influences of Doubt on Proselytizing, David Gal and Derek D. Rucker asked students to write about their views on animal testing for consumer goods — the catch being that only half of them were allowed to use their dominant hand.
When asked later, participants who had used their non-dominant hand were less confident in their views; however, they were more likely to try to persuade someone else of their opinions.
Humans have an innate tendency to double down when cornered.
This effect is stronger if:
someone’s identity is threatened,
the belief is important to them,
they think that others will listen.
In Personal Health
I choose to smoke in social situations after drinking. I know the effect it has on my body and how disastrous it may be in 20 years, but that instant gratification, the hit, is something I value far more than a future problem. I stupidly rationalize and fall back on the quitting-smoking timeline:
After 20 minutes, heart rate and blood pressure drop to normal levels
After 12 hours, carbon monoxide levels return to normal
After 48 hours, nerve endings begin to regenerate
After 2 weeks, lung function improves
After a month, coughing and shortness of breath decrease
After a year, the risk of coronary heart disease is half that of a smoker’s
I can go on, but my irrational rationality kicks in: why would I want to deal with nicotine withdrawals — craving, restlessness, irritability, headaches — when I can instead absolve myself with an immediate head high?
Another thread is the vaccination argument: why hordes of people fall prey to anti-vaccination rhetoric with complete disregard for proven scientific methods. Case in point: Steve Jobs used alternative medicine for a treatable cancer.
‘I know that most men, including those at ease with problems of the greatest complexity, can seldom accept even the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have delighted in explaining to colleagues, which they have proudly taught to others, and which they have woven, thread by thread, into the fabric of their lives.’ — Tolstoy
When did we first happen upon this gross negligence and complete oversight? When did the renowned philosophers of yesteryear figure out that humans are inherently flawed by their own biases? Tying this all together, you eventually arrive at Plato.
Allegory of the Cave
I’d like to paint a picture: there are prisoners in a cave dimly lit by a fire. The fire is directly behind them, and their only view is the wall they are facing. A raised walkway separates the prisoners from the fire, and puppeteers perform along it, using their hands and props to cast shadows on the wall. Since the prisoners can’t turn their heads to see the source of these shadows, their only source of truth is the silhouettes themselves. They have no desire to leave, for they know no better life. The one prisoner who manages to break his bonds discovers that reality is not what he thought it was — the disorienting ascent towards sunlight (or knowledge).
Plato created a thought experiment where the only world the prisoners knew was encapsulated inside a cave. Shadows and the echoes of unseen objects — defined in philosophy as the Level of Mind/Psyche — formed their manufactured reality. If the chained ever managed to escape and experience the world around them, it would be like nothing they could have ever known. More than a black swan event, their freedom — the emancipation from psychological shackles — marks a black swan life, since they had no hindsight whatsoever. Even if the freed prisoner descends back into the cave, walks past the conniving puppeteers, and describes to the enslaved inmates the wonders of the sun and the trees, the inmates respond with backlash, dismiss the enlightened one as deluded, and laugh him right back out. This false belief was prevalent in Copernicus’ time, when the sun allegedly revolved around the Earth, and again with Galileo Galilei. A similar trend arises whenever you go against the grain and question herd mentality.
The man who has left the cave annoys the great beast. Cf. Stendhal: ‘All good reasoning causes offense.’
Intelligence by its very nature offends, and in this situation, thinking annoys the captive. Cognitive dissonance triggers a fight-or-flight response, and the refusal to understand is the semantic illusion — the glue — that holds their world together. Your brain is subconsciously telling you that taking on this new perspective might harm you. How can we combat this?
All of these examples cropped up in personal experiences, readings, and my albeit recent philosophical endeavors. I didn’t have the answers when I was sitting at the red light, and I honestly don’t have the answers now. What I do have are some of the ways I’ve tried to reduce this prejudice.
We assume everyone else is more susceptible to lapses in judgment, a tendency known as the ‘bias blind spot.’ A study in the Journal of Personality and Social Psychology, led by Richard West at James Madison University and Keith Stanovich at the University of Toronto, found evidence
‘indicating that more cognitively sophisticated participants showed larger bias blind spots.’
Even if you consider yourself intelligent, you’re not exactly out of the woods. Rationalizing a decision or an action blinds a person to how it is perceived by others. If you choose to rationalize your actions and thoughts without questioning them beforehand, you are willfully ignorant.
‘People who were aware of their own biases were not better able to overcome them.’
In ‘Thinking, Fast and Slow,’ Nobel laureate Daniel Kahneman admits that his decades of groundbreaking research have failed to significantly improve his own mental awareness.
‘My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy — a tendency to underestimate how long it will take to complete a task — as it was before I made a study of these issues.’
So what do you do? Reduce risk and exposure.
Attentiveness can counter most of the biases mentioned above. I’m not saying one should overthink every single situation, but taking a step back and looking at the current problem from a macro-level perspective can serve you well. Form your own opinions, invert them, and always challenge them. Charlie Munger, a staunch proponent of inversion, says it best —
‘… it is in the nature of things that many hard problems are best solved when they are addressed backwards.’
There is also no reason to ever settle for an adequate answer — explore all options and routes and traverse them to the best of your abilities. Even if your decision doesn’t change, the dopamine release will be well worth it.
Stop defaulting to comfort. I won’t even get into the plight of people who ‘feel’ when they talk — an easy way to dissociate oneself from a formed opinion by leaning on emotional cover. In ‘Stop Saying “I Feel Like,”’ Molly Worthen writes,
‘This linguistic hedging is particularly common at universities, where calls for trigger warnings and safe spaces may have eroded students’ inclination to assert or argue.
… We should argue rationally, feel deeply, and take full responsibility for our interaction with the world.’
As was the case with the bankers and the Nazis, this reluctance to think (whether while raking in gains or profiteering from war) is also precisely how and why intelligent people come to hold destructive and invalid ideas — simply because they have a stake in them. Ending on an aphorism:
Never attribute to malice that which is adequately explained by stupidity.
I have Google now, mother, and can finally retort: the money you save by switching off the car for a few moments at the stoplight is negligible compared to keeping the engine properly tuned, and for stops below a certain length, you may even burn more gas restarting the car than you saved.
This also doesn’t mean she is going to change, but that’s alright.