Fine-tuning your “baloney detection” meter

by Mawer Investment Management, via The Art of Boring Blog

Increasing our ability to detect falsehoods can make us better decision makers.

Donald Trump leans into the microphone. “I will be the greatest jobs president that God ever created,” Trump proclaims. “I’ll bring back our jobs from China, from Mexico, from Japan, from so many places. I’ll bring back our jobs and I’ll bring back our money.”

Wearing a blue power suit and red tie, the billionaire real estate developer looks and sounds every bit the outlandish megalomaniac we think him to be. Thunderous applause erupts. “We need you!” a man yells from above. Trump smiles like he just closed a deal. Elsewhere, political economist David Ricardo rolls over in his grave.

Like the Roman and Chinese emperors of old, Trump seems to believe he has omnipotent powers. He makes grandiose promises of saving America from its many supposed foes, all the while bidding the public to trust him—not because he has a plan, but because he is Donald Trump and a billionaire. He implies that he is the only person who can make the deals that will brighten America’s economic future.

Most people who listen to Trump know better than to take him at his word; however, Trump’s popularity and early lead in the polls suggest that some individuals have faith in his promises. This leaves us wondering: when an authority figure makes a claim, how can we separate fact from fiction?

We live in a world full of “baloney.” We are bombarded by marketing claims that may or may not reflect the truth; we listen to political candidates make ostentatious claims about how they will turn countries around; and, as investors, we listen to management teams outline their grandiose plans for growth.

But detecting baloney is neither easy nor obvious—particularly when some of that baloney is our own. A carefully honed “Baloney Detection Meter”—a framework to increase our ability to detect falsehoods—can help us uncover the fallacies in our lives, make more effective decisions, and avoid unnecessary challenges.

The Enemy Within

Sasha was very young when she first learned about death. The little girl was sitting down for a family dinner when she asked her father about his parents. She knew her maternal grandparents intimately, but why hadn’t she met his parents?

“Because they died,” he answered wistfully.

Sasha asked, “Will you ever see them again?”

Sasha remembers her father weighing his answer carefully:

“...he said that there was nothing he would like more in the world than to see his mother and father again, but that he had no reason—and no evidence—to support the idea of an afterlife, so he couldn’t give in to the temptation.”

“Why?”

“...He then told me, very tenderly, that it can be dangerous to believe things just because you want them to be true. You can get tricked if you don’t question yourself and others, especially people in a position of authority. He told me that anything that’s truly real can stand up to scrutiny.”1

Sasha's father was the late Carl Sagan, a cosmologist and science commentator whose legendary curiosity and prolific writings inspired millions. His work shone a light of objectivity and scientific reasoning on a world he saw dominated by pseudoscience and mysticism. Indeed, it was his work in The Demon-Haunted World: Science as a Candle in the Dark that provided the inspiration for our team’s own “Baloney Detection Meter.”

It is worthwhile to reflect on Sasha’s interaction with her father and what it teaches. It illustrates that we must be aware of how we can get in our own way when trying to spot baloney. As Sagan warned his daughter, it can be difficult to see the truth when it conflicts with what we want it to be.

We are often our own worst enemies when it comes to spotting untruths. For example, consider the large number of people who believed that cyclist Lance Armstrong raced clean even as evidence mounted that he had doped. Until his ultimate confession on Oprah, no one knew with certainty whether Armstrong had taken performance-enhancing drugs, but there were enough red flags to at least raise eyebrows. And yet an impressive number of Armstrong’s fans remained confident of his innocence up until the moment of his confession.

Armstrong’s fans simply didn’t want to believe that their champion and hero could cheat. So in the absence of a hard confession or conclusive test results, they chose to ignore the deductive logic and inductive evidence that cast doubt on their hero. It is this kind of self-imposed delusion that often thwarts us.

People have their own set of stories, biases, emotions, heuristics and identities that guide their lives, all of which have the potential to cloud their ability to detect baloney. If we don’t want to see something, we usually won’t. But we need to be aware of this tendency if we truly want to be able to spot baloney.

Baloney Detection Framework

Over the years, our team has collectively interviewed and interacted with thousands of management teams. These interactions have taught us many lessons on baloney, its nuances, and how it is spewed. Over time, we have rolled up these lessons into a two-pronged framework that we call our “Baloney Detection Meter.”

Our framework is straightforward. In order for a statement to be considered valid, it must pass through two tests: deductive logic and inductive evidence. If a claim fails either one of these tests, a red flag is raised.

The first test is deductive reasoning: does a claim make sense based on logic?

Deductive reasoning is a process in which a conclusion is based on the concordance of multiple premises that are presumed to be true. In this type of reasoning, we draw specific conclusions from general premises. It is sometimes referred to as top-down logic. An example of deductive logic is:

A: All men are mortal.
B: Socrates is a man.
C: Therefore, Socrates is mortal.

In this case, the conclusion “Socrates is mortal” flows naturally from the premises that all men are mortal and that Socrates is a man.

Our second test is inductive evidence. It is not enough for claims to appear logical; they must also be supported by facts. If deductive reasoning is sometimes referred to as “top-down” logic, inductive reasoning is “bottom-up.”

Inductive reasoning refers to gathering specific evidence about a situation and then drawing broader “probable” conclusions based on that evidence. For example, you have seen animals eat at the zoo, at the pet store, and on the farm. You may not have seen all the animals in the world eat, but you have seen it enough times and have learned enough about biology to conclude that all animals must eat to survive.2
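To make the two-pronged structure concrete, here is a minimal sketch in Python of how the two tests combine. It is purely illustrative—the names and checks are hypothetical, not part of Mawer’s actual process—but it captures the core rule: a claim earns only provisional acceptance, and it must pass both tests, with a failure on either side raising a red flag.

```python
from dataclasses import dataclass


@dataclass
class Claim:
    statement: str
    passes_deductive: bool  # does the claim follow logically from premises we accept?
    passes_inductive: bool  # is the claim supported by independent evidence?


def baloney_check(claim: Claim) -> str:
    """Apply the two-pronged test: flag the claim if either prong fails."""
    if claim.passes_deductive and claim.passes_inductive:
        return f"OK (provisionally accept): {claim.statement}"
    return f"RED FLAG (investigate further): {claim.statement}"


# Hypothetical example: a company reports impressive margins (the evidence looks
# supportive), but we cannot explain logically how its business model produces them.
print(baloney_check(Claim(
    statement="Company X will sustain 40% margins",
    passes_deductive=False,  # the "why" doesn't add up
    passes_inductive=True,   # the reported financials look supportive
)))
```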

The Gowex Case

It was a cold Friday in January and our research team had ordered pizza and piled into the training room. One by one, we talked about stock ideas that had the potential to be “ten baggers.” Eventually, Siying Li, an equity analyst, started talking about a Spanish technology stock that she had just found. The company was called Gowex and its margins and growth were both impressive. A startup darling, the company and its management had won numerous innovation awards over the previous year.

A few weeks later, Siying and her colleague, portfolio manager Christian Deckart, called Gowex’s management to learn more about the business. Over an hour and a half, the two probed the company’s CFO about its business model and how it made money.

Despite repeated probing, the pair never managed to wrap their heads around the simple question of how the company was so profitable. In fact, they were so confused by the business model that they never even got to the many other questions we normally ask in these meetings.

After the call and some additional analysis, the team decided that Gowex just didn’t make sense—the company was either the best thing since sliced bread or a fraud. Since we didn’t understand its business model, and we won’t invest in businesses we don’t understand, we dropped our research on Gowex.

Seven months later, Gotham City Research released a report alleging that the company had fabricated over 90% of its reported revenues. Gowex’s stock was suspended from trading—Siying and Christian’s skepticism had helped us dodge a bullet.

Gowex is a practical example of how the baloney framework can be used in investing. Even though the company’s financials supplied inductive evidence that it was a good business, the deductive test was raising red flags. Something didn’t sit right with Siying and Christian. Since they couldn’t figure out why the company was so profitable, they had no reason to believe that the company would be consistently wealth-generating in the future.

Had we continued with further research on Gowex, the company might have also failed the inductive evidence test. We would have put the company through our forensic accounting process, which might have shown inconsistencies in the accounting. We would have also looked for independent corroboration of the facts by speaking with employees, competitors, and industry analysts. These conversations likely would have raised red flags.

Turning the Meter on Ourselves

Of course, the real challenge with baloney detection is turning the process inward on ourselves. Arguably, this ability is even more important than discovering falsehoods from others because our own baloney is sneakier and possibly even more dangerous.

Self-deception can run deep. While we may think we view the world through an objective lens, our perspective is coloured by our collective experiences and associations. Our personalities, egos, and attachments to the stories and identities we have created for ourselves create blind spots. Our consciousness is a live movie and we are its director. Self-delusion is commonplace.

Luckily, there are strategies we can adopt to unearth our own misconceptions. And while the scope of that discussion goes beyond what we can address in this paper, we believe there are two strategies important enough to mention here.

The first strategy is to develop a greater awareness of our thoughts. Most of us run through life without actively paying attention to our thoughts; they just seem to “pop” into our brains and then disappear. A lot of times, we treat these thoughts as the truth. But while they may certainly seem “real” in the sense that they are occurring, perhaps even producing a physical reaction in our bodies, that doesn’t mean they represent objective reality.

For example, most people seem to have at least some negative stories or limiting beliefs about their potential. They tell themselves all sorts of reasons why they can’t do something, like “I could never do that,” “it’s just not me” or “I’d never be good enough.” Frankly, a lot of this self-talk is our own baloney. In our team’s experience, people’s true potential is generally way beyond what they believe to be possible.

An active awareness of our internal thought process is the most critical step in detecting our own baloney. Once we are self-aware, we are able to apply deductive and inductive reasoning to our thinking. Developing this kind of self-awareness can be achieved in numerous ways. One popular approach is meditation. Another is to journal.

The second strategy is to systematically build feedback loops into your life. You might not see your actions clearly, but someone else often will. Moreover, hard performance data rarely lies. Systematically implementing feedback loops with other people, or with objective measures of performance, can be a valuable tool.

For example, our research team at Mawer has found success in implementing a 60-day review process. Every 60 days, we sit down with one of the three research leads and check in on how things are going. This process is invaluable for the team: it provides frequent feedback and allows team members to course-correct quickly.

Final Thoughts

Carl Sagan once wrote that “the brain is like a muscle. When it is in use we feel very good. Understanding is joyous.” It is in this spirit that we approach the detection of baloney.

We seek to peel away the layers of baloney in our lives, not only because we must (for good decision-making) but also because we can. The pursuit of truth is a joyous and rewarding process in and of itself.

However, even readers who do not share the same zeal for pursuing objective reality will likely still appreciate the desire to make the most of their lives—or at the very least their investment portfolios. And in these endeavours, the detection of baloney is critical.


1 Sasha Sagan, “Lessons of Immortality and Mortality From My Father, Carl Sagan,” New York Magazine, April 15, 2014.
2 Chris Clause, “What Is Inductive Reasoning? – Examples & Definition,” Study.com, accessed October 17, 2015.

This post was originally published at Mawer Investment Management
