The Art Of Marketing and eCommerce Series

Logical Fallacies In Online Business Part 8: Authority Bias



“The real damper on employee engagement is the soggy, cold blanket of centralized authority. In most companies, power cascades downwards from the CEO. Not only are employees disenfranchised from most policy decisions, they lack even the power to rebel against egocentric and tyrannical supervisors.” – Gary Hamel

The Authority Bias is a logical fallacy in which you defer to a position of authority to either dismiss or confirm evidence. The thought process essentially follows this pattern:

  1. Person X is an authority in a particular field.
  2. Person X says something about a topic in their respective field.
  3. Person X is probably correct because they’re an expert.

Every one of us has fallen for this before, putting full faith in someone’s information simply because they are believed to know better. The scope of this, as you may well assume, is vast: car mechanics, economists, meteorologists, CEOs, financial advisors, fitness trainers, dentists, doctors, teachers, marketers, developers, and finally, the most alarming and deceptive of all… ourselves. When we believe ourselves to be the expert, we fight doubly hard to prove that belief true. So much so that to keep the idea intact, we’ll disregard information, ignore facts, and otherwise dismiss those who might actually have something worthwhile to say. We’ll argue to the death against ideas that conflict with our own, mainly to keep our egos from deflating.

We Are All Susceptible

It would be wise to remember that some of the greatest minds in history have fallen for this fallacy, and it remains a prevalent problem within the scientific community. Albert Einstein himself argued vehemently against Niels Bohr’s interpretation of quantum mechanics, and at the time he had the clout to rally much of the scientific community behind his objections. His arguments largely came down to the fact that he simply didn’t like the theory’s implications; of the quantum mechanics Bohr championed (built in part on Werner Heisenberg’s matrix equations), he famously stated, “I, at any rate, am convinced that God does not throw dice.” In the end Einstein lost the argument, and quantum mechanics as Bohr interpreted it remains in use today. This goes to show that even the greatest minds can fall into this fallacious trap, especially considering that if there were ever a supposed expert in any field, it was Albert Einstein in physics.

The Authority Bias is a sinister logical fallacy for one simple reason: if you don’t know better yourself, you will likely trust the advice or information of someone considered an expert in that field. And if you are considered the expert, ego can muddle the reasoning needed to see things accurately, as was the case for Einstein and Bohr. The Authority Bias may even take effect when you know conflicting facts to be true, or when you believe the actions being asked of you are immoral or unethical. This was essentially the idea behind the Milgram experiments conducted throughout the 1960s.

The Perils Of Obedience

The Milgram Experiment was a series of psychological experiments that measured the willingness of participants to obey an authority figure who instructed them to perform acts conflicting with their personal conscience. Each session involved three individuals: the Instructor (the authority figure), the volunteer subject (the Teacher), and a confederate pretending to be a volunteer (the Learner). In short, the experiment went as follows:

The real volunteer (Teacher) was to administer electric shocks to the fake volunteer (Learner), at intensities dictated by the Instructor (authority figure). The Learner was given a list of test questions, and for every wrong answer the Teacher was told to shock them. Prior to the test, the Teacher was given a sample electric shock to experience the pain firsthand. With each question answered incorrectly, the voltage would increase by 15 volts. The real volunteers were told that the voltage could reach 450 volts, which was not likely enough to kill anyone but would inflict tremendous pain.

It was originally estimated that out of 100 participants, only about 3 would be prepared to inflict the maximum voltage. Remember that with each wrong answer and voltage increase, the fake participant would writhe in pain, cry out, and beg for the shocks to stop. In the end, 65 percent of participants administered the full voltage. At times the real volunteers questioned the Instructor about whether it was safe, or asked if they could stop, but overall, at the authority figure’s behest, they continued on despite believing they were causing serious pain to the other participant. Milgram summarized the experiment in his 1974 article, aptly named “The Perils of Obedience”.

Healthy Skepticism and Diligent Caution

Now consider how many things you’ve done, or believed, in the past because someone of supposed authority influenced you. The point to remember here is not whether you agreed with them, or whether they were actually right or wrong, but whether at any point you seriously doubted or questioned what they said. Anything you’re told can be taken as fact if you don’t know better yourself or lack the due diligence to verify it. The Authority Bias isn’t a problem because we listen to people and put trust in them; it’s a problem because we don’t question or second-guess what we’re told. Everyone should possess a healthy amount of skepticism, and that is said not to encourage cynicism, but caution.

When you’re listening to an authoritative figure, there are primarily two problems that arise to compromise logical thinking. The first is that performance often doesn’t match what you’d expect to see from an “expert”. Of the thousands of active economists, hardly any accurately predicted the 2008 financial crisis or its outcome. Meteorologists have been studying the atmosphere since at least Aristotle, around 340 BCE, and still, even with modern equipment, they are often unable to accurately predict how the weather will behave. More scientific theories are disproved than verified (as is the nature of the scientific process). In litigation, lawyers always face an opposing side, and one side always loses; naturally, then, lawyers as a whole can win only about half of their contested cases. Yet for each case they will likely exude the same confidence that they will win, as they typically do.

What other profession, where your expertise is relied upon, has such terrible odds while its practitioners remain authoritative figures? I can’t even begin to touch on all the forms of alternative medicine offered by various “doctors”. (This is not to say that all alternative medicine is bunk, just that much of it lacks sufficient evidence to prove it worthwhile.) The point being that people don’t often check track records. They say the proof is in the pudding… but at that point you’ve already questionably filled your mouth. Wouldn’t you rather know whether something is legitimate and trustworthy before testing it?

The second is that people of authority often crave recognition and find ways to reinforce their status or position. This can cloud their own judgment, skewing truths that fail to bend to their beliefs. Think of all the ways authority is displayed nowadays to ensure you defer to others’ expertise: diplomas, certifications, testimonials, awards, social proof… the list goes on. Beyond these accessory forms of proof, people will often deliberately dress the part to ensure no doubt exists: doctors wear white coats, businessmen wear suits, kings wear crowns, military personnel wear chevrons and badges. So much is on display to highlight one’s expertise. But as with any important document, you must be sure to read the fine print and look between the lines. Don’t judge the book by its beautifully ornate cover, don’t trust strangers’ opinions about the book, and conversely, don’t dismiss the book because others do or because its cover is worn and tattered.

It is the authoritative figure’s duty to convince you they know better, both when they actually do and when they don’t. Some would say that when you lack the credentials, you should fake it till you make it. A larger problem still, beyond people being dishonest, is that there are some convincing specters with scepters out there whom you would be hard pressed to deny or find fault with. Again, who would have questioned Einstein’s arguments? In fact very few did, and most rallied behind him. In the end, you must carefully and diligently deprogram the Authority Bias, both in yourself and in your organization.

So how can we do this? It’s actually rather easy, and the solution is similar to that for many logical fallacies: question everything you can about a supposed expert before putting faith in them. Don’t listen to what they say, or to others’ opinions of them… just look at their history. Check their track record, and don’t be persuaded or convinced otherwise until you have convinced yourself. Now, you would be hard pressed to do this for every consultant, expert, or professional you run into while running your business. But when something critical is occurring, or when putting your faith in someone could have dire consequences… be diligent and be smart. There’s no loss in turning someone away or disregarding their information because you believe it to be inaccurate (save for the opportunity cost when you’re actually the one who’s wrong). Doubt is a powerful feeling, perhaps one of our most powerful. It single-handedly keeps us from doing stupid things, often while we’re in the midst of doing them. Search for any shred of doubt you can when listening to an authoritative figure, and if you come up empty-handed, you can rest assured you did the best you could to protect yourself. As an expert statistician, I can say that some of the time… this works 100% of the time.