
Science & Belief: Why We Believe in Gods, Ghosts, UFOs, Conspiracies and Everything Else

Updated on October 22, 2016

Nerd, cinephile, TV-junky, research-loving, left-leaning, science-fiending, atheist from the gutter. Follow me on Twitter @TheGutterMonkey.

Are you a believer?

Regardless of how rational and pragmatic you may think you are — or immune to silly superstitions and wild ideas you might believe yourself to be — the truth is that none of us are Star Trek Vulcans. We're each just as susceptible to illogical thoughts and strange beliefs as the next guy. In fact, we're sorta hardwired that way; so don't feel bad.

Our ability to see patterns, connect the dots, and believe things that there's no evidence for is actually one of the many things that makes us the extraordinary creatures we've become. It's led to amazing discoveries in science, beautiful works of art, and even resulted in the very survival of our species. But, nevertheless, everything has its downsides.

For every Einstein or Darwin who recognizes patterns in nature that turn out to be real, there are millions of us on the other side of the spectrum who put together patterns that turn out to be not-so-real. Alas, while the scientific method and meticulous scrutiny are great ways to avoid fooling ourselves, we don't apply that type of critical thinking to all things. It takes a special effort not to be duped, and sooner or later a false belief will slip through the cracks of our minds. But that's okay. Einstein and Darwin were wrong sometimes too, you know? We're all only human. And beliefs, whether wrong or right, are a major part of our humanity. But perhaps if we know why we fool ourselves, we'll then be better equipped not to be fooled again. That's our goal today.

Using several books and studies related to the psychology of belief and the evolution and study of the brain (sources are cited within the article), I've compiled a list of some of the biggest reasons we believe in so many strange things. From visitations from extraterrestrial intelligences, to our chosen political parties, gods, ghosts, conspiracy theories, afterlives, and a whole assortment of fears, hopes, and superstitions, these are some of the primary reasons beliefs are ingrained in our everyday lives and why they persist even when they're proven wrong.

14.) We Evolved to Find Patterns in Everything

How we evolved to believe wacky and not-so-wacky things

Have you ever looked up at the sky and seen a familiar shape in the clouds? Sure you have. We all have. This is an example of how our brains find and interpret patterns, associating them with images and ideas we already have in our minds. If you were to believe that this familiar pattern in the clouds wasn't mere happenstance but actually meant something (much as our ancestors did), that would be an example of Apophenia, also known as Patternicity (the latter a term coined by science historian Michael Shermer): our tendency to find meaningful patterns in both meaningful and meaningless things, and subsequently to infuse those patterns with intention and agency (the sense that some being other than ourselves is behind them).

This pattern-finding is a double-edged sword: it's been both one of our greatest instruments of success as a species and one of our biggest downfalls in reasoning. It's a product of our evolution, and our belief in unreal things is a by-product of our need to find patterns, to "connect the dots", and to believe those connected patterns are real so that we can make predictions about the future and plan and prepare for it.

The evolution of pattern recognition

For an example of natural selection favoring those more likely to find patterns, and to believe that random occurrences have meaning, imagine our ancient ancestors hearing a rustling in the bushes. While it could be nothing more than the wind, it could also be a dangerous, unseen predator. What to do?

Those ancestors who were more prone to connecting the rustling bushes with a predator would often flee the scene whether their assumption was wrong or correct. When wrong, they'd made an error in cognition that caused them to find a meaningful pattern in a meaningless thing, falsely believing there was something there when there wasn't (this is called a False Positive: believing something is real when it is not). But that's okay, because the cost of this incorrect belief was minimal; they may have run away for no reason, but they were still alive.

On the other hand, those ancient people not so prone to finding patterns in things would also be wrong on occasion. But they wouldn't be so lucky. Their cognitive error is what’s known as a False Negative, meaning they believed something was not real when it was. Their inability to find and believe patterns, and make leaps of faith based on those patterns, would inevitably lead to their death, sooner or later.

What happened above is a prime example of natural selection in action: only those more likely to find patterns (both meaningful and meaningless ones) survived to pass on their genes to the next generation. Those genes eventually led to us and our innate inclination to make more False Positive errors than False Negative errors, leaving us more likely to believe that the patterns we hear and see are real, even when they are not, because the cost of being wrong as a believer was less than the cost of being wrong as a non-believer.
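
The cost asymmetry at the heart of that story is easy to make concrete. Below is a toy simulation (every number is a made-up assumption chosen purely for illustration, not taken from any study): a rustle is only occasionally a real predator, a false positive merely wastes a little energy, and a false negative is sometimes fatal. Pattern-prone agents who almost always flee tend to outlive the skeptics who almost never do.

```python
import random

# Toy model of the rustling-bushes example. All numbers are illustrative
# assumptions, not empirical values.
TRIALS = 1000           # rustling events heard over a lifetime
P_PREDATOR = 0.02       # chance a given rustle really is a predator
DEATH_IF_IGNORED = 0.5  # chance that ignoring a real predator proves fatal

def survivors(p_flee, n_agents=1000, seed=42):
    """Count how many agents with the given flee-probability live out all trials."""
    rng = random.Random(seed)
    alive_count = 0
    for _ in range(n_agents):
        alive = True
        for _ in range(TRIALS):
            predator = rng.random() < P_PREDATOR
            fled = rng.random() < p_flee
            # A false positive (fleeing from wind) only wastes a little energy.
            # A false negative (ignoring a real predator) is sometimes fatal.
            if predator and not fled and rng.random() < DEATH_IF_IGNORED:
                alive = False
                break
        alive_count += alive
    return alive_count

for p_flee, label in [(0.9, "pattern-prone"), (0.1, "skeptical")]:
    print(f"{label:13s} agents surviving: {survivors(p_flee)}/1000")
```

With these assumed numbers, a large share of the jumpy, pattern-prone agents make it through a lifetime of rustles, while almost none of the skeptics do. That imbalance, repeated over generations, is the selection pressure described above.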

With our innate ability to find patterns in the world and our natural inclinations to associate these patterns with meaning, intention, and agency, we already have the groundwork laid for most of our wildest beliefs.

Examples of Pattern Seeking

  • Believers in aliens associate unidentified objects in the sky with flying saucers (and suspect the face-shaped rocks on Mars to be evidence of extraterrestrial life).
  • Theists are more likely to find religious symbols and meaning in natural, random patterns, like burns on a piece of toast, natural phenomena, a stain on a glass panel, etc.
  • Conspiracy theorists connect random or unrelated events and associate them with larger, sinister meanings and government coverups.
  • If gamblers hit three "greens" in a row on roulette, they may see it as a pattern instead of a random event, thus determining their next move based on this; they may even perform rituals before they spin (knocking on wood, blowing on the chips, etc.) because they'd won previously after performing them. (A quick simulation after this list shows why a streak tells us nothing about the next spin.)
  • Those who believe in the paranormal are more apt to assume a noise heard or movement made in the dark is a spirit.
  • People who have suffered visual and auditory hallucinations associate the experiences with sentient beings outside of their body (aliens, spirits, gods, etc.).
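
The roulette example is easy to check numerically. This is a rough sketch (it assumes an American wheel, where 2 of the 38 pockets are green): streaks of three greens do turn up by chance over enough spins, yet the chance of green on the very next spin stays pinned at the base rate.

```python
import random

# American roulette wheel: 38 pockets, 2 of them green (0 and 00).
# Purely illustrative; the point generalizes to any independent game.
P_GREEN = 2 / 38
SPINS = 2_000_000

rng = random.Random(0)
spins = [rng.random() < P_GREEN for _ in range(SPINS)]

streaks = 0
greens_after_streak = 0
for i in range(3, SPINS):
    if spins[i - 3] and spins[i - 2] and spins[i - 1]:  # three greens in a row
        streaks += 1
        greens_after_streak += spins[i]

print(f"three-green streaks seen:  {streaks}")
print(f"P(green) overall:          {sum(spins) / SPINS:.4f}")
if streaks:
    # Noisy (only a few hundred streaks occur), but it hovers around the
    # base rate rather than being pushed up or down by the streak.
    print(f"P(green) right after one:  {greens_after_streak / streaks:.4f}")
```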

Sources and Related Links

Agent Detection
The National Center for Biotechnology Information
Apophenia
Patternicity, Science, and the SQ Requirement
Priming (psychology)


13.) Our Brains Chemically Reward Us for Belief

The natural high associated with our addiction to belief

Over hundreds of thousands of years of natural selection, our brains have become hard-wired to believe things in order to survive. But what exactly is it our brains are doing in there?

Any behavior that is reinforced by the brain tends to be repeated. For doing things that help us survive and spread our genes, our brains reward us with squirts of stimulating chemicals that feel really good and make us want more. A few obvious examples include eating, having sex, and finding patterns that can warn us about dangerous predators and weather. The natural highs we get from doing these things are necessities for survival that the brain takes note of and remembers. Remembering these behaviors, and the rewards we receive from performing them, is why we continue to perform them in the future.

For belief, the most important reward we receive is the one we get from finding patterns, and the chemical apparently responsible is dopamine. Dopamine is directly connected to learning, motivation, and reward, and is released whenever our brain believes we should take strong note of our current behaviors; it also seems to control the brain's nucleus accumbens (also known as our reward and pleasure center), the same place implicated in the highs derived from both orgasms and cocaine. Furthermore, dopamine is thought to increase the brain's signal-to-noise ratio, meaning that it causes us to find more meaningful patterns in meaningless noise. And if we think a pattern has meaning, that is a pattern we believe.

What you specifically believe, of course, is due to your environment and other factors mentioned in this article. But the disposition to believe, as shown by thousands of studies of separated twins (see the Examples and Related Links below), is likely to be genetic and heritable.
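
One way to picture what "finding more meaningful patterns in meaningless noise" means is a toy signal-detection model. The sketch below uses made-up numbers and is only an analogy, not a model of actual neurochemistry: lowering the threshold for declaring "there's a pattern here" catches more real signals, but it also flags far more patterns in pure noise.

```python
import random

# Toy signal-detection sketch: each trial is either pure noise or noise plus a
# weak signal. Lowering the detection threshold produces more hits AND more
# false alarms. All numbers are illustrative assumptions.
rng = random.Random(3)
SIGNAL_STRENGTH = 1.0
N = 10_000

def one_trial():
    has_signal = rng.random() < 0.5
    observed = rng.gauss(SIGNAL_STRENGTH if has_signal else 0.0, 1.0)
    return has_signal, observed

trials = [one_trial() for _ in range(N)]
n_signal = sum(1 for has_signal, _ in trials if has_signal)
n_noise = N - n_signal

for threshold in (1.5, 0.5):  # a conservative detector vs. a trigger-happy one
    hits = sum(1 for s, x in trials if s and x > threshold)
    false_alarms = sum(1 for s, x in trials if not s and x > threshold)
    print(f"threshold {threshold}: hit rate {hits / n_signal:.2f}, "
          f"false-alarm rate {false_alarms / n_noise:.2f}")
```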

Examples and studies showing how chemicals affect how we believe

  • Functional scans of Japanese monks show that different types of meditation stimulate different areas of the brain, namely parts of the prefrontal cortex and the parietal cortex. Religious belief is also associated with reduced reactivity of the anterior cingulate cortex (ACC), as is political conservatism. Although the causality of these correlations isn’t clear, it’s interesting that taking initiatives, by contrast, is associated with increased activity in the ACC.
  • The EEGs of Carmelite nuns have shown marked changes during mystical experiences when they felt they were at one with God. In a state like this, individuals may also feel as if they have found the ultimate truth, lost all sense of time and space, are in harmony with mankind and the universe, and are filled with peace, joy, and unconditional love. Neuropharmacological studies show how crucial the activation of the dopamine reward system is in such experiences.
  • Exploring the neurochemistry of superstition, magical thinking, and belief in the paranormal, Peter Brugger and Christine Mohr, at the University of Bristol in England, found that people with high levels of dopamine are more likely to find significance in coincidences and pick out meaning and patterns where there are none.
  • Alzheimer’s disease, which is related to lower levels of dopamine, is linked to the progressive loss of religious interest. The more slowly it progresses, the less religiousness and spirituality are affected. Conversely, hyperreligiosity is associated with fronto-temporal dementia, mania, obsessive-compulsive behavior, schizophrenia, and temporal lobe epilepsy; a number of these disorders are known to make the dopamine reward system more active.
  • Geneticist Dean Hamer, wanting to know if there was a family genetic connection to an addictive personality, gave over a thousand experiment subjects a battery of psychological questionnaires, including ones about spirituality. He found that those who were especially "spiritual" had a dopamine-boosting version of VMAT2 (a gene which regulates the flow of serotonin, adrenaline, norepinephrine, and dopamine).

Sources and Related Links

Uncovering the Science Behind Belief
Sage Journals | Neural Markers of Religious Conviction
Dopamine, Paranormal Belief, and the Detection of Meaningful Stimuli
The Brain's Reward System
Alzheimer's & Dopamine


12.) Our Need for Control and Making Sense of Things

How we spin elaborate tales to comfort ourselves when feeling powerless

A further example of our predisposition to seek meaningful patterns occurs during moments of risk and danger, when we're faced with situations in which uncertainty abounds and we have no control. If we can build a good enough story around an event we can't comprehend, it relieves our minds of the stress of not knowing and not being able to understand "how something like this could happen".

When in control of a situation we think more clearly and make better decisions; a lack of control makes us anxious, causing us to urgently and instinctively seek out patterns to make sense of things (identifying a cause-and-effect pattern in a sequence of events) so that we can regain a feeling of control, take comfort in knowing what's going on, and even develop predictions for the future. Unfortunately, due to our haste, the patterns we find are often patterns that aren't really there (leading us to perceive false correlations, see imaginary figures, form superstitious rituals, and embrace conspiracy beliefs, among other things). And when belief arrives before reason, it can be very difficult to talk ourselves out of the belief afterwards.

This isn't just a half-baked idea, either. Numerous studies have been done on the effects of control on pattern perception (see the examples and related links below), all with very clear and consistent results. As anthropologist Bronislaw Malinowski noted in his studies:

"We find magic wherever the elements of chance and accident, and the emotional play between hope and fear have a wide and extensive range. We do not find magic wherever the pursuit is certain, reliable, and well under the control of rational methods and technological processes. Further, we find magic where the element of danger is conspicuous."

Examples of how lack of control leads to pattern-seeking

  • In anthropologist Bronislaw Malinowski's studies of superstitions among Trobriand Islanders in the South Pacific, he found that fishermen's levels of superstitious rituals rose the further out to sea they sailed, where the conditions grew more uncertain.
  • A 1977 study found that if you show parachute jumpers who are about to leap out of a plane a photograph of random television noise ("snow") they are more likely to see nonexistent patterns than if they see the same photos earlier.
  • A 1994 study showed that anxious first-year MBA students are far more conspiratorially minded than their more secure second-year colleagues.
  • A 1942 study found that when ambiguous images are shown to both hungry and satiated people, the hungry were far more likely to see food in the images.
  • Many conspiracy theories make up causes and motives to events that are more rationally seen as accidents and random acts of violence in an attempt to bring order, easier understanding, and a sense of control to that which has no order or control. This applies also to events such as the JFK assassination and 9/11 which already have understood causes, because no one wants to think such monumental, world-altering events could be caused by such simple reasons and regular people; it's more comforting to attribute the events to equally powerful government entities. (Even indirect control is desirable; i.e. if we can't control the weather, understanding and being aware of it at least helps us prepare for it — "knowing" is another form of control.)
  • We develop elaborate creation myths, sun gods, rain gods, war gods, and gods of the ocean. We believe we can communicate with our gods and influence their behavior, because by doing so we gain some control and impose some order on the chaotic mysteries of the world.

Sources and Related Links

Magic, Science, and Religion (New York: Doubleday, 1954)
Lacking Control Increases Illusory Pattern Perception (includes citations of further studies conducted)
ScienceBlogs


11.) Our Need to Know and Accept Easy Explanations

Another big reason (not unrelated to our need for control) that we can't be content with being ignorant of things we don't yet have answers to is that, while the human brain is extraordinarily good at posing questions, people have an extremely low tolerance for ambiguity. That is to say, we really can't stand not knowing things, and that makes us very susceptible to prematurely accepting the first answers we can quickly comprehend, regardless of whether they're the real answers or not.

Religion, for example, was our first attempt at physics and astronomy. We had huge questions about the natural world and, in the absence of concrete answers, we accepted whatever magical or superstitious ideas we could make fit. Even today, when scientists understand things far better, it's still difficult for most of us to wrap our minds around the complex workings of biology, neurology, physics, and the other technical, esoteric inner workings of the universe, leading many of us to continue to seek alternative explanations that we can more easily comprehend. The much simpler notion that anthropomorphic gods and other magical beings and occurrences are running things is alluring because it's much easier to understand.

We are unable to accept “I don’t know” because we cannot turn off our instinct to see patterns and to discern effect from cause. We demand that there be a pattern, that there be cause and effect, even when none exist. By making up answers to dull the sting of ignorance, we fool ourselves into thinking we explain the world.

Examples of our need to know and accept easier explanations

  • With big, unanswered questions such as "Where did the universe come from?" and "Is there a purpose to life?", billions of people hold confident beliefs, even though it's not currently possible to have answers to these questions. Our inability to mentally fathom, let alone toy with, such concepts as infinity and reasonless existence makes us uncomfortable (see: Our Need for Control) and leads us to conjure up more easily understood ideas, such as anthropomorphic creators (or designers) who put existence into place, which gives us a definite and desirable explanation for why we are here (see: We Believe What We Want to Believe). Ironically, however, once we attribute creation to an intelligent being, we become surprisingly content with the idea that this creator is infinite and without reason himself.
  • Ancient cultures all over the planet used the stars in the sky as a calendar to predict weather patterns, inevitably leading them to conclude that the sky spoke messages to them personally about the future. This caused them to interpret unexpected passing comets as signs of famine, war, disease, and even death to come.
  • While the official stories for events such as the JFK assassination and 9/11 are often the simplest explanations, they may not be the easiest explanations for people to accept. The fact that such monumental, world-altering events could be the result of a random lone gunman or ragtag religious extremists can be difficult for many to fathom. It's hard to accept that we, our government, country, and world can be so vulnerable to such average people and groups. The easier explanation, then, is that such major tragedies must be due to exceedingly powerful, unimaginably intelligent conspirators.

Sources and Related Links

Why We Need Answers
Comets in Ancient Cultures
Argument from Incredulity


10.) Our Tendency Toward Confirmation Bias

How we filter evidence and information to make it fit with what we already believe

Anyone who's into discussing (or arguing about) subjects such as politics, religion, conspiracies, or any other issue that puts one side of an argument against another, has probably, at some point, fallen victim to the inevitable confirmation bias.

A confirmation bias is the tendency to search for, interpret, or recall information in a way that confirms the beliefs we already hold, and to ignore or dismiss any information that conflicts with what we already believe. In short, we tend to look for and remember only information that supports our arguments or opinions. That is, we filter and interpret information in ways that reinforce our previously established beliefs and expectations.

Characteristics of confirmation bias include

  • An unwillingness to accept the validity of evidence that defies your currently held beliefs.
  • Placing greater weight or emphasis on “facts” that support your assumptions, excluding contradictory evidence.
  • Actively seeking out only information that “proves” your point.
  • Selective (and often incorrect) recollection of events, facts or statistics so that they support your beliefs.
  • Actively seeking out answers to questions only from sources (authors, news outlets, websites, etc.) which you already know you agree with.

If you're a liberal, for instance, then you're likely to get your news primarily from a liberal source; if you're a conservative, then you'll use a conservative source, because it more closely adheres to what you already believe. These and other forms of confirmation bias have the effect of strengthening potentially bad beliefs and ideas (and our confidence in them), making it more difficult to change our minds later on in the light of further contradictory evidence.

This isn't unusual, and it's not something that only one group does. It's a basic human flaw to which all of us are susceptible. It's an unintentional self-deception, possibly brought on by, among other things, our desire to believe particular things (wishful thinking), in which people demand a high standard of evidence for unpalatable ideas and a low standard for preferred ideas. It may have evolved either as a by-product of our ability to reason or as a direct aid to it. For more information on the evolution of our confirmation biases, check out the video link below.
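
To see how this filtering inflates confidence, here's a toy Bayesian sketch (the probabilities are assumptions invented for illustration, not drawn from any study): the hypothesis is actually false, so the evidence stream leans against it; an even-handed reasoner updates on every datum and drifts toward the truth, while a biased reasoner who quietly discards the disconfirming data ends up nearly certain of the false belief.

```python
import random

# Toy world: hypothesis H is actually FALSE, so the evidence stream leans
# against it. All probabilities are assumptions chosen for illustration.
P_SUPPORT_IF_TRUE = 0.6    # chance a datum would support H if H were true
P_SUPPORT_IF_FALSE = 0.4   # chance a datum supports H given H is false (the real world)

rng = random.Random(1)
evidence = [rng.random() < P_SUPPORT_IF_FALSE for _ in range(200)]  # True = supports H

def posterior(evidence, ignore_disconfirming=False, prior=0.5):
    """Belief in H after updating on the evidence stream."""
    odds = prior / (1 - prior)
    for supports in evidence:
        if supports:
            odds *= P_SUPPORT_IF_TRUE / P_SUPPORT_IF_FALSE               # update toward H
        elif not ignore_disconfirming:
            odds *= (1 - P_SUPPORT_IF_TRUE) / (1 - P_SUPPORT_IF_FALSE)   # update away from H
        # a biased reasoner simply skips the disconfirming data
    return odds / (1 + odds)

print(f"even-handed reasoner's belief in H: {posterior(evidence):.3f}")
print(f"biased reasoner's belief in H:      {posterior(evidence, ignore_disconfirming=True):.3f}")
```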

Sources and Related Links

Confirmation Bias
Examples of Confirmation Bias
5 Cognitive Biases & How to Overcome Them On Your Landing Pages
Video: Confirmation Bias and the Evolution of Reason

9.) We Believe What We Want to Believe

If it seems too good to be true, we'll probably believe it

As Fox Mulder so often said on The X-Files: I want to believe. We all do. Maybe not in ghosts, gods, or aliens, but there's always some opinion or some belief that each of us wants to be true much more than we want it to be false. And, oftentimes, these hopes and desires affect what we eventually choose to believe is real.

This is called Wishful Thinking (yes, it's an actual thing). It's the formation of beliefs and the making of decisions according to what might be pleasing to imagine instead of what's actually supported by evidence, rationality, or reality. As mentioned in the "Confirmation Biases" section above, wishful thinking causes people to demand much more evidence to be convinced of ideas they don't like (if they can be convinced at all), and not need much convincing at all for ideas that they prefer. Oftentimes we use confirmation biases in order to grow more confidence in these desired beliefs (e.g., paying attention only to evidence that supports what you want to be true).

This may appear to be an obvious and easily avoidable reason why we believe, but you may be surprised at how susceptible all of us are to it. Wishful thinking doesn't only apply to desirable happenings or pleasurable occurrences; it can also apply to our desire to confirm beliefs that may not be pleasurable (doomsdayers, for instance, may not desire for the world to end, but they do desire to have their beliefs confirmed). This is all the more reason that we should remain especially skeptical, and demanding of evidence, when we're presented with information that has a special appeal to us. Because as the old adage goes, if it seems too good to be true, it's probably false.

Examples of how wishful thinking can affect us

  • The desire for an afterlife, in which we continue living (in some state) after our bodies have died. This is often connected to religious beliefs or beliefs in ghosts. A common argument is, "I can't imagine it all just ends when we die. Therefore there must be an afterlife!"
  • The desire that we aren't alone in our bodies, our world, or our universe. This desire leads to such things as belief in extraterrestrials, all-knowing governments, and gods. These ideas can be directly related to agent detection, which is the inclination of animals and humans to presume the purposeful intervention of a sentient or intelligent agent in situations that may or may not involve one. Agent detection correlates with our natural inclination toward pattern detection, and is directly supported by research in cognitive neuroscience.
  • The belief that life and certain meaningless events in life happened for a reason. A common argument used in this scenario is, "If it has no meaning, then what's the point? Therefore there must be a meaning!"
  • While people trust science, consensus opinions in the scientific community, and the scientific method every day in their use of technology, medicine, biology, etc., they will sternly state that the science and scientists are wrong or corrupt when the science contradicts what they want to believe.

Sources and Related Links

Logical Fallacies
The Valence Effect
The Wiley Online Library
Scientific American | Why People Believe Invisible Agents Control the World


8.) Environment, Family, and Community Affect Our Beliefs

How our chosen beliefs perfectly match those of our culture and family

Probably the most obvious source of many people's particular beliefs lies in their environment. We are, after all, products of our environment. In our earliest, most impressionable years of life, when most beliefs are made concrete, it's our friends, family, teachers, and acquaintances who are our biggest influences in shaping who we become. Subsequently, it's their beliefs and ideas that we are most likely to adopt.

If you're born into a Jewish family, for example, then you're highly likely to adopt the Jewish religion; if you're born into a Christian family, then you will likely develop a Christian belief; a Muslim family, a Muslim belief; a Mormon family, a Mormon belief; and so forth. Likewise, if most of your acquaintances and friends are Republicans or Democrats, you will likely take on whichever political belief is most prevalent in your particular circle. In communities where a belief is especially prevalent, the belief appears to be especially true due to the individual's lack of contact with opponents holding different beliefs. The validity of the belief is thus accepted as fact simply because it's the norm.

And while it's true that an individual can come out of this cycle (i.e., an individual can "stray from the flock", so to speak), this is a relatively rare occurrence. Typically, once a belief is established, especially at a young age, it becomes difficult to let go of. The result is that believers attempt to confirm their established beliefs as true through rationalizing and the use of confirmation biases (see above). Furthermore, communal reinforcements arise to further strengthen our beliefs.

What is Communal Reinforcement and how does it affect us?

It's not just what motivates us to believe wacky and non-wacky things that's important. As mentioned above, many of our beliefs are formed at more impressionable ages, when we don't know any better than to accept them without question. But why do the beliefs persist even after we've grown older?

One such reason is due to Communal Reinforcement, which happens when a concept or idea is repeatedly asserted in a community, regardless of whether sufficient empirical evidence has been presented to support it. Over time, the concept or idea is reinforced to become a strong belief in many people's minds, and may be regarded by the members of the community as fact. The result of communal reinforcement is that people may accept something as true, without need of questioning, simply due to the idea that "if this many people believe it, then it can't be wrong."

Communal reinforcement can affect both small groups and entire nations.

Examples of communal reinforcement include

  • Rumors and urban legends circulate by word of mouth for so long, and become so ingrained in our communities and society, that they are generally accepted as truths even when there is no physical evidence or direct eyewitness testimony to support them. Beginning with "I read somewhere that..." or "I heard from someone that...", the rumors eventually become "I know that...". Bubble gum taking 7 years to digest, metal in a microwave causing it to explode, and the idea that people believed the world was flat in the time of Columbus are a few examples of false rumors that are generally accepted as true because so many people believe them.
  • Myths having to do with miracles and religions are so prevalent within societies and are shared and reinforced so much, that millions of people believe in their veracity, regardless of how outlandish the claims are. I.e., if one person were to believe many of these claims, they would be considered insane, but when many believe them, the same claims appear to be credible.
  • Due to the vast numbers of non-scientific supporters in creationism, many people accept, without question, the idea that there is a large ongoing debate in the scientific community as to whether or not evolution is a real, established occurrence. In reality, there is no such debate going on within the scientific community.
  • People in a religion often have their beliefs reinforced by the other members of their church with whom they surround themselves; conspiracy theorists tend to gather with and talk mostly to other conspiracy theorists; UFO abductees attend meetings with other supposed abductees to talk about their experiences. Once they've formed a group of like-minded believers, segregating themselves from outside influences, these communities act as a reinforcement, similar to a confirmation bias, for their shared belief. The belief becomes the norm and the norm is accepted as the undeniable truth.

Sources and Related Links

Groupthink
List of Common Misconceptions
Herd Behavior
The Bandwagon Effect

7.) Our Unquestioning Trust in Authorities

Whether it be our parents, teachers, scientists, priests, police, government, or tribe leaders, we evolved with a natural tendency to defer to authority, listen to and trust our leaders and experts, and follow the rules and dictates given by those above us in social rank. Children, for instance, accept warnings and instructions issued by their parents and other authorities instantly and without argument, which protects them from danger. It's a system that often works out fine. Except for when it doesn't.

Having complete trust in the knowledge of any one person (or "expert") or group of people is never a great idea. Authorities, after all, are just people like the rest of us. Whether they're a scientist, president, Darwin or Einstein, none of us are completely immune to being wrong. Our leaders, experts, and greatest minds are just as susceptible to the same biases and cognitive errors as the rest of us.

With that being said, this doesn't mean that we should simply dismiss the word of experts in particular fields of study. Just realize that simply because they're knowledgeable about a subject doesn't mean that they're always correct in their opinions and assumptions about said subjects.

If we shouldn't trust authorities, then why trust scientists?

People who argue against science often make claims such as, "Those scientists have agendas too" or "Scientists are just making things up to get more grant money" when scientific discoveries conflict with what regular people believe. If scientists are just as flawed as the rest of us, then why can't these claims be true?

The beauty of science is that its methods are designed specifically to prevent our biases and flaws in cognition from fooling us, and it ensures that there is no one authority on any particular topic. One scientist, for instance, can come up with faulty results due to bias, greed, prejudice, laziness, or simple mistakes and hasty conclusions. But the process of science itself is designed to be self-correcting. If one scientist fails to detect or correct the errors in his results, other scientists will, and his conclusions will fail to reach (or even approach) the level of consensus (majority opinion) among the scientific community in general. So when we trust science, we are trusting the scientific process, method, and consensus; not any one particular authority.
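
A rough way to see why consensus beats any single authority is to treat each independent study as a noisy, occasionally biased measurement. The sketch below uses made-up numbers purely for illustration: any one result can land far from the truth, but the aggregate of many independent attempts settles close to the true value.

```python
import random
import statistics

# Toy model: many independent labs measure the same true quantity with random
# error, and a minority of labs are badly biased. All numbers are illustrative.
TRUE_VALUE = 10.0
rng = random.Random(7)

def one_study():
    result = rng.gauss(TRUE_VALUE, 2.0)   # honest measurement noise
    if rng.random() < 0.1:                # ~10% of studies are badly skewed
        result += rng.choice([-5.0, 5.0])
    return result

studies = [one_study() for _ in range(200)]

print(f"one study, picked at random: {studies[0]:.2f}")
print(f"most extreme single study:   {max(studies, key=lambda x: abs(x - TRUE_VALUE)):.2f}")
print(f"consensus (median of 200):   {statistics.median(studies):.2f}")
```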

Examples of trusting authority

  • 9/11 conspiracy theorists who believe the World Trade Center was destroyed by explosive demolition often cite Architects & Engineers for 9/11 Truth as evidence that the twin towers did not collapse due to the impact of airplanes. Architects & Engineers for 9/11 Truth is an organization of architects and engineers who dispute the results of the official investigations into the September 11 attack. The mentality here is that “These guys are architects, so they’re right”, ignoring the fact that this organization is relatively small compared to the vast majority of architects, engineers, and groups who have repeatedly debunked (through experiment, empirical evidence, and computer simulations) the controlled demolition claims.
  • Creationists (those at www.AnswersInGenesis.org, for instance) often put their spotlight on a small number of individual scientists who dispute evolution in order to give the false impression that creationism (under the guise of the more scientific sounding “Intelligent Design”) is at all a valid scientific possibility.
  • UFO and alien enthusiasts often point to a handful of like-minded astronauts as evidence that our species has come in contact with intelligent alien beings. This is considered one of the strongest bodies of evidence by UFO proponents because astronauts are regarded as highly trustworthy.

Sources and Related Links

Argument from Authority
The Scientific Method
Scientific Consensus
Skeptoid
Architects Shy from Trutherism

6.) We Depend on Gaps in Others' Arguments to Support Our Beliefs

The absence of evidence is not the evidence of absence

Many times (and this is especially prevalent with conspiracy theorists and deniers of evolution) we support our beliefs not by proving their validity but by trying to disprove, or look for gaps in, the scientific evidence and official accounts presented in arguments that contradict our own. Meaning, when two sides are disputing a topic, we will often disregard the evidence of the opposing side and focus only on what they have not yet investigated, what's as yet unknown to them, or what's unknowable to anyone, and use that particular lack of evidence as proof that our side of the argument must be right. That is, if they cannot explain every minute detail of their proposition, we will incorrectly treat whatever gap remains in their knowledge as direct proof that their entire argument is wrong, giving us the false impression that our entire argument must therefore be correct.

Formally, this type of argument is referred to as an Argument from Ignorance or, in religious usage, God of the Gaps (in which gaps in scientific knowledge are taken to be evidence or proof of God's existence and/or creationist beliefs). These arguments ignore the fact that some true things may never be completely provable, detail by detail, just as some false things may never be disproved with absolute certainty. Furthermore, looking for (or even finding) gaps in someone else's argument does not thereby make your own argument true: one must supply evidence for one's beliefs if one wants them to be accepted as facts; one can't merely knock out the competition and expect one's conclusions to be accepted as the default explanation.

Examples

  • Believers in the spontaneous creation of life (creationism) who deny evolution often try to discredit evolution by pointing out "missing links" in the "evolutionary chain", ignoring the many transitional fossils and other evidence of evolution that science is already aware of. They then use this "gap" in the fossil record as proof that God created humans and other species. Likewise, religious believers will use science's lack of knowledge about what caused the Big Bang, and about what chemical processes caused the first life to arise, as proof of the existence of a creator who made the universe and the life in it.
  • Those who believe in extraterrestrial intelligences will often interpret unidentified objects in the sky as proof of alien visitors when they can find no other explanation for the object. The fallacy here comes from using a lack of knowledge as proof of what they want to believe. A common argument in this instance is, "Do you have a better explanation of what that light in the sky was?" But not currently being aware of a better explanation doesn't mean there isn't one.
  • 9/11 conspiracy theorists will often seize on their own and their opponent's ignorance of architecture, engineering, etc., as proof that the World Trade Center buildings collapsed due to controlled demolitions as opposed to planes crashing into them, putting forth the argument, "If you or I can't personally explain how they collapsed due to the impact, then it must have been a controlled demolition."

Sources

Transitional Fossils
Argument from Ignorance
God of the Gaps
The Fossil Fallacy


5.) We Become Too Committed to Our Beliefs

Holding onto beliefs we've invested time in and committed to

Political ideologies, long-held hopes and fears, religions, conspiracy theories, and every other idea we invest a significant amount of our lives believing in, studying, and promoting are ideas we've committed to. We've married them, put our hearts into them, made leaps of faith for them, and the longer we've held onto them and convinced ourselves we're right about them, the more we will stand by them and support them, even when faced with irrefutable contradictory evidence.

This is because it's difficult to let go of things we've invested vast amounts of energy into and so publicly supported for so long. Accepting that we were wrong can seem embarrassing, submissive, and belittling, and can do significant damage to our pride and self-esteem, especially when we're proven wrong in front of people we've loudly and vehemently disagreed with for so long. It's not surprising, then, that when faced with evidence that goes against beliefs and ideas we're committed to, we look for ways to rationalize why that evidence is wrong and how our beliefs can still hold up regardless of it.

Once we form beliefs and make commitments to them, we maintain and reinforce them through a number of powerful cognitive biases, including the ones mentioned in this article.

Examples

  • Democrats and Republicans in the USA typically stick by whatever their political party dictates. When listening to talk radio and political commentators on TV, for example, a listener or viewer can almost always predict what a media personality's opinion is going to be on any particular political topic based solely on the party they belong to; that is, they are sticking by the beliefs of the party they have committed to, as opposed to objectively looking at and scrutinizing each individual topic itself. Rarely do you see a liberal or conservative going against the beliefs of their party or admitting that their party is wrong on even one single issue.
  • JFK and 9/11 conspiracy theorists have repeatedly had their many claims debunked by evidence, experimentation, and CGI reenactments of their respective events. Regardless of this, the most committed conspiracy theorists continue to persist in their claims.

Sources and Related Links

Sunk-Cost Bias
The Believing Brain

4.) We Have Selective Memories & Inattention to Probabilities

Have you ever thought about a person, only moments later to have the phone ring and, lo and behold, find that it's them? Or have you ever gotten a bad feeling and later found out something terrible had happened? Or how about had a dream that turned out to be prophetic? It must be some sort of ESP, right? No. We remember moments like these because they stick out more than the millions of other times we've had dreams that weren't prophetic, gotten bad feelings for no reason, or thought about someone without receiving a phone call from them moments later. These are examples of just how selective our memories can be and how little we notice probabilities.

We have an onslaught of random thoughts, worries, hopes, fears, suspicions, and expectations all the time, every day. And while most of the time these feelings amount to nothing, occasionally, due to the laws of probability, these thoughts and feelings will eventually fit with an actual event in our lives. The rarity of the occurrence, though, is precisely what causes it to be so noticeable to us.

After all, it’s the unique, unusual, and rare that always stick out from the bunch. And if the rare coincidences that stick out to us fit in perfectly with what we already believe, we’ll be more than happy to accept those coincidences as being something more than what they really are.
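
Just how often should a "thought of them, then they called" coincidence happen by dumb luck? Here's a back-of-the-envelope sketch; every number in it is an assumption picked only to show the arithmetic, but even with modest figures the eerie matches show up a couple of times a year with no ESP required.

```python
# Back-of-the-envelope estimate; every number here is an assumption
# chosen only to illustrate the arithmetic.
thoughts_per_day = 50          # fleeting thoughts about specific people
calls_per_day = 5              # calls or texts received
people_you_know = 300          # pool of people who might call or be thought of
match_window = 10 / (24 * 60)  # "they called within 10 minutes" as a fraction of a day

# Chance that any given call happens to match someone you thought about
# within the window (crude independence assumptions throughout).
p_match_per_call = thoughts_per_day * match_window / people_you_know

p_no_match_in_a_year = (1 - p_match_per_call) ** (calls_per_day * 365)
print(f"chance a given call matches a recent thought: {p_match_per_call:.5f}")
print(f"expected eerie matches per year:              {p_match_per_call * calls_per_day * 365:.1f}")
print(f"chance of at least one match in a year:       {1 - p_no_match_in_a_year:.2f}")
```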

Examples of Selective Memory

  • In an hour-long reading, a psychic or astrologer will make 200 or 300 vague statements. If a person comes away with half a dozen things the psychic got right, they're ecstatic, and they walk away with no memory of the many times the psychic or astrologer missed the mark.
  • Religious people will happily recall the time their prayers came true and attribute those times to God, neglecting to recall the many times they've prayed for something only to have the opposite, or nothing at all, occur.
  • When horoscopes or fortune cookies predict our futures, they use very general and confusing language that may apply to a wide range of things. Those willing to believe will give meaning to and make sense of the predictions by connecting later occurrences in their life to the predictions they could make fit, while forgetting the ones that didn't.
  • When thinking about any of our beliefs, we tend to remember only evidence that supports them and forget evidence that contradicts them.
  • When thinking about global warming, many of us tend to remember our stretches of recent cold days but ignore the long-term trend of the climate becoming warmer.

Sources and Related Links

The Forer Effect
The Skeptic's Dictionary
Subjective Validation

3.) We Believe Anecdotal Evidence and Misinterpret Our Experiences

Extraordinary claims require extraordinary evidence

Eyewitness testimony and people's personal experiences are seemingly compelling motivations for belief. If we hear the enthusiastic voice of someone who is absolutely positive that they saw an alien UFO, Bigfoot, the Loch Ness Monster, or a ghostly apparition, or had a personal experience in which they saw, heard, felt, or otherwise had contact with a divine being, it can appear to be a very persuasive argument for their belief. After all, they seem so confident. And why would they lie?

The truth is that these people probably aren’t lying. Nor are any of the millions of other people making equally bold claims. But simply because they, or we, feel certain about a belief, does not mean that we’re correct in our assumptions that the belief is valid.

We’re all prone to the same errors in cognition, the same biases, the same ability to misinterpret things, and the same susceptibilities to being fooled and fooling ourselves. We’re all human and humans are fallible creatures who make mistakes. This is why anecdotal evidence (stories recounted by people in support of a claim) is not real evidence for anything at all when it’s not held up by empirical evidence. That’s to say, if a story someone tells can’t be supported by tangible evidence, controlled experiments, and other testable proofs, then the listener of the story (as well as the storyteller themselves) should be very skeptical of the validity of that story being told.

Generally, the more extraordinary an anecdote is, the more extraordinary the evidence for it should be. We should never believe anyone with absolute certainty due to their word alone. Nor should they believe us until we have further corroborating evidence to support our case.
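
That maxim can be put into rough numbers with Bayes' theorem. The sketch below is illustrative only (the priors and witness-reliability figures are invented assumptions): the very same, fairly reliable testimony leaves a mundane claim probably true and an extraordinary claim still almost certainly false.

```python
# Hume's maxim in Bayesian form: posterior odds = prior odds * likelihood ratio.
# All numbers below are assumptions chosen purely to illustrate the arithmetic.

def posterior_probability(prior, p_report_if_true, p_report_if_false):
    """Probability the claim is true after hearing the testimony."""
    prior_odds = prior / (1 - prior)
    likelihood_ratio = p_report_if_true / p_report_if_false
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# An honest, fairly reliable witness: 99% likely to report a thing they saw,
# and only 1% likely to report it mistakenly or falsely (likelihood ratio = 99).
reliable = dict(p_report_if_true=0.99, p_report_if_false=0.01)

# Mundane claim ("I saw a deer") vs. extraordinary claim ("I saw a ghost").
for claim, prior in [("mundane claim", 0.10), ("extraordinary claim", 1e-6)]:
    print(f"{claim:19s} prior={prior:<8g} posterior={posterior_probability(prior, **reliable):.4f}")
```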

Examples of anecdotal evidence

  • Conveying that your grandfather smoked 3 packs a day all of his life and lived to be 97 doesn't mean that cigarettes can't kill you. This is an isolated occurrence that doesn't take into account the much larger number of deaths that were directly due to smoking.
  • Stories about someone's cancer being cured after taking alternative medicine can make it appear as if that alternative medicine was the cure. These stories neglect to consider that the cancer may have gone into remission on its own, which cancers do all the time, or that it may have been misdiagnosed, or, or, or.... Without controlled experiments proving that the medicine was the cure, and that there was no other explanation, it's wrong to believe this anecdote unquestioningly.
  • People who have had near-death experiences often say that they experienced a floating or flying feeling, felt as though they passed through a tunnel with a bright light, or saw loved ones or godlike figures. They then take these experiences as certainty that there's an afterlife. As compelling as these personal experiences may sound, what's to say they weren't simply delusions brought on by failing brains that were in distress (and possibly medicated)? The aforementioned experiences have even been reproduced through electrical stimulation of people's right angular gyrus, and similar feelings have occurred in pilots who have experienced G-LOC (g-force-induced loss of consciousness). This is not to say that people claiming to have seen an afterlife during a near-death experience are wrong, but you must be wary of their claims for the very reason that alternate, more scientific explanations for these experiences are available to us. As Occam's Razor reminds us, the simplest explanation is usually the correct one, even when it's not the most interesting.

Sources and Related Links

Anecdotal Evidence
Hume's Maxim
Extreme States | DiscoverMagazine.com
Out‐of‐body experience and autoscopy of neurological origin
Occam's Razor

2.) We Can Be Easily Fooled by Others

There are plenty of scammers out there, even where we'd least expect them

For as far back as humanity's beginning, there have been those willing to lie, confuse, trick, deceive, and take advantage of others in pursuit of money, power, and the spread of personal agendas, beliefs, and ideologies. And if we're not careful or skeptical enough, we'll buy into whatever it is they're selling.

Keeping this in mind, it's no surprise that magicians like Houdini, James Randi, and Penn & Teller have gone to such great lengths to debunk and disprove people who openly claim to perform "real magic". Houdini and the magicians who followed in his footsteps were, and are, professional illusionists who've worked long and hard to perfect their quirky craft of tricking and deceiving people for the sake of entertainment. They've learned all the tricks and know better than anyone the many, many ways our brains and eyes can fool us into believing weird things. They could spot a fellow trickster from a mile away, and whenever they found one who professed to be a real psychic, mind reader, healer, or fortune teller, they would be sure to openly call them out on it.

As this article demonstrates, our brains fool us all the time by accident alone, due to nothing more than our natural flaws in reasoning and perception. Suffice it to say, if someone is actively and deliberately going out of their way, taking advantage of these flaws to trick us into believing something, they stand a fairly good chance of succeeding.

Examples of how others trick us into believing what they want

  • Psychics, fortune-tellers, mediums, faith healers, etc. use “cold readings” to get information from people without them being aware. They blurt out many general statements and questions until they hit on one that their target responds to. Ex. “Did someone die of lung cancer? Because I’m getting a pain here in my chest,” the psychic may say when supposedly being contacted by someone’s deceased loved one. The target (vulnerable and wanting to believe) may reply, “My father died of a heart attack, could that be it?” “Heart attack? Yes, that explains the chest pains.”

    For cold readings, you start by speaking generally, such as bringing up common ways of dying that have affected many people (car accidents, heart attacks, cancer), and once the target confirms a match you simply continue the technique, gathering more and more specific information as you go. Cold readers commonly employ high-probability guesses, quickly pick up on signals as to whether their guesses are heading in the right direction, then emphasize and reinforce chance connections while quickly moving on from missed guesses. (See Selective Memory above to understand why we remember only the matches and never the misses.)
  • In hot reading (often used in conjunction with cold reading), the reader uses information about their target gathered from background research, overheard conversations, and other devious techniques in order to "read minds" and know personal things. An example of this can be seen in the Related Links below under "James Randi Debunks Peter Popoff Faith Healer".
  • The Forer Effect (also referred to as the Barnum Effect) is utilized in what's known as "warm reading". This is where general personality descriptions (such as "Security is one of your major goals in life," "You have a tendency to be critical of yourself," or "You have trouble opening up to others") are used to describe someone. People believe this information is tailored specifically for them upon hearing it, not realizing the statements are vague and general enough to apply to a wide range of people. This technique is also widely used in horoscopes.
  • The most common career path for today’s psychology students reportedly leads to a life in advertising. Many successful advertisements are designed to take advantage of associations made in the brain and flaws in our cognition.
  • Magicians take advantage of our brain's inability to visually focus on more than one thing at a time. This classic use of misdirection is the most common way illusionists fool us into believing something magical has occurred. It's also a good way to pick someone's pocket without them noticing.

Sources and Related Links

James Randi Debunks Peter Popoff Faith Healer
Barnum Effect
Cold Readings
Houdini | Debunking Spiritualists
Human Brain - How Does it Work


1.) How Even a High Intellect Can Be Misdirected

It's not just dummies who believe dumb things

As mentioned before, belief can arise for many reasons: biological, environmental, cultural, and educational, or from being deliberately fooled, falling prey to propaganda, and so forth. Most such beliefs are instilled in us from a very young age, biasing us from our very beginnings; others we've come to for various reasons that may or may not go against our normal ways of thinking. Contrary to what many think, it's not simply stupidity or lack of education that makes us believe. A belief can be an anomaly in an otherwise completely rational, intelligent mind that just so happens to have gone astray in one particular area. As a matter of fact, high intelligence can actually help strengthen unintelligent beliefs.

It takes an especially keen mind to find copious patterns where there are none and to continuously find ways of rationalizing unequivocally irrational ideas in order to make them appear true, especially when all hard evidence points to the contrary. While someone can be an otherwise rational person, an instilled belief is a difficult thing to let go of. Thus, belief isn’t always something that only naturally gullible or simpleminded people are attracted to; but often it’s very intelligent people using their gifted and imaginative minds to seek out patterns and make connections that support beliefs they've come to for non-intelligent reasons.

It needs to be remembered that our ability to rationalize and detect new patterns didn't come about in order to hinder us. Nor did our mind's ability to take mental shortcuts come about to make us more gullible. This complex and sophisticated information-processing capability is what's contributed most to our amazing strides in understanding the universe and ourselves (and even understanding our understanding). The fact that the same process happens to backfire occasionally is only a small by-product of being the thinking machines we undeniably are.

Examples of how intelligence can contribute to unintelligent beliefs

  • Mathematician John Nash (subject of the book and film A Beautiful Mind), whose work in game theory, differential geometry, and partial differential equations changed the face of science and won him a Nobel Prize, was a schizophrenic whose ability to find brilliant scientific patterns unnoticed by others also contributed to his assembling, and giving meaning to, delusional patterns that weren't really there. In the process, his intelligence helped him rationalize and reaffirm to himself that his auditory and perceptual hallucinations were real events.
  • Conspiracy theorists can have minds like steel traps, capable of remembering and recalling a vast quantity of names, information, and dates at the drop of a hat. And while the patterns they find are often false, it's their mental ability to find so many patterns and put them together (connecting dots even where there are no real connections) that shows just how well their brains function, regardless of how misguided they may be.
  • A father of modern physics and co-inventor of calculus, Isaac Newton spent much of his time and brilliance trying to decode scripture and create the philosopher's stone through alchemy. A futile effort, we now know, but anyone confident enough in their beliefs can find any answers they want if they look hard enough and the scientific method isn't around to fact-check their results.

Sources and Related Links

Heuristics
Heuristics in Judgement and Decision Making
John Forbes Nash, Jr.
Isaac Newton, World's Most Famous Alchemist

Comments


    • Akriti Mattu 2 years ago from Shimla, India

      It took me a while to finish reading this post but wow it was worth it. Vote up.

    • The Gutter Monkey (author) 2 years ago

      @ Kylyssa

      I appreciate the visit, Kylyssa. Hopefully the information here won't seem relevant to only the choir of skeptics and nonbelievers, though. The main point to get across is how all of us are so extremely susceptible to falling into the belief-trap. Not just for religion, but for almost anything. Although, my sneaking suspicion is that most who search out this topic will be doing so in an inadvertent effort to prove point #10, The Confirmation Bias, as being true.

    • Paula (fpherj48) 2 years ago from Beautiful Upstate New York

      Monkey.....THIS is one of the best...most interesting and informative Hubs I've read in a while. Just love this and you've written it so well.

      I'm one of those "curious" people who loves to learn and know more and more, especially when it comes to human nature and behavior.

      Voting this way Up and sharing! Simply fabulous!...UP+++pinned & tweeted

    • Catherine Giordano 2 years ago from Orlando, Florida

      Thanks for the book recommendation. "The Believing Brain" sounds like it is worth reading.

    • Kylyssa Shay 2 years ago from Overlooking a meadow near Grand Rapids, Michigan, USA

      This is an excellent and engaging analysis. Then again, I'm part of the choir, so to speak.

    • The Gutter Monkey (author) 2 years ago

      @ CatherineGiordano

      Hey, thanks for the kind words Catherine. Sorry if I beat you to the punch! Haha

      "Why People Believe Weird Things" is a very interesting book, I hope you get to it. However, my personal recommendation would be Shermer's surprisingly less talked about "The Believing Brain" which goes much deeper into the issues of belief in general (not to mention it's more up-to-date; "Why People Believe Weird Things", while undeniably great, is almost twenty years old already!).

      But anywho, thanks again!

    • Catherine Giordano 2 years ago from Orlando, Florida

      Wow, you blew me away. I was thinking about writing on this subject as soon as I had time to review Michael Shermer's book "Why People Believe Weird Things." You covered all the bases.