“There was a scientific study that showed vaccines cause autism.” “Actually, the researcher in that study lost his medical license, and overwhelming research since then has shown no link between vaccines and autism.” “Well, regardless, it’s still my personal right as a parent to make decisions for my child.” Does that exchange sound familiar? A debate that starts with testable factual statements, but then, when the truth becomes inconvenient, the person takes a flight from facts.

As public debate rages about issues such as immunization, Obamacare and same-sex marriage, many people try to use science to bolster their arguments. And because it’s becoming easier to test and establish facts—whether in physics, psychology or policy—many have wondered why bias and polarization have not been defeated. When people are confronted with facts, such as the well-established safety of immunization, why do these facts seem to have so little effect?

Our research, published in 2015 in the Journal of Personality and Social Psychology, examined a slippery way by which people get away from facts that contradict their beliefs. Of course, sometimes people just dispute the validity of specific facts. But we find that sometimes they go one step further and, as in the opening example, reframe an issue in untestable ways. This makes potentially important facts and science ultimately irrelevant to the issue.

Let’s consider the issue of same-sex marriage. Facts could be relevant to whether it should be legal—for example, if data showed that children raised by same-sex parents are worse off than, or just as well off as, children raised by opposite-sex parents. But what if those facts contradict one’s views?

We presented 174 American participants who supported or opposed same-sex marriage with (supposed) scientific facts that supported or disputed their position. When the facts opposed their views, our participants—on both sides of the issue—were more likely to state that same-sex marriage isn’t actually about facts: it’s more a question of moral opinion. But when the facts were on their side, they more often stated that their opinions were fact-based and much less about morals. In other words, we observed something beyond the denial of particular facts: we observed a denial of the relevance of facts.

In a similar study with 117 religious participants, we had some read an article critical of religion. Compared with those who read a neutral article, believers who were especially high (but not low) in religiosity were more likely to cite untestable “blind faith” arguments, rather than arguments based on factual evidence, as reasons for their beliefs.

These experiments show that when people’s beliefs are threatened, they often take flight to a land where facts do not matter. In scientific terms, their beliefs become less “falsifiable” because they can no longer be tested scientifically for verification or refutation.

For instance, sometimes people dispute government policies on the grounds that they don’t work. Yet if facts suggest that the policies do work, the same person might remain resolutely opposed on principle. We can see this on both sides of the political spectrum, whether it’s conservatives and Obamacare or liberals and the “surge” in Iraq in 2007.

One would hope that objective facts could allow people to reach consensus more easily, but American politics are more polarized than ever. Could this polarization be a consequence of feeling free of facts?

While it is difficult to objectively test that idea, we can experimentally assess a fundamental question: When people are made to see their important beliefs as relatively less rather than more testable, does it increase polarization and commitment to desired beliefs? Two experiments we conducted suggest so.

In an experiment with 179 Americans (conducted during President Barack Obama’s tenure), we reminded roughly half of participants that much of Obama’s policy performance was empirically testable and did not remind the other half. Then participants rated Obama’s performance on five domains (such as job creation). Comparing opponents and supporters of Obama, we found that the reminder of testability reduced the average polarized assessments of Obama’s performance by about 40 percent.
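
To make the size of that effect concrete, here is a minimal, hypothetical sketch (in Python) of how such a polarization gap could be quantified: take the absolute difference between supporters’ and opponents’ mean performance ratings within each condition, then compare the gap with and without the testability reminder. The rating scale and all numbers below are invented for illustration; this is not the study’s actual data or analysis.

```python
# Hypothetical illustration only: invented ratings on a 1-9 scale.
# "Polarization" here is the gap between supporters' and opponents'
# mean ratings of Obama's performance within a condition.
from statistics import mean

ratings = {
    ("no_reminder", "supporters"): [8, 9, 7, 8, 8],
    ("no_reminder", "opponents"):  [2, 2, 2, 1, 3],
    ("reminder",    "supporters"): [7, 8, 7, 7, 7],
    ("reminder",    "opponents"):  [4, 3, 4, 4, 3],
}

def polarization(condition):
    """Absolute gap between the two groups' mean ratings in one condition."""
    return abs(mean(ratings[(condition, "supporters")])
               - mean(ratings[(condition, "opponents")]))

baseline = polarization("no_reminder")   # gap without the reminder
reminded = polarization("reminder")      # gap with the reminder
reduction = (baseline - reminded) / baseline * 100

print(f"Gap without reminder: {baseline:.1f}")   # 6.0
print(f"Gap with reminder:    {reminded:.1f}")   # 3.6
print(f"Reduction in polarization: {reduction:.0f}%")  # 40%
```

In the actual study the comparison was of course made across participants with appropriate statistical tests; the sketch is only meant to show what a “40 percent reduction in polarized assessments” refers to.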

To further test the hypothesis that people strengthen their desired beliefs when those beliefs are free of facts, we looked at a sample of 103 participants who ranged from moderately to highly religious. We found that when highly (but not more moderately) religious participants were told that God’s existence will always be untestable, they afterward reported stronger desirable religious beliefs (such as the belief that God was looking out for them), relative to when they were told that one day science might be able to investigate God’s existence.

Together these findings show that, at least in some cases, when testable facts are less a part of the discussion, people dig deeper into the beliefs they wish to have—such as viewing a politician in a certain way or believing God is constantly there to provide support. These results bear similarities to the many studies finding that when facts are fuzzier, people tend to exaggerate desired beliefs.

So after examining the power of untestable beliefs, what have we learned about dealing with human psychology? We have learned that bias is a disease and to fight it we need a healthy treatment of facts and education. We find that when facts are injected into the conversation, the symptoms of bias become less severe. But, unfortunately, we have also learned that facts can only do so much. To avoid coming to undesirable conclusions, people can fly from the facts and use other tools in their deep, belief-protecting toolbox.

With the disease of bias, then, societal immunity is better achieved when people are encouraged to accept ambiguity, engage in critical thinking and reject strict ideology. Such a society is something that the Common Core State Standards for education, and at times The Daily Show, are at least in theory helping to create. We will never eradicate bias—not from others, not from ourselves, not from society. But we can become more free of ideology and less free of facts.