In principle, science should set itself apart from the hue and cry of partisan bickering. After all, the scientific enterprise reaches its conclusions by testing hypotheses about the workings of the natural world. Consider the porpoise. Based on its appearance and aquatic home, the animal was assumed to be a fish. But evidence gleaned from observing its bone structure, its lack of gills and the genes it holds in common with other warm-blooded land animals leads to its being classified as a mammal with a very high level of confidence.

March for Science in Los Angeles, one of many such marches held in 2017, sought to bolster support for the scientific community and for action on issues such as climate change. Pro-Trump counterdemonstrators also rallied. Credit: Sarah Morris/Getty Images

Yet a consensus about what constitutes a fact does not always come so readily. Take a glance at your online news feed. On a regular basis, government decision-makers enact policies that fail to heed decades of evidence on climate change. In public opinion surveys, a majority of Americans choose not to accept more than a century’s worth of evidence on evolution by natural selection. Academic intellectuals put the word “science” in quotes, and members of the lay public reject vaccinations for their children.

Scientific findings have long met with ambivalent responses: A welcome mat rolls out instantly for horseless buggies or the latest smartphones. But hostility arises just as quickly when scientists’ findings challenge the political or religious status quo. Some of the British clergy strongly resisted Charles Darwin’s theory of evolution by natural selection. Samuel Wilberforce, bishop of Oxford, asked natural selection proponent Thomas Huxley, known as “Darwin’s bulldog,” on which side of his family Huxley claimed descent from an ape.

In Galileo’s time, officials of the Roman Catholic Church, well-educated and progressive intellectuals in most respects, expressed outrage when the Renaissance scientist reported celestial observations that questioned the prevailing belief that Earth was the center of the universe. Galileo was placed under house arrest and forced to recant his views as heresy.

In principle, scientific thinking should lead to decisions based on consideration of all available information on a given question. When scientists encounter arguments not firmly grounded in logic and empirical evidence, they often presume that purveyors of those alternative views either are ignorant of the facts or are attempting to discourage their distribution for self-serving reasons—tobacco company executives suppressing findings linking tobacco use to lung cancer, for instance. Faced with irrational or tendentious opponents, scientists often grow increasingly strident. They respond by stating the facts more loudly and clearly in the hope that their interlocutors will make more educated decisions.

Several lines of research, however, reveal that simply presenting a litany of facts does not always lead to more objective decision-making. Indeed, in some cases, this approach might actually backfire. Human beings are intelligent creatures, capable of masterful intellectual accomplishments. Unfortunately, we are not completely rational decision-makers.

Understanding why people engage in irrational thinking requires combining knowledge from a range of psychological disciplines. As authors, each of us studies a separate area addressing how biased views originate. One of us (Cialdini) has expertise in heuristics, the rules that help us to quickly make everyday choices. Another of the authors (Kenrick) has studied how decisions are distorted by social motives such as the desire to find a mate or protect oneself from physical harm.

Yet another of us—Cohen—has investigated how religious beliefs affect judgment. Finally, Neuberg has studied simple cognitive biases that lead people to hold on to existing beliefs when confronted with new and conflicting evidence. All of us, in different ways, have tried to develop a deeper understanding of the psychological mechanisms that warp rationality.

Explaining why thinking goes astray is critically important to dispel false beliefs that circulate among politicians, students or even misinformed neighbors. Our own research and that of our colleagues have identified key obstacles that stand in the way of clear scientific thought. We have investigated why they arise and how they might be challenged and ultimately knocked down. Among the many hurdles, three in particular stand out:

Shortcuts. Human brains are endowed with a facile means for dealing with information overload. When we are overwhelmed or are too short on time, we rely on simple heuristics, such as accepting the group consensus or trusting an expert.

Confirmation Bias. Even with ample time and sufficient interest to move beyond shortcuts, we sometimes process information in a manner less like an impartial judge and more like a lawyer working for the mob. We show a natural tendency to pay attention to some findings over others and to reinterpret mixed evidence to fit with preexisting beliefs.

Social Goals. Even if we surmount the first two obstacles, powerful forms of social motivation can interfere with an objective analysis of the facts at hand. Whether one is biased toward reaching one scientific conclusion versus another can be influenced by the desire to win status, to conform to the views of a social network or even to attract a mate.

Beware the Shortcut

Mastery of the sciences requires dealing with a set of difficult concepts. Take Darwin’s theory of natural selection. To understand it, one must comprehend a set of logical premises—that environments with limited resources favor individuals who are better able to procure food, shelter and mates, so that the traits conferring these abilities become disproportionately represented in future generations. The student of Darwinian theory must also know something about comparative anatomy (whales have bone structures more similar to those of humans than to those of fish). Another prerequisite is familiarity with ecology, modern genetics and the fossil record.
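To make that chain of reasoning concrete, here is a minimal toy simulation, our own illustration rather than anything drawn from a study cited in this article. Individuals vary in a single "foraging ability" trait, only the most able half reproduce each generation, and offspring inherit their parent's trait with a little random variation. Over generations the average ability in the population climbs.

```python
import random

random.seed(1)

def simulate(generations=20, pop_size=200, carrying_capacity=100):
    # Each individual is represented only by a "foraging ability" score in [0, 1].
    population = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        # Limited resources: only the most able half survive to reproduce.
        survivors = sorted(population, reverse=True)[:carrying_capacity]
        # Each survivor leaves two offspring that inherit its ability,
        # plus a little random variation (mutation).
        population = [
            min(1.0, max(0.0, parent + random.gauss(0, 0.02)))
            for parent in survivors
            for _ in range(2)
        ]
    return sum(population) / len(population)

print(f"mean foraging ability after selection: {simulate():.2f}")
```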

Although natural selection stands out as one of the most solidly supported scientific theories ever advanced, the average citizen has not waded through textbooks full of evidence on the topic. In fact, many of those who have earned doctorates in scientific fields, even in medical research, have never taken a formal course in evolutionary biology. In the face of these challenges, most people rely on mental shortcuts or the pronouncements of experts, both strategies that can lead them astray. They may also rely—at their own peril—on intuition and gut instinct.

We use heuristics because they frequently work quite well. If a computer malfunctions, users can spend months learning about its various electronic components and how they are connected—or they can ask a computer technician. If a child develops a serious health problem, parents can study the medical literature or consult a physician.

But sometimes shortcuts serve us poorly. Consider a classic 1966 study by psychiatrist Charles K. Hofling and his colleagues on how things can go terribly wrong when people rely on the title “Dr.” as a cue to an individual’s authority. In the study, nurses working on a busy hospital ward received a phone call from a man who identified himself as the physician of a patient on their floor. The stranger on the phone asked the nurses on duty to go to the medicine cabinet and retrieve an unfamiliar drug called Astroten and to administer a dose twice as high as the daily maximum, violating not only the boldly stated guidelines on the label but also a hospital policy requiring handwritten prescriptions. Did the nurses balk? Ninety-five percent obeyed the unknown “doctor” without raising any questions. Indeed, they had to be stopped on their way to the patient’s room with the potentially dangerous drug in hand. The nurses had unknowingly applied what is known as the authority heuristic, trusting too readily in a person in a position of responsibility.

Confirmation Bias

When we care enough about a topic and have the time to think about it, we move beyond simple heuristics to a more systematic analysis of the actual evidence. But even when we try hard to retain an objective perspective, our existing knowledge may still get in the way.

Abundant evidence suggests that people pay selective attention to arguments that simply reinforce their own viewpoints. They find disagreement unpleasant and are inclined to dislike the bearer of positions that run counter to their current beliefs. But what happens if intelligent individuals are forced to consider evidence on both sides of an issue?

In 1979 Charles Lord, then at Stanford University, and his colleagues conducted a study with Stanford students, who should have been able to make reasonable judgments about scientific information. The students were exposed to several rounds of scientific evidence on the deterrence effect of the death penalty. They might first read a description of a study that questioned whether capital punishment prevents serious crime. It compared murder rates for the year before and the year after the implementation of capital punishment in 14 states. In 11 of the states, murder rates climbed after the death penalty was established, implying that it lacks a deterrent effect.

Next, the students heard arguments from other scientists about possible weaknesses in that study’s evidence. Then the original researchers came back with counterarguments. After that, the students heard about a different type of study suggesting the opposite: that capital punishment stops others from committing crimes. In it, researchers compared murder rates in 10 pairs of neighboring states with different capital punishment laws. In eight of the paired states, murder rates notched lower with capital punishment on the books, supporting the death penalty. Then students heard that evidence challenged, followed by a counterargument to that challenge.

If the students began with a strong opinion one way or the other and then performed a cold, rational analysis of the facts, they might have been expected to gravitate toward a middle ground in their views, having just heard a mix of evidence that undercut both the case for and the case against capital punishment. But that is not what happened. Rather, students who previously favored the death penalty became even more disposed toward it, and opponents turned more disapproving. It became clear that students on either side of the issue had not processed the information in an evenhanded manner. Instead they judged evidence that reinforced their position to be stronger and dismissed refutations of that evidence as weak. So even if counterarguments can make it past our inner censors, we show an inclination to weigh those arguments in a very biased manner.

A study published in September 2017 by Anthony N. Washburn and Linda J. Skitka, both then at the University of Illinois at Chicago, seems to reinforce the Stanford paper’s findings. The investigators tested the hypothesis that conservatives are more distrustful of scientific evidence than liberals, perhaps because such individuals exhibit rigid thinking and are less open to new experiences. What they discovered, though, is that those on both the right and the left reject scientific findings that do not jibe with their own political ideologies. The authors gave 1,347 study participants scientific evidence on six hot-button issues—climate change, gun control, health-care reform, immigration, nuclear power and same-sex marriage. For each issue, a cursory look at the evidence tended to favor one side—in the gun-control case, for instance, the absolute number of crimes in cities with stricter gun control might be higher than in cities without it. But a closer look at the data might give credence to the opposite view—the percentage reduction in crime in those same cities might actually be greater than in cities lacking gun-control laws.
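How can both readings be true at once? A quick back-of-the-envelope sketch with invented numbers (not the actual figures shown to participants) makes the point: a larger city with strict gun control can report more crimes in absolute terms while still showing the bigger percentage drop and the lower per-capita rate.

```python
# Hypothetical figures, for illustration only.
cities = {
    # name: (population, crimes before the law, crimes after the law)
    "strict gun control": (2_000_000, 12_000, 9_000),
    "no gun control":     (  500_000,  4_000,  3_600),
}

for name, (population, before, after) in cities.items():
    pct_change = 100 * (after - before) / before   # percentage change in crime
    per_capita = 100_000 * after / population      # crimes per 100,000 residents
    print(f"{name}: {after:,} total crimes, "
          f"{pct_change:+.0f}% change, {per_capita:.0f} per 100,000")

# The strict-control city has more crimes in absolute terms (9,000 vs. 3,600)
# yet the larger percentage reduction (-25% vs. -10%) and the lower rate
# (450 vs. 720 per 100,000). Which number a reader seizes on depends on
# which conclusion the reader wants to reach.
```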

If the initial hasty inspection of the data tended to favor the anti-gun-control group’s expectations, members would generally look no further, content with finding results that supported their particular bias. If the results contradicted the beliefs of the gun advocates, they would scrutinize the details of the study until they discovered the numbers that suggested the opposite conclusion. If the researchers, moreover, later told one of the groups that results favored the opposite side, its members tended to be skeptical of the scientists who conducted the studies.

The Social Pressure Gauntlet

Additional obstacles arise from the same powerful social impulses that help us get along with others. Take the scenario of an office party where an individual’s co-workers sound off with erroneous claims about evolution, global warming or a supposed link between vaccines and autism. Confronted with that situation, does one object or keep quiet to avoid seeming disruptive?

Research on conformity runs deep in the psychological annals. In a classic 1951 study of group dynamics, psychologist Stanley Schachter observed what happened to an individual who disagreed with the majority’s consensus. After trying unsuccessfully to change the divergent opinion, other group members ended up cutting off any further communication, ostracizing the outlier. A 2003 functional magnetic resonance imaging study by Kipling D. Williams, now at Purdue University, and his colleagues found that ostracism activates the brain’s dorsal anterior cingulate cortex—the same region recruited when we experience physical pain. In a 2005 study, a team led by Gregory Berns, a neuroeconomics professor at Emory University, found that disagreeing with a group to which you belong is associated with increased activity in the amygdala, an area that turns on in response to different types of stress. Holding an opinion different from that of other group members, even a correct one, hurts emotionally. It therefore comes as no surprise that people are often reluctant to provide evidence counter to what the rest of their group believes.

Social pressures can also influence how we process new information. Group consensus may encourage us to take recourse in heuristics or to cling tightly to an opinion, either of which can interfere with objective thinking.

Our own research team conducted a study in which participants first made aesthetic judgments about a series of abstract designs and paintings and then read a passage designed to put them in either a self-protective or a romantic frame of mind. In the former condition, you might be asked to imagine being awakened by a loud sound while alone at home. As the scenario unfolds, it becomes clear that an intruder has entered the house. You imagine reaching for the phone but finding that the line is dead. A call for help receives no response. Suddenly, the door to the bedroom bursts open to reveal the dark shadow of a stranger standing there.

Alternatively, you might be randomly assigned to read an account of a romantic encounter and asked to imagine being on vacation and meeting an attractive person, then spending a romantic day together that ends with a passionate kiss. Next you would enter a virtual chat room, joining three other participants to evaluate abstract images, including one you had earlier judged to be of average interest. Before making the second judgment, though, you learn that the other subjects have rated this image well below average.

So did study subjects change their initial judgment to conform to the other group members? How people responded depended on their current goals. Study participants who had read the home break-in scenario were more likely to conform to the group judgment. In contrast, those exposed to the amorous story answered differently depending on gender: women conformed, but men actually went against the group’s judgment.

Other studies by our team have found that fear can lead both men and women to comply with group opinion, whereas sexual motives prompt men to try to stand out from the group, perhaps to show that they are worthy mates. Men, in this frame of mind, are more likely to challenge the consensus and increase the riskiness of their actions. In all cases, though, our participants’ views were shaped by their social goals in the moment. They did not process available information in a completely objective way.

What to Do

If the human mind is built with so many obstacles to objective scientific thinking, should we just give up and accept that ignorance and bias will always triumph? Not at all. Research in social psychology also suggests ways of coping with heuristics, confirmation biases and social pressures.

We have seen that people frequently rely on heuristics when they lack the time or interest to carefully consider the evidence. But such rules of thumb can often be defeated with simple interventions. In one experiment by market researchers Joseph W. Alba and Howard Marmorstein, subjects considered information about a dozen separate features of two cameras. Brand A was superior to brand B on just four of the features, but those four were critical to camera quality—exposure accuracy, for instance. Brand B was rated superior on eight features, all of which were relatively unimportant—having a shoulder strap, for example. Some subjects examined each attribute for only two seconds; others had more time to study all the information.

When they had only two seconds to evaluate each feature, only a few subjects (17 percent) preferred the higher-quality camera, most opting instead for the one with a greater number of unimportant functions. When the subjects were given sufficient time and allowed to directly compare the two cameras, however, more than two thirds favored the camera with the few features key to its overall quality. These results suggest that people communicating complicated evidence need to give their audience enough time to switch from a heuristic to a systematic mode of thinking that allows for a better overall evaluation.
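The contrast between the two modes of thinking can be written as two different decision rules. In the sketch below, with feature weights we invented for illustration (they are not Alba and Marmorstein's materials), a fast heuristic simply counts how many features each brand wins, while a systematic evaluation weights each feature by how much it matters to picture quality.

```python
features = {
    # feature: (importance weight, brand that is superior on it)
    "exposure accuracy":  (0.30, "A"),
    "lens sharpness":     (0.25, "A"),
    "autofocus speed":    (0.20, "A"),
    "low-light quality":  (0.15, "A"),
    "shoulder strap":     (0.02, "B"),
    "carrying case":      (0.02, "B"),
    "color options":      (0.01, "B"),
    "free film roll":     (0.01, "B"),
    "logo design":        (0.01, "B"),
    "box artwork":        (0.01, "B"),
    "warranty card":      (0.01, "B"),
    "manual length":      (0.01, "B"),
}

def count_wins(brand):
    # Heuristic rule: just tally the features on which the brand is superior.
    return sum(1 for _, winner in features.values() if winner == brand)

def weighted_score(brand):
    # Systematic rule: add up the importance of the features the brand wins.
    return sum(weight for weight, winner in features.values() if winner == brand)

for brand in ("A", "B"):
    print(f"Brand {brand}: {count_wins(brand)} features won, "
          f"weighted score {weighted_score(brand):.2f}")

# The count favors brand B (8 wins to 4), but the weighted evaluation favors
# brand A (0.90 to 0.10), mirroring the shift subjects made once they had
# enough time to compare the cameras feature by feature.
```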

Confirmation biases can often be overcome by changing one’s perspective. The same Stanford researchers who studied attitudes toward capital punishment also investigated how to change them. They instructed some students to remain objective and weigh evidence impartially in making a hypothetical decision related to the death penalty. That instruction had no effect. Others were asked to play their own devil’s advocate by considering what their opinions would have been if the research about the death penalty had contradicted their own views. Biases suddenly vanished—students no longer used new evidence to bolster existing preconceptions.

One way to counteract social pressures requires first exploring whether agreement within the group really exists. Someone who disagrees with an erroneous opinion can sometimes open other group members’ minds. In a 1955 Scientific American article, social psychologist Solomon E. Asch described studies on conformity, finding that if a single person in the group disagreed with the majority, consensus broke down. Similarly, in Stanley Milgram’s famed studies of obedience—in which participants were led to believe that they were delivering painful shocks to an individual with a heart problem—unquestioned obedience dissipated if other team members chose not to obey.

Fear increases the tendency toward conformity. If you wish to persuade others to reduce carbon emissions, take care whom you scare: a message that arouses fear of a dystopian future might work well for an audience that accepts the reality of climate change but is likely to backfire for a skeptical audience.

We have provided a few simple suggestions for overcoming psychological obstacles to objective scientific thinking. There is a large literature on persuasion and social influence that could be quite useful to anyone attempting to communicate with a group holding beliefs that fly in the face of scientific evidence. For their part, scientists need to adopt a more systematic approach in collecting their own data on the effectiveness of different strategies for confronting antiscientific thinking about particular issues. It is essential to understand whether an individual’s resistance to solid evidence is based on simple heuristic thinking, systematic bias or particular social motives.

These steps are critical because antiscientific beliefs can lead to reduced research funding and a consequent failure to fully understand potentially important phenomena that affect public welfare. In recent decades government funding has decreased for research into the health effects of keeping guns in the home and into ways of reducing the harmful effects of air pollution. Guns in the home are frequently involved in teenage suicides, and an overwhelming scientific consensus shows that immediate measures are needed to address the planet’s warming.

It is easy to feel helpless in the face of our reluctance to embrace novel scientific findings. Still, there is room for optimism: the majority of Galileo’s fellow Italians and even the pope now accept that our planet revolves around the sun, and most of Darwin’s compatriots today endorse the theory of evolution. Indeed, the Anglican Church’s director of public affairs wrote an apology to Darwin for the 200th anniversary of his birth. If scientists can incorporate the insights of research on the psychological obstacles to objective thinking, more people will accept objective evidence of how the natural world functions as well.