How can we avoid losing our minds when trying to talk facts? Some students discovered that they had a genius for the task. Kolbert's article examines the psychology behind our limitations of reason. The belief that vaccines cause autism has persisted, even though the facts paint an entirely different story. Mercier works at a French research institute. James Clear writes about habits, decision making, and continuous improvement. They don't need to wrestle with you too. Kolbert relates this to our ancestors, saying that they were "primarily concerned with their social standing, and with making sure that they weren't the ones risking their lives on the hunt while others loafed around in the cave." These people did not want to solve problems like confirmation bias. An article I found at newscientist.com agrees, saying that it "expresses the tribal thinking that evolution has gifted us: a tendency to seek and accept evidence that supports what we already believe." But if this idea is so ancient, why does Kolbert argue that it is still a very prevalent issue, and how does she say we can avoid it? When it comes to changing people's minds, it is very difficult to jump from one side to another. These misperceptions are bad for public policy and social health. "But I know where she's coming from, so she is probably not being fully accurate," the Republican might think while half-listening to the Democrat's explanation. Changing our mind requires us, at some level, to concede we once held the "wrong" position on something. There is another reason bad ideas continue to live on, which is that people continue to talk about them. I study human development, public health and behavior change. People's ability to reason is subject to a staggering number of biases. Such inclinations are essential to our survival.
In a well-run laboratory, there's no room for myside bias; the results have to be reproducible in other laboratories, by researchers who have no motive to confirm them. Isn't it amazing how when someone is wrong and you tell them the factual, sometimes scientific, truth, they quickly admit they were wrong? Among the many, many issues our forebears didn't worry about were the deterrent effects of capital punishment and the ideal attributes of a firefighter. Some students believed it deterred crime, while others said it had no effect. Of course, what's hazardous is not being vaccinated; that's why vaccines were created in the first place. In a new book, "The Enigma of Reason" (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question. She asks why we stick to our guns even after new evidence is shown to prove us wrong. Others discovered that they were hopeless. Their concern is with those persistent beliefs which are not just demonstrably false but also potentially deadly, like the conviction that vaccines are hazardous. Not usually, anyway. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups. The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our hypersociability. Mercier and Sperber prefer the term "myside bias." Humans, they point out, aren't randomly credulous. Cooperation is difficult to establish and almost as difficult to sustain.
Confirmation bias: the tendency to selectively pay attention to information that supports our beliefs and to ignore information that contradicts them. Instead of just arguing with family and friends, they went to work. This lopsidedness, according to Mercier and Sperber, reflects the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group. At any given moment, a field may be dominated by squabbles, but, in the end, the methodology prevails. The Grinch's heart growing three sizes after he sees that the Whos care about more than presents; Ebenezer Scrooge helping Bob Cratchit after being shown what will happen in the future if he does not change; Darth Vader saving Luke Skywalker after realizing that, though he has done bad things, he is still good: none of these scenarios would make sense if humans could not let facts change what they believe to be true, even when those beliefs were based on false information. When you're at Position 7, your time is better spent connecting with people who are at Positions 6 and 8, gradually pulling them in your direction. Kolbert cherry-picks studies that help to prove her argument and does not show any studies that might disprove her or bring about an opposing argument: that facts can, and do, change our minds. There was little advantage in reasoning clearly, while much was to be gained from winning arguments. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. You take to social media and it stokes the rage. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. According to one version of the packet, Frank was a successful firefighter who, on the test, almost always went with the safest option. At this point, something curious happened. Facts don't change our minds; friendship does.
And the best place to ponder a threatening idea is in a non-threatening environment. In "Why Facts Don't Change Our Minds," an article by Elizabeth Kolbert, the main bias discussed is confirmation bias, also known as myside bias. The New Yorker had published an article under the exact same title one week before, and it went on to become the magazine's most popular article of the week. I allowed myself to realize that there was so much more to the world than being satisfied with what one has always known, believing everything that confirms it, and disregarding anything that goes even slightly against it; that realization contradicts Kolbert's idea that confirmation bias is unavoidable and one of our most primitive instincts. Sloman and Fernbach see this effect, which they call the "illusion of explanatory depth," just about everywhere. Consider what's become known as confirmation bias, the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. For lack of a better phrase, we might call this approach "factually false, but socially accurate." When we have to choose between the two, people often select friends and family over facts. There's enough wrestling going on in someone's head when they are overcoming a pre-existing belief. In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. Understanding the truth of a situation is important, but so is remaining part of a tribe. Why is human thinking so flawed, particularly if it's an adaptive behavior that evolved over millennia? Kolbert's popular article makes a good case for the idea that if you want to change someone's mind about something, facts may not help you.
From my experience: (1) keep emotions out of the exchange; (2) discuss, don't attack (no ad hominem and no ad Hitlerum); (3) listen carefully and try to articulate the other position accurately; (4) show … Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it's the subject of entire textbooks' worth of experiments. Change their behavior or belief so that it's congruent with the new information. "Because it threatens their worldview or self-concept," they wrote. Among the other half, suddenly people became a lot more critical. The Gormans, too, argue that ways of thinking that now seem self-destructive must at some point have been adaptive. The book has sold over 10 million copies worldwide and has been translated into more than 50 languages. Humans need a reasonably accurate view of the world in order to survive. The Harvard psychologist Steven Pinker put it this way: "People are embraced or condemned according to their beliefs, so one function of the mind may be to hold beliefs that bring the belief-holder the greatest number of allies, protectors, or disciples, rather than beliefs that are most likely to be true." It's because they believe something that you don't believe. As The New Yorker put it, science moves forward, even as we remain stuck in place. A recent experiment performed by Mercier and some European colleagues neatly demonstrates this asymmetry. And why would someone continue to believe a false or inaccurate idea anyway?
Cognitive psychology and neuroscience studies have found that the exact opposite is often true when it comes to politics: people form opinions based on emotions, such as fear, contempt, and anger. Changing our mind about a product or a political candidate can be undesirable because it signals to others that "I was wrong" about that candidate or product. People believe that they know way more than they actually do. Rioters gathered there under false pretenses of election fraud, wanting justice for something that had no facts to back it up. (They can now count on their side, sort of, Donald Trump, who has said that, although he and his wife had their son, Barron, vaccinated, they refused to do so on the timetable recommended by pediatricians.) To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats (the human equivalent of the cat around the corner), it's a trait that should have been selected against. As everyone who's followed the research (or even occasionally picked up a copy of Psychology Today) knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. That means that, even when presented with facts, our opinion has already been determined, and we may actually hold that view even more strongly to fight back against the new information. Justify their behavior or belief by changing the conflicting cognition. The economist J.K. Galbraith once wrote, "Faced with a choice between changing one's mind and proving there is no need to do so, almost everyone gets busy with the proof." Scouts, meanwhile, are like intellectual explorers, slowly trying to map the terrain with others. Most people argue to win, not to learn. Kolbert tries to show us that we must think about our own biases; she uses her rhetoric to urge us to be more open-minded, cautious, and conscious while taking in and processing information so as to avoid confirmation bias. But how well does Kolbert do in keeping her own biases about this issue at bay throughout her article? I would say most of us have a reasonably accurate model of the actual physical reality of the universe. Government and private policies are often based on misperceptions, cognitive distortions, and sometimes flat-out wrong beliefs. What allows us to persist in this belief is other people. Enrollment in the humanities is in free fall at colleges around the country. But you have to ask yourself, what is the goal? You read the news; it boils your blood. If you divide this spectrum into 10 units and you find yourself at Position 7, then there is little sense in trying to convince someone at Position 1. Such a mouse, bent on confirming its belief that there are no cats around, would soon be dinner. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. The best thing that can happen to a good idea is that it is shared. Participants were asked to answer a series of simple reasoning problems.
This week on Hidden Brain, we look at how we rely on the people we trust to shape our beliefs, and why facts aren't always enough to change our minds. We live in an era where we are immersed in information and opinion exchange. The midwife told her that years earlier, something bad had happened after she vaccinated her son. "If you negate a frame, you have to activate the frame, because you have to know what you're negating," he says. Books resolve this tension. In their groundbreaking account of the evolution and workings of reason, Hugo Mercier and Dan Sperber set out to solve this double enigma.