Science in our Hearts
I recently read an article that helped me make sense of a phenomenon I've been seeing all around me lately. I think of it as the hallmark of internet-based discussion: people with opposing positions on a topic can read the same fact-packed article on that topic and come to opposing conclusions about the validity of the facts it contains.

It happened with me and some of my friends over the Planned Parenthood "sting" videos that came out in late January this year. When I heard about the videos, I immediately connected them in my mind to the ACORN videos that were used to shutter the voter-registration organization. After a detailed investigation, those videos were shown to have been edited to smear ACORN, which was later exonerated. So I assumed that this Planned Parenthood video was more of the same. People with an anti-Planned Parenthood bias assumed it was representative of business as usual at Planned Parenthood.

At the end of the day, I was pleased that the one clinic manager was fired, as it appears she acted heinously and inappropriately, and that Planned Parenthood reported what it thought might be a sex-trafficking ring to federal authorities for investigation. At the end of the same day, my friends thought it wasn't enough to fire the one clinic manager, because they took her not as an outlier, but as a representative example of the group. They also thought that the report to the FBI came AFTER the sting video was released, as a defensive move, and not as one motivated by actual concern for the health of potential victims of sex trafficking. We were all reading the same articles; we all saw the same events unfolding. It reinforced my belief that I can trust Planned Parenthood (most of the time) to do the right thing. It reinforced their beliefs that they cannot trust Planned Parenthood (most of the time) to do the right thing.
It turns out that that's the way the human mind is programmed to work. According to the article, "It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts." And that, my friends, is the heart and soul of internet-based discussions, comment wars, flame wars, and bulletin boards. Once you've established that you disagree with someone, they can say anything they want, and you're unlikely to listen to any of it, except to rationalize why it's wrong, to refute their facts, and to question the validity of their sources. It turns out that the well-educated are even more susceptible to this. Those who don't know much about a topic, but have strong feelings about it anyway, tend to be slightly more amenable to changing their minds when presented with the facts. Those who already know a lot tend to use their education to pick apart the science, even when the science is good.
The article is full of fascinating examples of how and why exactly this happens, from the Iraq/Al Qaeda link to the vaccines/autism link, and especially regarding climate change. It turns out that if you want someone to change their mind on a topic, you not only have to approach them with facts, you have to present those facts wrapped in values that person already holds.