An analysis of everything from views on climate change to perceptions of risk from GMOs or vaccines turns up one consistent finding: people tend to believe what their friends, affiliates, or peer group believe. People's views on these topics have less to do with any hostility toward or love of science, or even political affiliation (beyond climate change), and more to do with whom they identify with.

To put this all in a less scientific context, we need only think of Hastorf's 1954 experiment, in which researchers showed students from two Ivy League colleges (Dartmouth and Princeton) a film of an American football game between the two schools in which officials made a series of controversial calls. Asked to make their own assessments, students who attended the offending team's college reported seeing half as many illegal plays as did students from the opposing institution. Group ties, the researchers concluded, had unconsciously motivated students from both colleges to view the footage in a manner that favoured their own school.

Since 2000, Dan Kahan has undertaken a series of studies to further investigate this phenomenon. In one of them, from 2010, he and his colleagues attributed quotes about the safety or risks of vaccines and mandatory vaccination to either an older, hierarchical "expert" or an "egalitarian" one, and found that reversing which speaker supposedly said what effectively eliminated the expected psychological bias (the expected bias being that individualistic people oppose mandatory vaccines whereas more hierarchically oriented people support them).

Kahan concludes his summary of these varied studies in Nature with the following:

Like fans at a sporting contest, people deal with evidence selectively to promote their emotional interest in their group. On issues ranging from climate change to gun control, from synthetic biology to counter-terrorism, they take their cue about what they should feel, and hence believe, from the cheers and boos of the home crowd.

But if we all evaluate evidence selectively, how can we ever come to an objective understanding? The answer lies at the root of what makes science a valid way of figuring out the world: by taking all available evidence and trying to disprove or confirm each piece beyond a reasonable doubt, we can arrive at a factual position. How we then interpret that position, and how we apply it to related but non-identical situations, is another question entirely. If you want to know why people hold wacky beliefs, you might be wise to look at their friends.