Recently I was attempting to estimate the probability that we are living in a simulation. Watching my brain's natural attempt to estimate the probability of this eerie abstraction, I saw that it seemed to do two things: (a) poll my mental models of people I thought of as smart who had considered the question; and (b) discount that percentage by the high percentage of smart people I know who haven't considered the question (though this second is really about my confidence in my assignment of probability, rather than the probability itself - the hash I see my brain performing is to adjust the probability toward the status quo in response to a drop in confidence in a strange assertion).
I have no idea if my introspective experience† is in any way universal, but it does seem that polling one's epistemic peers is an excellent means of estimating probabilities. This is basically like looking things up on Wikipedia, which is also a highly rational action, but polling one's own personally vetted epistemic peers removes some of the uncertainty.
Noticing how I use my mental models of my epistemic peers in thinking about things caused me to notice how startling the effect of a single epistemic peer can be on my confidence in widely accepted beliefs. When I get to know someone and find that I have no choice but to grant that his brain works at least as well as mine, then any strange belief he holds necessarily alters my confidence in the commonly held belief.
One way that we protect important beliefs is to exclude, by definition, those holding opposing beliefs from the ranks of our epistemic peers. If we refuse to admit them into polite society, they can't harm the stable, practical ways of thinking that we have developed.
What this suggests to me is that, if you want your strange belief accepted by a wider audience, it's much more effective to establish yourself as an epistemic peer of a wide, influential group than it is to develop "convincing" arguments for your strange belief. The existence of an epistemic peer who holds a contrary belief is more devastating than any argument.
What this also suggests is that if we are really interested in the truth, we will surround ourselves with epistemic peers who hold beliefs as different from our own as possible, and try to figure out why similar brains have come to hold such different beliefs. We will counteract our social belief-protection systems.
I am eager to test this - if anyone knows any 150+ IQ evolution deniers, please send them my way! (Basically I think this is how Chip Smith lives his life.)
† As usual, I would like to thank marijuana for its important contributions to this post.