I’m informed this ecard is “very me”, which I suppose is probably true, considering the source. But in my defense, I should like to point out that when people quite reliably fail to understand what one is saying in a very clear, precise, and articulate manner, assuming that one will have to explain oneself to the unwashed, overweight, and quasi-illiterate masses is the decent and civilized thing to do. Contra common assumptions, assuming MPAI (Most People Are Idiots) is much fairer to one’s audience and one’s interlocutors than the assumption that everyone is capable of understanding what one is saying. While I may happen to be arrogant, that is merely a coincidence, and the impression that I may occasionally appear to be talking down to people is less an indicator of that arrogance than material evidence that I am a decent and civilized individual dedicated to mutual comprehension in conversation.
However, it did remind me of this article in the New Yorker, which helps explain why so many intelligent and educated people regularly make such prodigious asses of themselves:
Perhaps our most dangerous bias is that we naturally assume that everyone else is more susceptible to thinking errors, a tendency known as the “bias blind spot.” This “meta-bias” is rooted in our ability to spot systematic mistakes in the decisions of others—we excel at noticing the flaws of friends—and inability to spot those same mistakes in ourselves. Although the bias blind spot itself isn’t a new concept, West’s latest paper demonstrates that it applies to every single bias under consideration, from anchoring to so-called “framing effects.” In each instance, we readily forgive our own minds but look harshly upon the minds of other people.
And here’s the upsetting punch line: intelligence seems to make things worse. The scientists gave the students four measures of “cognitive sophistication.” As they report in the paper, all four of the measures showed positive correlations, “indicating that more cognitively sophisticated participants showed larger bias blind spots.” This trend held for many of the specific biases, indicating that smarter people (at least as measured by S.A.T. scores) and those more likely to engage in deliberation were slightly more vulnerable to common mental mistakes. Education also isn’t a savior; as Kahneman and Shane Frederick first noted many years ago, more than fifty per cent of students at Harvard, Princeton, and M.I.T. gave the incorrect answer to the bat-and-ball question.
This doesn’t surprise me in the slightest. I’ve long observed that most smart people don’t actually want to question their core assumptions any more than stupid people do, and they’re far more inclined to attempt to BS people into letting them skate by on a bluff. I think this may be part of what separates the superintelligent from the intelligent, and it’s something that we’ve seen at work here numerous times, most recently in the discourse with the Three Pound Brain gang. It’s what I tend to think of as “the second pass”. While I’m just as susceptible to the shortcut problem as anyone else, I have a natural tendency to mentally “check my work” before answering. This doesn’t mean I’m any less biased than anyone else, or that I don’t have the usual blind spots, only that I am inclined to take the time to go and search those blind spots and see what my biases have caused me to miss on the first pass. The object is to treat one’s own mind as harshly as, ideally even more harshly than, one treats the minds of others.
For example, when I looked at the two problems in the article, my brain leaped immediately to the conventional wrong answer, only to note that the answer couldn’t possibly be correct. On the second pass, I worked it out instead of attempting to justify the initial assumption, and that answer subsequently turned out to be the correct one. The trick, I think, is attuning your mind to be suspicious and look more deeply when the answer seems obvious, but something doesn’t seem quite right about it. The fact that one’s intuitive thinking is prone to these errors doesn’t mean that one’s logical thinking will be. My philosophy is that since I will probably commit errors, I might as well be the one to catch them if I can.
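For readers unfamiliar with the bat-and-ball question mentioned above, the second pass is easy to make concrete. The puzzle: a bat and a ball together cost $1.10, and the bat costs $1.00 more than the ball. The intuitive first-pass answer is that the ball costs 10 cents; checking the work shows why that fails. A minimal sketch (my own illustration, not from the article; working in cents to keep the arithmetic exact):

```python
# Bat-and-ball problem, in cents.
TOTAL = 110   # bat + ball together cost $1.10
DIFF = 100    # the bat costs $1.00 more than the ball

# First pass: the intuitive answer is that the ball costs 10 cents.
intuitive_ball = 10
intuitive_bat = intuitive_ball + DIFF
# Second pass, "check your work": 10 + 110 = 120 cents, not 110.
assert intuitive_ball + intuitive_bat != TOTAL

# Working it out instead: ball + (ball + DIFF) = TOTAL, so ball = (TOTAL - DIFF) / 2.
ball = (TOTAL - DIFF) // 2
bat = ball + DIFF
assert ball + bat == TOTAL
print(f"ball = {ball} cents, bat = {bat} cents")  # ball = 5 cents, bat = 105 cents
```

The point of the sketch is the shape of the error: the intuitive answer survives only if you never substitute it back into the original constraints, which is precisely what the second pass does.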
This, of course, is why I am so adept at setting traps for others. The reason I know where to place them is that my intuitive mind has already fallen for them.