David Brooks doesn’t know me very well.
I read his piece, and my intuition told me I hated it. The difference between us is what I did next: I read it again, then again, trying to figure out why I felt so strongly. Then, knowing that my instinctive dislike was likely to cloud my judgment, I passed it to a friend who has no particular interest in data and no stake in data science. This friend is smart, and I knew he would read what was on the page and think critically about the ideas behind it.
“Tell me what you think about this,” I said.
He read it and answered, “It’s a lot of quotes and doesn’t say anything. You hate it, don’t you?”
Yes, I hated it. I hated this piece that seemed to agree with everyone who’s extolling the value of my rare skill set. I hate it because there is nothing wrong with any individual statement in that article—except that they add up to nothing. Brooks skillfully co-opts analyses from other experts and rounds them up in a tidy little knot that demonstrates his authority without forcing him to plant his flag anywhere. Where I formed a hypothesis and looked for ways to disprove it, Brooks accepted all hypotheses at once and built a little anecdata fortress around his opinion. That’s the reasonable thing for him to do, because in his world he doesn’t have to prove or disprove anything he says. Brooks is from the era where the size of his audience and the power of his amplifier automatically provide him with authority. He doesn’t have to say anything—in fact, he’s better off if he doesn’t.
If there is a danger to the data revolution, it’s people like David Brooks—people who spot a new thing, see that it’s useful, and immediately muscle in to tell the rest of us how to do it. Their personal experience reigns supreme over any experiment, any contradictory facts, any empirical evidence that they’re wrong. They yell about the superiority of their intuition and present their good luck as proof of their prowess. These are the highest-paid people in the office. They are the self-designated “thought leaders” who lead us nowhere. They are the “experts” who prattle on and on in meetings, sounding clever and offering no information. They get to that spot through various methods—some by scheming, some by skill at a particular thing, some by the random good luck that probability theory tells us will put someone on top no matter how level the original playing field.
More than anything, I hate this article because it commits a cardinal sin of analysis: It takes a plural of anecdotes and treats them as data. Brooks quotes eight studies—each of which presents pages of evidence and explanation. He sums up each in a paragraph or two. He quotes no studies where poor data preparation leads to erroneous results, no incomplete models that produce spurious correlations, no poorly interpreted analyses that led to bad decisions. That isn’t in his thesis, so he doesn’t bother with it. Like every other HiPPO, he builds his case to support his position. If he takes no real position, so much the better—he can’t be accused of misleading anyone.
Brooks claims we’re telling him to “ignore intuition and follow the data.” We’re not. We’re telling him to look at all the data, not just cherry-pick whatever fits his confirmation bias. We’re creating ways to push all the information to the forefront so whatever contradicts his already-formed opinion can’t be ignored.
I can see why that would be scary. Brooks and people like him have told the rest of us what to think for a long time. His elevated position, obtained partly by the quality of his work but mostly by his access to a bigger microphone, gives him a certain amount of power. He’s so used to this state of affairs that he doesn’t even understand the game has changed.
Information is becoming democratized, and people with the skill set necessary to extract meaning from that information are going to replace the David Brookses of the world. I can only hope that day comes sooner rather than later.