Negative Psychology

The atmosphere of wary and suspicious disbelief

Jim Coan's Circle of Willis
May 30, 2014

“The critical scrutiny of all scientific findings—perhaps especially one’s own—is an unqualified desideratum of scientific progress. Without it science would surely founder—though not more rapidly, perhaps, than it would if the great collaborative enterprise of science were to be subjected to an atmosphere of wary and suspicious disbelief.” — Peter Medawar

People on all sides of the recent push for direct replication—a push I find both charming and naive—are angry. Last week at APS (the Association for Psychological Science 2014 annual meeting) I heard colleagues—often personal friends—characterized as either methodological simpletons or inscrutable bullies. It’s sad not least because all involved are collegial, funny, and reasonable in person. As far as I know, the most visible proponents of direct replication have a respectably nuanced view of why replications may fail, which is to say, for all kinds of reasons. On the other hand, who doesn’t think “bad research” when a finding apparently fails to replicate? And why is that?

I think psychology is suffering from an attitude problem—a burgeoning trend I’m calling Negative Psychology. Just as Positive Psychology implies a belief that focusing on strengths and virtues will enhance well-being, Negative Psychology implies a belief that increased wariness and suspicion will enhance scientific progress, a perspective with which I wholeheartedly disagree. But Negative Psychology also encompasses that suite of behaviors—public ridicule and shaming, moral outrage, clumsy humor—that the internet has a tendency to encourage. The main proponents of Negative Psychology are methodologists with whom I feel a strong professional affinity. I’ve been hanging out with methodologists for nearly two decades now, and though Negative Psychology has always been a feature of my clan, the internet—and particularly social media—is turning what used to be an unfortunate foible into a professional phenomenon.

Despite limited returns on ample investments, I’ve long participated in methodological work of my own. Indeed, I seem incapable of avoiding it. My graduate school years were steeped in the felicitously named Evaluation Group for Analysis of Data (EGAD), founded by Lee Sechrest, who chaired my Ph.D. minor in methodology, and who is no slouch at anything from the philosophy of science to multivariate statistics. Lee fostered a view of methodology I could enjoy. In EGAD, we spent little time wagging fingers and furrowing brows. Instead, we experimented with unorthodox research designs and data analytic procedures, pushing methodological boundaries and taking interesting risks.

Alas, EGAD dabbled in Negative Psychology, too. When this happened, otherwise compassionate and thoughtful individuals would behave badly, neglecting any benefit of the doubt and exercising humor less funny than mean. And once it all got started it was contagious—I think because suspicion and snark got confused with rigor. Since the appearance of rigor is highly desirable and suspicion and snark are easy, suspicion and snark became the path of least resistance to looking rigorous. Indeed, I think the same process is the likeliest cause of Negative Psychology now.

Once it gets started, it can be hard to rein in. Judging from popular media coverage, snarky “tweets,” and interminable Facebook exchanges, Negative Psychology is steeply on the rise. The abysmal state of science is emphasized, with putatively dubious findings confronted in emotionally charged, broadly humorous, and decidedly negative terms.

Negative Psychology assumes the worst—the worst in methodology, the worst in social impact, the worst in motivation—when evaluating a scientist or a scientist’s work. You see these assumptions in “voodoo correlation” claims, “p-hacking” investigations, websites like Retraction Watch and Neuroskeptic, a handful of other blogs devoted to exposing bad science, and a collection of social media users on Twitter, Facebook, and elsewhere. Writers working in Negative Psychology mode claim various rationales, but public shaming is surely at the top of the list. Some have made that explicit.

This is bad for our colleagues because it orients them toward defensive rather than creative thinking; it’s bad for the public because it needlessly degrades confidence in the best research right along with the worst by implicitly equating the two; and it’s bad for the critics themselves because it tempts us all to stop listening. Indeed, Negative Psychology risks conflating moral outrage with scientific rigor, a confusion that is compelling, I think, to both laypeople and scientists in training, because moral outrage is relatively easy, and scientific rigor is not. Lacking critical substance, our students may learn that a cranky demeanor will do well enough, and that a posture of moral outrage can fast-track them to the steely-eyed methodologist club.

Why Negative Psychology is bad for our colleagues.

Do your own survey of classic work in psychology—the stuff that really moved the field forward in meaningful, generative ways—and ask yourself how well that work would stand up to critical examination today. Here are some that come to mind for me: Schachter and Singer, 1962; Festinger, Riecken, & Schachter, 1956; Ekman & Friesen, 1971; Hull, 1943; Eysenck, 1953; there are many others. I bet there isn’t a direct replication in the bunch. Worse, I bet we’d rather not try. These studies were not influential because they were airtight methodologically. Indeed, methodologically speaking, they were deeply flawed. Instead, these works were influential because they contained very creative ideas. I’m not about to argue that we need more methodologically flawed research, but I do think that fear of public shaming will serve as a disincentive to push theoretical and methodological boundaries. Moreover, I think there is a negative correlation between creativity and rigor, driven not by the conditions necessary for creativity, but by the conditions necessary for the kind of rigor we tend to emphasize in psychology—the fear of Type I error in particular.

To paraphrase H. L. Mencken, the great scientists of the world are rarely Puritans. People want to do science for all kinds of reasons and in all kinds of ways, only some of which have anything whatever to do with experimentation. And yes, some scientists (not you of course) will even occasionally cut corners and make sloppy mistakes in their hotheaded pursuit of cool ideas. If that’s not part of the game, the game is not going to get played. At least, the game won’t get played well.

And anyway, fear of public shame is already part of the game, too, as we all know. Any of us who discovered a retraction-worthy mistake would feel deeply humiliated and embarrassed. There is probably no way around this, nor, probably, should there be. Whenever a person’s experiment fails, or they find that they’ve made a dumb mistake, or their work fails to replicate, we can expect them to feel ashamed without any help from us. If anything, our job should be to help them feel better when things go wrong. We lend our support because they (and we) are good people, and because one day the guilty party might be us. (Just kidding. Failure will never happen to you.)

We should also remember that shame causes people to go crazy. This is true under the best circumstances. When people expect to be shamed publicly, they will do just about anything to avoid it. When we pile on, we make a bad situation worse. People will dig in their heels, become defensive, level counter-accusations, etc., as predictably as night follows day. This isn’t the behavior of methodological cretins. This is the behavior of normal people. And scientists are normal people.

Why Negative Psychology is bad for the public.

The public depends on us to be reasonable. They are watching us not only for cues about what to believe, but also to understand the scientific process. They are watching the discourse, and so are our students. When we criticize each other using the tropes of Negative Psychology—that is, with moral outrage, hostile humor, and public shaming—we train the public to either disregard science altogether, or (again) to confuse outrage with rigor.

I don’t think this is just speculation on my part.

Readers may know about an anonymous blogger called the Neuroskeptic. The Neuroskeptic’s mission is to offer criticism of popular neuroscientific work on the grounds that such work is often difficult for the public to interpret. Last December (2013), the Neuroskeptic published a post encouraging “scientific vigilantism.” Vigilantes would use blogs and social media to smoke out fraud and raise the alarm in a publicly accessible way, outside of the traditional peer review process.

I read the piece soon after it was posted. Here is what the very first comment said:

When this happened years ago in climate science, we ‘vigilantes’ were all labelled as evil deniers on the pay of evil corporations trolling to destroy the natural world, the planet and civilization itself.

Nice to see the correct, skeptical attitude to science is instead spreading to other fields. As it should.

What could the Neuroskeptic do, except hastily reply:

That’s a ridiculous comparison. I’m talking about spotting formal scientific misconduct, not criticizing scientific theories. But it doesn’t matter really because, since the BEST study, even evil deniers on the pay of evil corporations have concluded that global temperatures are rising.

I feel for the Neuroskeptic here. It’s kind of a “gotcha” moment. But the comparison is not ridiculous. Indeed, it is apt. One lesson is clearly stated, even encouraged: anyone can be a “skeptic,” expert or not. Another lesson is accidentally implied: social media, snark and outrage are all a skeptic really needs.


Because Negative Psychology makes such hay from snark and outrage, and because fraud of the kind the Neuroskeptic is worried about is rare, the criteria rendering a paper or scientist the target of snark and outrage will have to be relaxed. This is for much the same reason that new grant money becomes especially important after—not before—a first major grant is acquired: One has to “feed the beast.” Because real rigor is difficult, but snark and outrage are easy, it’s a trivial thing for Negative Psychology to morph into pseudocriticism, which I submit it is well on the way to doing. Pseudocriticism is a cousin of pseudoscience. Both adopt the superficial trappings of science without the substance. As with pseudoscience, the lay public are ill-equipped to evaluate the claims of pseudocritics. But the snark and outrage are readily interpretable. Real scientists are stern, we learn, even angry. They put their feet down, draw hard lines in the sand, talk in definitive terms, etc.

In the end, Negative Psychology equates—for the public, at least—poor methodological habits, run-of-the-mill scientific sloppiness, innocent probabilistic error, injudicious hype, and outright fraud. In practice and effect, it can be reminiscent of the Golden Fleece Award that did so much damage to Psychology in the 1970s and 80s.

Why Negative Psychology is bad for critics.

The foregoing may give the impression that Negative Psychologists have nothing of value to say. That’s nonsense of course. The representatives of Negative Psychology are terrific people, and thoughtful too, if not brilliant. As I said before, Negative Psychology is contagious. It’s also habit forming. The habit of Negative Psychology dilutes serious criticism by shifting attention from content to snark and outrage. And it’s the snark and outrage that tempts the rest of us—fellow scientists now—to stop listening. Several practitioners of Negative Psychology have marginalized themselves this way already.

This is partly out of habituation (“oh, there goes [redacted], mouthing off again”). But another part of it—a big part—is that criticism should be thoughtful and interesting (and, indeed, entertaining if possible). Criticism of the Negative Psychology type starts out interesting, often funny, outrageous, and fascinating (in the way that car wrecks are fascinating), but it doesn’t take long for it to become boring, dreary, and priggish—at least if you're a real scientist trying to locate the critical content.

Which brings up yet another point. I am not arguing against criticism per se (I hope that much is obvious). Nor do I necessarily think the expression of anger and irritation is in itself always or even particularly destructive. Indeed, some of the best published criticism I know of is artfully angry (see Why I Do Not Attend Case Conferences by Paul Meehl). I’m not advocating for a passionless, procedural approach to criticism and counter-criticism, although a reasonable attempt at decorum is a good idea.

Nor do I think criticism should never be funny, though being funny is risky because—with all due respect—most scientists aren’t funny enough to meld humor and criticism effectively. Instead, jokes come across as clumsy and insensitive and even a little abusive. In my informal survey of Negative Psychology over the past two years, I’ve seen more than a few bloggers and social media participants characterize themselves as being “light-hearted” or “irreverent,” which seems to them to justify saying some pretty ugly things—things we are expected to experience as funny. But critical irreverence requires a level of skill—art, really—that is sadly lacking in pretty much everyone I know, not excepting myself. Mark Twain was irreverent. H. L. Mencken was irreverent. At its worst, “irreverence” in Negative Psychology really does resemble a kind of bullying—one that cloaks garden-variety abuse in the dubious language of research ethics. It drags the conversation down into a who-hurt-whose-feelings form of meta-communication that accomplishes little except for placing the critic in the forefront of everyone’s attention. The end result is predictable: A collective, often unspoken consensus that the baby needs throwing out with the bathwater.

How shall we proceed?

I liked Daniel Kahneman’s A New Etiquette for Replication essay very much, but less for the specific recommendations he enumerated than for his frank talk about scientific egos and research method sections. In response to this essay, I’ve seen the predictable outrage expressed on Twitter and Facebook—comments suggesting that science is actually endangered by Kahneman’s sentiments and that method sections should always be detailed enough for others to replicate a study without any communication with the original author. I find the latter objection particularly interesting, because it seems so damning on the face of it. Of course method sections should be sufficient.

But Kahneman is absolutely correct. Method sections are rarely comprehensive, and probably shouldn’t be. To assert this is, in the current environment, something close to heresy, but a short time ago it wouldn’t have been controversial. And it isn’t as though this “problem” is unique to psychology. I have just finished reading the autobiography of Peter Medawar, Memoir of a Thinking Radish (a book I very highly recommend to anyone interested in how the science sausage is made). Medawar, some readers will know, received a Nobel Prize for his work on immune tolerance in organ transplants. His life story is replete with examples of bench scientists visiting each other’s laboratories to learn precisely how a particular technique is achieved, to be guided by hand as it were. Method sections should be sufficient to evaluate the basic soundness of a study and to raise important flags, but any attempt to comprehensively provide every detail for a replication—or a complete understanding—will probably fall short.

So here is one broadly generalizable idea: let’s actually, literally, talk to each other. And talk not only for the purpose of accurate replication, but also when the impulse arises to publicly criticize. Friend and fellow EGAD alum Patrick McKnight has also suggested that we should collaborate more, and more often—that indeed we need to find better ways to reward collaborative problem solving instead of individual paper production.

Ultimately, I don’t have any specific recommendations or guidelines to offer. I’m a bigger fan of principles than of rules in any case. In principle, we should work hard—harder than is required—to be generous, respectful, honest, and as clear as we can be.

I opened this essay with a quote from Peter Medawar, and I’ll close with one, too—taken from the memoir I mentioned above. It expresses simultaneously the risks and promises of scientific life, and contains within it the seeds of compassionate respect we should all hold for one another. More than that, it celebrates the creativity that the “atmosphere of wary and suspicious disbelief” fostered by Negative Psychology threatens to squelch.

“…all scientists who are in the least imaginative will sometimes take a wrong view and waste time pursuing it. This has to be rated an occupational hazard of the scientific life. On the other hand the scientist too scared to speculate boldly can hardly be said to be having a creative life at all, and will end up like one of those sad, sterile men of letters whose taste is so refined and judgement so nice that they cannot bring themselves to the point of putting pen to paper.”
