She’s wary of the consensus-based transparency checklist, and here’s a paragraph we should’ve added to that zillion-authored paper

Megan Higgs writes:

A large collection of authors describes a “consensus-based transparency checklist” in the Dec 2, 2019 Comment in Nature Human Behaviour.

Hey—I’m one of those 80 authors! Let’s see what Higgs has to say:

I [Higgs] have mixed emotions about it — the positive aspects are easy to see, but I also have a wary feeling that is harder to put words to. . . . I do suspect this checklist will help with transparency at a fairly superficial level (which is good!), but could it potentially harm progress on deeper issues? . . . will the satisfactory feeling of successfully completing the checklist lull researchers into complacency and keep them from spending effort on the deeper layers? Will it make them feel they don’t need to worry about the deeper stuff because they’ve already successfully made it through the required checklist?

She summarizes:

I [Higgs] worry the checklist is going to inadvertently be taken as a false check of quality, rather than simply transparency (regardless of quality). . . . We should always consider the lurking dangers of offering easy solutions and simple checklists that make humans feel that they’ve done all that is needed, thus encouraging them to do no more.

I see her point, and it relates to one of my favorite recent slogans: honesty and transparency are not enough.

I signed on to the checklist because it seemed like a useful gesture, a “move the ball forward” step.

Here are a couple of key sentences in our paper:

Among the causes for this low replication rate are underspecified methods, analyses and reporting practices.

We believe that consensus-based solutions and user-friendly tools are necessary to achieve meaningful change in scientific practice.

We said among the causes (not the only cause), and we said necessary (not sufficient).

Still, after reading Megan’s comment, I wish we’d added another paragraph, something like this:

Honesty and transparency are not enough. Bad science is bad science even if it is open, and applying transparency to poor measurement and design will, in and of itself, not create good science. Rather, transparency should reduce existing incentives for performing bad science and increase incentives for better measurement and design of studies.


  1. Norman says:

    Re transparency is not enough, have you seen this post: “What’s Wrong with Social Science and How to Fix It: Reflections After Reading 2578 Papers”? This will give you the flavour:

    There’s a popular belief that weak studies are the result of unconscious biases leading researchers down a “garden of forking paths”. Given enough “researcher degrees of freedom” even the most punctilious investigator can be misled.

    I find this belief impossible to accept. The brain is a credulous piece of meat but there are limits to self-delusion. Most of them have to know.

  2. Michael Nelson says:

    I am unconcerned that there will be very many scientists who, despite being either too ignorant or too careless or too unscrupulous to do quality science, are so excited about the prospect of doing transparent science that they decide to go find and voluntarily adopt an obscure checklist, to add it to the countless other forms they have to fill out during the funding/research/publication processes, to follow it accurately and complete it honestly, only to then use it as an excuse for not doing quality science. A bad checklist is not the weak link in the causal chain between bad scientists and good science.

    • Michael Nelson says:

      Put another way: If you could tempt people into being transparent by letting them use transparency as an excuse for poor quality, that alone would justify the existence of the checklist.

    • jim says:

      “A bad checklist is not the weak link in the causal chain between bad scientists and good science.”

      Well said!

    • gec says:

      > they decide to go find and voluntarily adopt an obscure checklist

      I’m not so sure, actually. If the checklist is touted as “this will make your science kosher” and it gets you a badge from the OSF, I think a surprising number of people will rush to adopt it without really appreciating the meaning behind it.

      We’ve already seen this with pre-registration: A lot of folks were willing to buy the idea that filling out a form before doing a study protects them from any criticism and ensures that their results will replicate.

      More broadly, as you say, science is largely built around checklists anyway. That’s how bad science gets to masquerade as science, because there are enough folks out there who are really good at following the letter of the law rather than the spirit (Bem being an obvious example). They know that if they just follow the rules, they get to say whatever they want in the end (“I ticked these marks, therefore ESP”).

      To someone who is already really good at mindlessly following rules, what’s another checklist if it lets them keep playing the same game?

      • Martha (Smith) says:


        I suspect that the appeal of “checklists” is in part the desire to have clearcut rules that guide you. This may be related to the appeal of the KISS principle — to which I often think the appropriate response is IASS (“It ain’t simple stupid” — although the pronunciation of the acronym might serendipitously apply as well).

      • jim says:

        “I’m not so sure, actually….”

        I’m sure for some people the game is to check the boxes and do as little as possible. There will always be people like that, even in science (perhaps that’s the greatest myth of science: that scientists defy the averages of the frailty of human integrity). But putting the boxes there at least forces them to navigate a more visible and challenging course.

  3. Tom Davies says:

    Off topic, Andrew, this professor has discovered that 22 minutes less sleep can make you unethical:

    So make sure you keep track of how much sleep your less than ethical colleagues get and make sure that you get a bit more!

    Of course, perhaps unethical people just lie awake at night, tortured by their misdeeds.

    • Andrew says:


      Wow—so if that Why We Sleep guy had just followed his own advice, maybe he wouldn’t have had the time to make those misleading graphs!

      • AllanC says:

        Aside from the rather unsupported causal interpretation the study’s author has decided to champion, I fail to appreciate how the graphs are misleading. For example, the “Rest Less, Cheat More” graphic appears to be an accurate display of the data at hand. The title is questionable, sure, but the graphic actually shows at face value the very negligible difference of sleep between the two subgroups. I mean, it seems to me that most would look at the bar graph and go “meh, those look about the same”…am I missing something here?

        • Tom Davies says:

          I think the graph would be better if it showed the distribution of sleep times for the cheaters and non-cheaters. As it is, we get the impression that with 22 minutes less sleep you are sure to be a cheater tomorrow. “there was a difference of only about 22 minutes of sleep between those who cheated and those who did not” — no mention of averages.

          But I think the graphs Andrew means are the sporting accidents vs hours of sleep graphs from Why We Sleep (if I’m remembering correctly).

  4. Han Geurdes says:

    Thanks. Honesty and transparency are a necessary first step in publishing. We need this both on the author and reviewer side. Without it publication is pointless.

    But what is good and bad in science is simple: can one verify the claim independently? E.g., in mathematics, do the math yourself as a reader, run the computer program, etc. In experimental science … do the experiment. But if there is only one lab in the world that could do it, then the claiming authors should help the critics to perform a replication.

  5. JDK says:

    Honesty and transparency are not enough. Bad science is bad science even if it is open, and applying transparency to poor measurement and design will, in and of itself, not create good science.

    Just noticed this quoted on Retraction Watch!

  6. CK says:

    Honesty and transparency are not enough; however, they do allow one to make a basic judgment about the scientific quality of the article.

    • Andrew says:


      I disagree. As one commenter put it, one of the key rationales for honesty and transparency is that these make it easier to recognize that an article is of low scientific quality. When the window is transparent, you can see into the house. A transparent window does not imply a clean house; it implies that you can look into the house and see how clean it is.
