Effects of careerism in biomedicine

Kiel Johnson's Publish or Perish (2009)

Productivity in research is usually measured in terms of number of publications, number of citations, journal impact factors, etc. — and these measures are in turn a major precondition for securing research grants.
But perhaps we’re measuring the wrong things, asks Kent Anderson (writing at The Scholarly Kitchen). Perhaps we should instead measure how many results have been replicated.
He asks this because there is rising concern among drug producers that the published scientific literature is unreliable. For example, he cites Bayer as saying the company has halted almost two-thirds of its early drug target projects because its researchers could not replicate the results reported in the published literature. Even findings published in the most prestigious journals often could not be confirmed.
It’s not necessarily a question of fraud. It can also be the result of exaggeration, wishful thinking and cherry-picking of results, which in turn are sustained by the ubiquitous publish-or-perish attitude in scientific institutions. The combination of careerism and neo-liberal managerial thinking in universities is a dangerous cocktail.