Why I’ve removed journal titles from the papers on my CV

Can you name the journal in which microbiologist Alexander Fleming first reported on the antibacterial properties of penicillin? Or where engineer John O’Sullivan and his colleagues presented the image-sharpening techniques that led to Wi-Fi?
Most of you can easily name the benefits of these breakthroughs, but I expect only a few would know where they were published. Unfortunately, in modern scientific culture, there is too much focus on the journal — and not enough on the science itself. Researchers strive to publish in journals with high impact factors, which can lead to personal benefits such as job opportunities and funding.
But the obsession with where to publish is shaping what we publish. For example, ‘negative’ studies might not be written up — or if they are, they’re spun into a positive by highlighting favourable results or leaving out ‘messy’ findings, to ensure publication in a ‘prestigious’ journal.

To shift this focus in my own practice, I have removed all the journal names from my CV. Anyone interested in my track record will now see only my papers’ titles, which better illustrate what I’ve achieved. If they want to read more, they can click on each paper title, which is hyperlinked to the published article.
I’m not alone in thinking this. The idea of removing journal names was discussed at a June meeting in Canberra on designing an Australian Roadmap for Open Research. A newsletter published by the University of Edinburgh, UK, no longer includes journal titles when sharing researchers’ new publications, to help change the culture around research assessment. Celebrating the ‘what’ rather than the ‘where’ is a great idea. This simple change could be extended to many types of research assessment.
It is disorienting at first to see a reference that does not contain a journal title, because this bucks a deeply ingrained practice. But journal names are too often used as a proxy for research excellence or quality. I want people reading my CV to consider what I wrote, not where it was published, which I know is sometimes attributable to luck as much as substance.
Of course, anyone who really wants to judge me by where I’ve published will simply be able to google my articles: I haven’t anonymized the journals everywhere. But removing the names in my CV discourages simplistic scans, such as counting papers in particular journals. It’s a nudge intervention: a reminder that work should be judged by its content first, journal second.
Because I’m a professor on a permanent contract, it’s easier for me to make this change. Some might think that it would be a huge mistake for an early-career researcher to do the same. But there is no stage in our scientific careers at which decisions about hiring and promotion should be based on the ‘where’ over the ‘what’. It would be easier for early-career scientists to make this change if it became normalized and championed by their senior colleagues.
A potential criticism of removing journal names is that there is nothing to stop unscrupulous academics from publishing shoddy papers in predatory journals to create a competitive-looking CV, which could put candidates with genuine papers at a disadvantage. Promotion and hiring committees need to be made aware of the growing problem of faked and poor-quality research and receive training on how to spot flawed science.
However, when a job gets 30 or more applicants, there can be a need for short-cuts to thin the field. I suggest that reading the titles of each applicant’s ten most recent papers would work better than any heuristic based on paper counts or journal names, for only a slight increase in workload.
Imagine a hiring or fellowship committee that receives plain or preprint versions of every applicant’s five best papers. Committee members who previously relied on simplistic metrics would have to change their practice. Some might simply revert to Google, but others might welcome the challenge of judging the applicants’ work.
Judging researchers is much more difficult than counting impact factors or citations, because science is rarely simple. Simplistic promotion and hiring criteria ignore this wonderful complexity. Changing typical academic CV formats could bring some of it back.
