Show me the data!

Comparing OUP to other publishers

January 25th, 2017 | Posted by rmounce in Paywall Watch - (5 Comments)

Any good scientist knows that one must have an adequate experimental control when trying to determine the significance of effects.

Therefore, in order to test the significance of the 106 broken DOIs I reported at OUP yesterday, I created a comparable stratified ‘control’ sample of 21 journals NOT published by OUP that are indexed in PubMed. These 21 are published by a variety of different publishers, including PLOS, eLife, NPG, Taylor & Francis, Springer, and PeerJ.

I used the exact same method (screen-scraping PubMed to get the 100 most recent items published at each journal) and checked all 1605 of the resulting DOI URLs obtained from these 21 journals (not every item listed in PubMed has a DOI). For this control group of non-OUP journals, I found just 7 broken DOIs: only 0.4% (to 1 d.p.) of recently minted DOIs at other publishers are broken. This is in stark contrast to the >6% failure rate at OUP. I think it’s fair to say OUP has a significant problem!
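For readers who want to check the ‘significant problem’ claim formally, the two failure rates can be compared with a standard two-proportion z-test. This is my own illustration, not part of the original analysis; it uses only the counts reported above (106 broken out of 1735 at OUP versus 7 out of 1605 in the control set):

```python
import math

def two_proportion_z(broken_a, total_a, broken_b, total_b):
    """Two-proportion z-test: is the DOI failure rate in sample A
    significantly different from the rate in sample B?"""
    p_a = broken_a / total_a
    p_b = broken_b / total_b
    pooled = (broken_a + broken_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Figures from this post: 106/1735 broken at OUP vs 7/1605 in the control set.
z = two_proportion_z(106, 1735, 7, 1605)
print(round(z, 1))  # z comes out around 9 – far beyond the 1.96 needed for p < 0.05
```

With a z-statistic this large, the difference between OUP and the control publishers is overwhelmingly unlikely to be chance.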

 

The 21 journals included in the OUP set are: 

Journal of Analytical Toxicology; FEMS Microbiology Ecology; Journal of Heredity; Medical Mycology; Bioinformatics; FEMS Microbiology Letters; Journal of Medical Entomology; Mutagenesis; Brain; FEMS Yeast Research; Journal of Pediatric Psychology; Briefings in Bioinformatics; Briefings in Functional Genomics; Journal of the Pediatric Infectious Diseases Society (JPIDS); Pathogens and Disease; Glycobiology; Clinical Infectious Diseases; Systematic Biology; Evolution, Medicine & Public Health; Journal of Biochemistry; Molecular Biology and Evolution

The 21 journals in the non-OUP set are:
Academic Radiology; Appetite; Neurological Research; Acta Neuropathologica; Autoimmunity; Nutrition and Cancer; Addictive Behaviours; British Journal of Nutrition; PeerJ; AIDS Care; Diabetologia; PLOS ONE; Alcohol; eLife; PLOS Pathogens; Annals of Anatomy; Heliyon; Psychological Medicine; Antiviral Research; Journal of Medical Systems; Scientific Reports

 

Full logs evidencing the data used in these analyses and the DOIs of each and every article checked are available on github: https://github.com/rossmounce/Checking-OUP-DOIs

Continuous Monitoring

This analysis was hastily done. By using the PubMed or Europe PMC API, I could script up weekly monitoring of ALL journals indexed in PubMed and produce weekly reports like this, ranking each and every publisher in terms of DOI performance. I could do this. But I’m hoping Crossref will publish these simple statistics instead. The scholarly community needs to know this kind of information, and I’m hoping it will shame some publishers into improving their practices!
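As a sketch of what such a monitoring script could look like, the Europe PMC REST API exposes a search endpoint that returns JSON. The endpoint URL, query syntax, and JSON field names below are my assumptions based on the public API, so treat this as a starting point rather than a tested harvester:

```python
import json
import urllib.parse
import urllib.request

# Assumed Europe PMC REST search endpoint (see the public API documentation).
EPMC_SEARCH = "https://www.ebi.ac.uk/europepmc/webservices/rest/search"

def extract_dois(payload):
    """Pull DOIs out of a Europe PMC search response. Assumed JSON shape:
    payload["resultList"]["result"] is a list of records, some with a "doi" key."""
    results = payload.get("resultList", {}).get("result", [])
    return [r["doi"] for r in results if "doi" in r]

def recent_dois(journal, page_size=100):
    """Fetch DOIs of recent items for one journal; each DOI could then be
    HEAD-checked against https://doi.org/ to build a weekly broken-DOI report."""
    query = urllib.parse.quote(f'JOURNAL:"{journal}"')
    url = f"{EPMC_SEARCH}?query={query}&format=json&pageSize={page_size}"
    with urllib.request.urlopen(url, timeout=30) as resp:
        return extract_dois(json.load(resp))
```

Looping `recent_dois` over every journal indexed in PubMed, once a week, would give exactly the kind of per-publisher league table described above.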

Updated 2017-02-01: Mathematical equation rendering failures spotted at the journal ‘Molecular Biology & Evolution’ (MBE). Added to the lengthy list.

In this post I shall try to summarise the different types of error that are occurring across Oxford University Press (OUP) journals at the moment.

It appears OUP have changed their underlying platform software this year, and that they haven’t done enough testing before putting it into production. The variety of different errors encountered is truly astonishing.

1.) Missing Articles

As documented yesterday with an example, OUP have failed at the most basic task of a publisher: preserving access to paid-for subscription content. 24 hours after I reported it missing, the Bayes Factor article is now available, but the DOI URL (http://dx.doi.org/10.1093/sysbio/syw101) still doesn’t resolve to it. Speaking of which…

2.) Paywalling Open Access Articles (update 1) 

Oddly, OUP have managed to paywall an article at the normally fully open access journal ‘Nucleic Acids Research’: the article ‘A novel method for crosstalk analysis of biological networks: improving accuracy of pathway annotation’ appears to be inaccessible at OUP’s site. Additionally, through Rightslink, they are selling re-use rights to this article. To determine whether this was real, I made a test purchase, specifying that I wanted to re-use this article in a non-commercial setting, in a presentation. I was charged, and paid, 42.14 GBP for the right to re-use 1 page of this article in an educational, non-commercial presentation. You can see a screenshot of my receipt for this rights purchase here.

3.) Broken DOIs that don’t resolve to article landing pages

DOIs are an integral part of modern 21st century publishing infrastructure. They are supposed to be reliable, persistent links to content. I tested 1735 recently minted DOIs across 21 different journals published by OUP that are indexed in PubMed. The log files providing full evidence of my testing are available on github. When a DOI fails to resolve to an article landing page, it gives a 404 error. I found that 106 (over 6%) of the recently minted DOIs I examined gave 404 errors. Remarkably, 82 of these failures come from article DOIs at a single journal: the Journal of Medical Entomology.
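A minimal version of that check can be done with nothing but the Python standard library: ask the https://doi.org resolver to follow each DOI and record the final HTTP status. This is a sketch of the general approach, not the exact script behind the logs on github:

```python
import urllib.error
import urllib.request

def doi_status(doi):
    """Follow a DOI through the https://doi.org resolver and return the
    final HTTP status code; a 404 means the DOI is broken."""
    req = urllib.request.Request(
        f"https://doi.org/{doi}",
        method="HEAD",
        headers={"User-Agent": "doi-checker/0.1"},  # hypothetical UA string
    )
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def broken_fraction(status_codes):
    """Fraction of checked DOIs that came back as 404."""
    return sum(1 for code in status_codes if code == 404) / len(status_codes)
```

Applied to the numbers in this post, `broken_fraction` over 1735 statuses containing 106 404s gives roughly 0.061, i.e. the >6% failure rate quoted above.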

4.) Editors appearing (erroneously) listed as additional authors of papers

I haven’t observed this myself, but apparently keen eyes at Systematic Biology have spotted this occurring to some article pages.

5.) Journal Articles Appearing as Published by a Totally Different Journal

Yesterday I found that 15 Systematic Biology articles appear to be published in the “Logic Journal of the IGPL”. As of today, I think some have been fixed, and inevitably they will all get fixed eventually, so I have a screenshot below to prove it.

 

6.) Unexpected lack of indexing in PubMed

I happen to really like the journal Gigascience. They unfortunately decided to move from BioMed Central to publishing with OUP starting this year, and they seem to have been hardest hit by the problems at OUP. For unknown reasons, PubMed hasn’t indexed any Gigascience articles since November 2016! See for yourself: https://www.ncbi.nlm.nih.gov/pubmed/?term=%22Gigascience%22%5Bjournal%5D

This is a really serious problem. If I were an author of a recent Gigascience article I would be furious. Recent articles there are completely invisible to literature searches performed at PubMed. This has affected the 49 articles in the December issue (Vol 5, Issue 1), as well as 15 advance access articles that haven’t been assigned an issue yet. Hundreds, perhaps thousands, of authors are affected. If I were OUP I would make this bug the highest priority to fix.
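Anyone can reproduce that indexing check programmatically via NCBI’s E-utilities, which back the PubMed website. The `esearch` endpoint and JSON shape below are from the public E-utilities interface; the example query string is my own illustration:

```python
import json
import urllib.parse
import urllib.request

# NCBI E-utilities esearch endpoint (public interface).
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def parse_count(payload):
    """Extract the hit count from an esearch JSON response;
    payload["esearchresult"]["count"] is returned as a string."""
    return int(payload["esearchresult"]["count"])

def pubmed_count(term):
    """Number of PubMed records matching a query, e.g.
    '"Gigascience"[journal] AND 2016/12:2017/01[dp]' — a count of zero
    for a recent date range would flag a journal as un-indexed."""
    url = f"{ESEARCH}?db=pubmed&retmode=json&term={urllib.parse.quote(term)}"
    with urllib.request.urlopen(url, timeout=30) as resp:
        return parse_count(json.load(resp))
```

Run weekly per journal, a query like this would catch indexing black-outs such as Gigascience’s long before authors notice.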

7.) Mathematical equations failing to render in the HTML (on any browser) [update 2]

As spotted by Brian O’Meara and independently confirmed by Joseph Brown. See below for an example:

 

8.) Article landing pages with no article title or authorship details visible

This bug is affecting articles at Evolution, Medicine & Public Health, Gigascience, Nucleic Acids Research and probably more. I’m certain it is not a ‘deliberate’ style choice.

9.) Some DOIs redirecting to placeholder PDFs (instead of actual content)

10.) Article Views data appears to have been reset to zero

 

I note that OUP have put out a statement to “apologize sincerely” for these issues. But I am not convinced a mere apology is enough compensation when many of the errors remain unfixed.

I call upon libraries, authors of recent articles in OUP journals, and academic societies that publish with OUP to seriously consider taking further action about this matter. Many of these problems have been present at OUP journals since at least January 13th 2017. OUP have been incredibly slow to identify and fix these problems and many of them should not have been problems in the first place – completely avoidable with adequate testing.

Tomorrow I will assess the situation again and update with any new reports of errors or action taken.

This morning, a PhD student asked me if I could get access to a copy of:
“Bayes factors unmask highly variable information content, bias, and extreme influence in phylogenomic analyses” by Jeremy M Brown and Robert C Thomson which was first published online (ahead of print) on 20th December 2016. DOI: http://dx.doi.org/10.1093/sysbio/syw101

The student urgently needs access to this work because it relates very closely to some of his research and he has a manuscript in the final stages of preparation doing something similar or related to this work.

As of 2017-01-23, this paper is seemingly completely missing from OUP’s new website (they appear to have migrated all journals to this base URL now: https://academic.oup.com ) and they have failed to put in place any redirect links that resolve to this article, if it is online at all. This paper may have been missing/offline/unavailable since January 13th 2017 – remember, it has not appeared in print yet, so it is only available electronically.

Old links to it that used to work include:
http://dx.doi.org/10.1093/sysbio/syw101
http://sysbio.oxfordjournals.org/content/early/2016/11/14/sysbio.syw101.abstract

The society itself knows this article exists; it tweeted about it:

A third-party website also acknowledges the existence of this article:
http://www.pubpdf.com/pub/28003531/Bayes-factors-unmask-highly-variable-information-content-bias-and-extreme-influence-in-phylogenomic-

Sci-Hub preserves access to paid-for scholarly content when the original publisher fails to do so

Interestingly, Stian Håklev alerted me to the fact that the full text of this otherwise missing paper is available via Sci-Hub: https://twitter.com/houshuang/status/823478936030052352

Direct Sci-Hub link to this paper here: http://dx.doi.org.sci-hub.cc/10.1093/sysbio/syw101

It is deeply ironic that my only available access to an article that my library (and thousands of other libraries and personal subscribers around the world!) has paid a publisher to make available is at a so-called “pirate library” like Sci-Hub. Why do we pay large sums to legacy publishers for incompetent service provision, whilst our libraries pay nothing to competent, low-cost archival services like Sci-Hub? “Lots of copies keeps things safe” as they say.

Final questions…

How many other published papers are now “missing” (not available online) at journals published by Oxford University Press? It strikes me that this probably isn’t an isolated case – it seems that OUP have not used robust automated processes to migrate content and create appropriate redirect links.

But evidence of absence is hard. We have one robustly evidenced case here, but there are probably more.

Please help find missing articles so we can assess the true scale of the loss of service here.

Libraries and institutions around the world pay subscriptions to have electronic access to this content. This is clearly a significant breach of service. Will OUP be made to pay compensation for this professional incompetence?

This week I chose the papers for the Brockington Lab ‘journal club’ here at the Department of Plant Sciences, University of Cambridge (I prefer to call it the ‘weekly research round-up’, though, because good content has nothing to do with journals per se!).

We rotate the choice of papers between each lab member every week. Sometimes the focus is betalain or cuticle research, but every 3rd week the focus is on broad-interest research.

The three papers I picked this week are all super-interesting and have a common theme: open research!

1.) Islam et al. (2016). Emergence of wheat blast in Bangladesh was caused by a South American lineage of Magnaporthe oryzae. bioRxiv. DOI: 10.1101/059832

2.) McKiernan, E. C., Bourne, P. E., Brown, C. T., Buck, S., Kenall, A., Lin, J., McDougall, D., Nosek, B. A., Ram, K., Soderberg, C. K., Spies, J. R., Thaney, K., Updegrove, A., Woo, K. H., & Yarkoni, T. (2016). How open science helps researchers succeed. eLife. DOI: 10.7554/eLife.16800

3.) Eklund, A., Nichols, T. E., & Knutsson, H. (2016). Cluster failure: Why fMRI inferences for spatial extent have inflated false-positive rates. PNAS. DOI: 10.1073/pnas.1602413113

 

Paper 1

Paper 1 by Islam et al. is what we spent the most time discussing. The Open Wheat Blast project came to my attention a few months ago via Nature News. It’s really good to see such a globally-involved multi-author collaboration, with all the authors having ORCIDs, a preprint posted before journal submission, AND all the data made openly available as it happened. I won’t say too much, but we did have some questions over the science of the paper – was it really necessary to do full-scale transcriptomics/genomics to identify the possible origin of the pathogen? We’re not experts in plant disease, but could less expensive, more targeted nucleotide sequencing approaches have given the same phylogenetic results?

Paper 1 was also made available online at a preprint server, which gave me an excellent opportunity to explain to the group what a preprint server is. I even tried to give an account of the possible ‘negatives’ of preprinting: the only one I could think of was embarrassment if the work was demonstrably incorrect or obviously messy/unfinished (but who would actually do that?).

It was also fun to read and analyze a paper in its unformatted state. This is what a paper looks like at submission, before the imposition of a 2-column layout, journal branding, logos and other crap.

 

Paper 2

[Screenshot of the ‘How open science helps researchers succeed’ paper]

It was a delight to see ‘How open science helps researchers succeed’ get published in eLife the week before the research round-up meeting: perfect timing! We didn’t get much time to discuss it, but I hope our group read it. It’s a really solid review of how open research practices can help the individuals doing open science, not just ‘sacrificially’ helping others, as people sometimes cynically interpret it. I was tempted to also suggest the recent opinion paper ‘How publishing in open access journals threatens science and what we can do about it’, but it’s such a poor-quality paper, with so many glaring factual errors (there’s an excellent post-publication review on Publons), that I didn’t even bother to send it round the group.

Paper 3

Again, we didn’t get time to discuss this in detail, but I was really pleased that Caroline, our visiting undergraduate from Oberlin College, had read about this one even before I selected it for our weekly round-up. It’s had a heck of a lot of media coverage (deservedly!), and we hope to talk about it in depth at one of the next OpenCon Cambridge meetups – there’s been some useful discussion of it over on the OpenConCam mailing list. You might think it weird to suggest a neuroscience fMRI paper in a plant sciences group, but the relevance isn’t about the study system: it’s the fundamental need for data archiving and statistical rigour that matters here, and that’s a lesson for all disciplines, not just neuroscience.

 

In about seven weeks’ time it’ll be my turn again to choose the papers. It’ll be hard to top those three for awesomeness, though! Well done to all the authors for the great work.