
Welcome, Physical Review Research

This post is addressed to fellow physics researchers:

I very much welcome the new open access journal Physical Review Research by the American Physical Society. It is a step in the right direction. I have great hope that the APS will keep its leadership in physics publishing in a way that journals serve the academic community, and not the other way around. PRR aims to serve the whole physics community, with subfields identified by searchable tags. Ideal next steps, to my mind:

(i) Gradually subsume the Physical Review journals into PRR (easier said than done, I know, especially moneywise).

(ii) Analogously to the tags identifying subfields, tags should also reflect “importance and broad interest”, as is now done by the categories of regular articles, rapid communications, and Physical Review Letters (or Physical Review X). A numerical tag would suffice: 1, 2, and 3 for the three mentioned categories, for instance. One could even go for a level 4, indicating the level of papers that would go into highly selective all-sciences journals.

Present journal names provide information about the “importance” of papers; that is just metadata (metaliterature) after all, so why not treat it as such?

Actually, these “importance” categories lose their relevance with time: if you are reading an article published a few years back, you do not care so much about where it was published. Having this “importance” classification as a mere tag (not visible when citing the paper, and which could even disappear in due time), instead of keeping it enshrined in a flashy journal name, much better reflects its value and purpose: it provides the needed discrimination of recent literature while alleviating the IFFS epidemic (impact-factor fascination syndrome). Authors could also opt out of grading if they can address their community effectively by other means.

(iii) This is daring: allow for a level 0, that is, papers that are decent physics papers but do not reach the level of today’s Physical Review journals and are now sent to “more specialised journals” (the impact factor would drop; how terrible!). I do not think it makes sense to reject papers anymore, except for the ones that should not be published at all.

And finally, (iv) coordinate with other communities and learned societies towards a connected main body of literature. A new paradigm is emerging, and a new business model with it. My thinking is explained in more detail here.
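To make the tagging idea in points (ii) and (iii) concrete, here is a minimal sketch of how such metadata might look and be searched. All field names and the record layout are purely illustrative assumptions on my part, not an actual PRR schema.

```python
# Hypothetical metadata records: each paper carries searchable subfield tags
# plus an optional numeric "importance" tag (0 = decent physics below today's
# Physical Review bar, 1 = regular article, 2 = rapid communication,
# 3 = PRL/PRX level, 4 = all-sciences-journal level). A paper without the
# "importance" key represents an author who opted out of grading.
papers = [
    {"title": "Paper A", "subfields": ["quantum-optics"], "importance": 3},
    {"title": "Paper B", "subfields": ["soft-matter"], "importance": 1},
    {"title": "Paper C", "subfields": ["quantum-optics"]},  # opted out
]

def search(records, subfield=None, min_importance=None):
    """Return titles matching a subfield tag and, optionally, a minimum importance level."""
    results = []
    for p in records:
        if subfield is not None and subfield not in p["subfields"]:
            continue
        if min_importance is not None and p.get("importance", -1) < min_importance:
            continue
        results.append(p["title"])
    return results
```

With such tags, readers who want only the most selective recent literature could filter by importance level, while everyone else simply searches by subfield; and since the tag is metadata rather than a journal name, it could be revised or dropped as the paper ages.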

