
Research antiefficiency principle

It is an intriguing thought (and admittedly a provocative way to put it). University funding in the USA, UK and other research-powerful countries is partly based on grant overheads. That is, funds paid by research agencies to the research institutions, beyond the direct costs of the research itself, to cover a proportional part of the costs of running the institutions themselves. Sensible.

Overheads do represent a significant part of institutional income. The arrangement makes sense in many ways, but, in essence, it leads to the antiefficiency principle: since overheads scale with the direct research costs, universities and research institutions, more or less directly, tell their research staff: "do your best, for as much money as possible".

The title of this post is provocative because the quoted statement above is not as antiefficient as it sounds. "Do something for as much money as possible" would be antiefficient, but the actual statement implies two maximisations: "your best" and "as much money as possible". That is, maximise both output and input.

Here is the tricky thing: the efficiency of the system depends on how much weight we put on either maximisation, which in turn depends on various subtle mechanisms embedded in the research culture of each country. A balanced system can be well tuned, in the sense that output quality determines the likelihood of new input (with provision for new researchers entering the cycle). In the UK the balance is (at least partly) kept by the fact that, complementing the overheads, universities are also funded through direct evaluation of their output in a national research assessment exercise (now the REF). The system/culture also depends on time: in a particular country it may have worked well at some point, but then become more antiefficient over time.

The fact is that many institutions evaluate researchers by input, using grant income as the key component of their evaluation. That is certainly an aberration. The thinking is: if the researcher has secured the grant, somebody will have evaluated the researcher, and somebody will evaluate the output of the project. In effect, the institution delegates away the assessment of output quality. It has even become common for input to be perceived as the goal itself in many processes and evaluations. The capacity to raise funds is the desired trait; whatever is done with the direct funds (once the overheads are secured) becomes a secondary consideration.

Some research institutions in Europe have become specialised in attracting European grants (a sport demanding very particular skills) while being much less skilled at executing the corresponding projects. I have witnessed situations in which such an institution gets the grant and, after securing the overheads, subcontracts the actual research work to a private company. Probably an extreme and quite rare situation, but illustrative nevertheless.

This is the downside of an otherwise sensible model that has worked well in some places for decades. We researchers just need to be aware of it and help keep a healthy balance through our own contributions to evaluations, while reminding institutions of the antiefficiency principle.

I have a naughty proposal in this context. To the many bibliometric indices currently used to evaluate research, we could add an index of citations per dollar. It would be technically hard to measure (though not so hard for large averages, such as per country). As with any other index, it would need to be used wisely, in conjunction with other metrics and always allowing for differences among fields of research, etc. It is nevertheless intriguing to imagine what the landscape would look like with such an index. After all, we owe it to the taxpayer.
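
To make the idea concrete, here is a minimal sketch in Python of how such an index could be computed for large aggregates (per country, say). All names and figures are hypothetical placeholders, not real data; actual inputs would come from bibliometric databases and funding-agency reports, with field normalisation applied before aggregation.

```python
# Minimal sketch of a "citations per dollar" index.
# All figures below are hypothetical placeholders, not real data.

records = [
    # (unit, total citations over a period, total research funding in USD)
    ("Country A", 1_200_000, 3.0e9),
    ("Country B", 450_000, 0.9e9),
    ("Country C", 300_000, 1.5e9),
]

def citations_per_dollar(citations: int, funding_usd: float) -> float:
    """Citations attributed to a unit, divided by its research funding."""
    return citations / funding_usd

for unit, cites, funding in records:
    index = citations_per_dollar(cites, funding)
    # Report per million dollars for a readable scale.
    print(f"{unit}: {index * 1e6:,.0f} citations per million USD")
```

As argued above, the absolute numbers would mean little on their own; the interest lies in comparisons across comparable units, used alongside the other metrics mentioned.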
