Which countries are pulling away from the pack on research productivity?

Analysis of rankings data suggests Australia has boosted quantity and quality. Simon Baker writes

THE (Times Higher Education)

“Publish or perish” is a mantra that is either responsible for many of the perceived modern ills of academia or is a positive driving force, depending on your point of view.

But among the major research nations, where has a push for academics to publish more been felt the most in recent years? And has there been any noticeable contrast with changes in quality?

One of the 13 metrics behind Times Higher Education’s World University Rankings attempts to directly measure the productivity of researchers in each institution, and examining those data at the national level from the past few years gives some interesting results.

For instance, the data suggest that Australia’s universities have seen some of the biggest productivity increases relative to other nations. Of countries with at least 10 universities in the 2016 and 2019 editions of the ranking, Australia had one of the largest leaps in the average score for papers per staff and is now second only to the Netherlands on the metric.

Accounting for the fact that the Netherlands’ universities are almost all in the top 200 provides even better news for Australia: its top 10 universities now achieve the best score for average papers per staff among leading research nations, overtaking the UK as well as the Netherlands since 2016.

Looking at the figures in the context of overall research output also suggests that in some countries, such as China, a rapid increase in research publication has not yet been accompanied by large productivity gains.

So are national policies behind some of these trends?

In Australia, there have been clear policy incentives in the past decade to boost productivity. The most obvious is that until 2017, block grant funding to support research in Australian universities was determined in part by the amount of research published.

However, a review published in 2016 led to this element of the funding calculation being removed and – alongside the evolving Excellence in Research for Australia assessment – there now appears to be a drive directed more towards quality than quantity.

“Tying funds initially to research income and to publications while largely holding the funding steady put universities in the position of having to improve to maintain funding levels – or risk other universities doing better and attracting a higher proportion of funding,” said Conor King, executive director of Innovative Research Universities, which highlighted Australia’s productivity surge in a recent submission to a parliamentary inquiry on research funding.

“The publication factor was the easiest for academics to influence and [it] quickly rose – hence it has now been removed from the funding formula, its purpose achieved.”

Mr King added that the ERA’s focus on research quality had now “helped balance sheer output with consideration for its value”, but a current squeeze on block funding raised the question of whether increases in research output would now stall.

“It is a live question how much the government can squeeze the base resources [needed] to employ researchers… while looking for greater output, and in particular targeting all new funds to specific projects, expecting the base university capability to provide half or more of the actual expenditure required.”

To judge by rankings data on the citation impact of research, it is in quality, rather than productivity, that Australia still has a little ground to make up on other nations.

However, its push to increase research volume does not appear to have done citation impact any harm, whereas in other countries, such as France, quality gains do not appear to have kept pace with rising productivity.

And in some nations – most notably Russia – quality appears to have declined on average as productivity has increased (although the effect of the rankings expanding from 2016 to 2019 may be a factor here and in some other countries).

More means better

By and large, however, in the most developed research nations, productivity gains appear to go hand in hand with improvements in citation impact. So does this mean that national policies and evaluations such as the ERA or the research excellence framework in the UK are becoming better at influencing both?

Sergey Kolesnikov, a postdoctoral researcher at Arizona State University’s Center for Organization Research and Design – who has co-authored research on the relationship between productivity and research impact – said that in his view, the better assessment programmes sought not to concentrate too much on one over the other.

“I think that excessive focus either on productivity or quality is equally harmful, especially if the evaluation system is based on a small number of simplistic quantitative indicators, because any indicator is just a poor proxy for a real-world complex phenomenon it measures,” he said.

While the problems of measuring productivity “were well known”, he said, basing decisions on “simple measures” of quality “such as journal rankings or journal impact factors” had “all sorts of negative impacts, too”.

“So, the movement in contemporary evaluation systems that recognises these problems is not just a shift of emphasis from productivity to quality, but rather a move away from simplistic metrics of one or the other towards more systematic assessment that combines various context-based quantitative measures with qualitative assessment and peer review.”

He said that the evolution of policy in Australia over the past decade might be an example of this, and he also highlighted recent updates to the Wellcome Trust’s open-access policy that emphasised assessing research on the “intrinsic merit of the work, not the title of the journal or publisher”.

This “strong push towards more responsible research evaluation practices within higher education institutions” was hopefully also “a sign of future changes on a nationwide level”, Dr Kolesnikov said.

So the hope among researchers themselves might be that future gains in both the productivity and the quality of research will be a byproduct of sophisticated approaches to research assessment rather than direct attempts to influence them.
