An assessment system that evaluates research performance predominantly by journal output and citations is steering academics from developing countries like mine toward chasing quantity over quality. It is also leaving them open to exploitation.
According to a new study by economists Vit Machacek and Martin Srholec, researchers in Indonesia are the second most likely in the world to publish in dubious journals that print articles for a fee without proper scientific peer review, the process in which several experts in the field assess the merit of the research.
These predatory journals prey on academics whose career progression, and therefore salary increases, are determined by credit points. They exploit the processing fees that authors pay to make articles open to the public. They pocket the payment, an average of $178, an amount close to the basic salary of an entry-level lecturer at a state university in Indonesia, without facilitating proper peer review. The papers published by predatory journals are often low-quality, riddled with typographical and grammatical errors.
I run a nonprofit online media outlet that works with scholars to produce evidence-based analyses that laypeople can easily digest. This work, which helps spread knowledge and builds an informed public, earns academics very few credit points in Indonesia. But publishing in journals indexed by international academic databases gives them plenty.
Unfortunately, hundreds of potentially predatory journals have infiltrated academic databases such as Scopus. Machacek and Srholec found that potentially predatory journals that appeared in the database published more than 160,000 articles between 2015 and 2017. Their analysis shows that around 17% of articles, or roughly one in six, produced by researchers in Indonesia and Kazakhstan are published in predatory journals.
Sociologists Anna Severin and Nicola Low have warned that having these low-quality studies in academic databases may spread untrustworthy research into the scientific literature. However, an analysis by researchers in Finland found that articles in predatory journals are rarely cited by other academics, suggesting their influence is limited.
What is clear is that it is a waste of resources. The predatory journal market was estimated at around $74 million in 2014. And academics could have redirected the time they spent on substandard work toward the real hard work of quality research. For many scholars, this would include improving their research and communication skills. That matters, especially for developing countries that need well-trained researchers to build and strengthen their research sectors.
Academics in advanced economies, such as in the U.S. and some European countries, also fall prey to predatory journals. But, Machacek and Srholec’s analysis found academics in medium-level economies with large emerging research sectors are the most susceptible. In addition to Indonesia and Kazakhstan, India, Nigeria, the Philippines and Egypt are in the top twenty.
In Indonesia’s case, government policies in recent years that tied promotion assessments to publication output have succeeded in increasing the number of papers published by Indonesian scholars. Data from Scimago Country & Journal Rank shows that over the five years between 2015 and 2019, Indonesia increased its output by more than 400%, from around 8,000 papers to 44,000.
There is a way to stop this. In the past decade, there has been a movement to change the way research is evaluated. Around the world, governments, science managers, research funders and universities base decisions to hire, promote, grant funds and rank universities on scientometrics, a method that ranks journals and measures academics’ productivity and impact by the number of publications and citations.
Scholars argue these journal-based metrics are not an accurate measure of scientific quality. In addition to the predatory journal problem, the metrics also discourage scientific collaboration. Because they reward article count, academics who want to turn out several journal articles from a data set have an incentive to hold on to it rather than share it for other scientists to analyze.
In 2012, a group of editors and publishers met during the annual meeting of The American Society for Cell Biology and released the Declaration on Research Assessment (DORA). Its general recommendation is to stop using journal-based metrics as a surrogate measure of the quality of research and of individual scientists’ contributions.
They also recommend recognizing the value of all scholarly output, from journal articles, preprints (articles uploaded to open-access platforms that have not yet been peer reviewed), data sets, software, protocols, research materials and well-trained researchers to societal outcomes and policy changes.
The COVID-19 pandemic has shown that speed and collaboration are essential in finding solutions. Reputable journals such as Science and Nature sped up their peer-review processes. And many researchers shared their data sets and uploaded their findings to open-science preprint platforms before submitting them to peer-reviewed journals. The spirit here is not scoring points but working together to solve a global problem.
The Indonesian government should study and follow the DORA recommendations. By moving away from pushing academics to chase journal-based scores and creating a meaningful way to evaluate research, Indonesia will have a better chance of genuinely building and strengthening its research sector and taking an active part in advancing science and providing solutions.
Prodita Sabarini is executive editor of The Conversation Indonesia, a nonprofit online media that brings together academics and journalists to produce evidence-based journalism. She is a 2019 Asia Pacific Obama Foundation Leader.
Published in Nikkei Asia on March 12, 2021