
AI-assisted grant review and its integrity implications: the Russian Science Foundation's experience

June 28, 2022

Funders that distribute large amounts of public research money must make sure those funds are used ethically. By developing and deploying computer-assisted review processes, the Russian Science Foundation aims to overcome the challenges of human bias and offer measures that may help improve review integrity nationwide.

This is the story of how a very small team at the Russian Science Foundation, fewer than 50 people, managed last year to issue 115,300 targeted review invitations and secure more than 58,000 completed reviews. Without the help of algorithms, such work would be impossible. Over the past five years, more than 350,000 review invitations have been issued for project evaluation. Imagine the colossal workload this represents for the nine panel chairs of the Foundation's scientific council, who traditionally selected the most suitable reviewers for each application and scientific report by hand.

We have been working systematically on automating the review mechanisms since 2014, when we introduced assistance in selecting and ranking suitable reviewers according to their field-of-science code. Every year we developed the system further, adding new components and functions such as recognition of various types of conflicts of interest, reviewers' preferred workloads, their past refusals to review, and their preferred keywords. Finally, in 2019 we attempted to almost completely automate the appointment of experts, conducted a comparative analysis of computer and manual assignments, and, most importantly, openly discussed the results with the research community in Russia.
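As a rough illustration of what such a matching component might do (this is not the Foundation's actual code; the field names, weights and scoring rule are assumptions), a reviewer-ranking step could look roughly like this in Python:

```python
# Illustrative sketch only, not the RSF production system: rank candidate
# reviewers for a proposal by field-of-science code and keyword overlap,
# filtering out same-organization conflicts and overloaded reviewers.
from dataclasses import dataclass

@dataclass
class Reviewer:
    name: str
    org: str
    science_code: str        # field-of-science code from the reviewer profile
    keywords: set
    max_load: int            # preferred number of concurrent reviews
    current_load: int = 0
    past_refusals: int = 0   # how often this reviewer has declined invitations

@dataclass
class Proposal:
    title: str
    org: str
    science_code: str
    keywords: set

def rank_reviewers(proposal, candidates):
    # Hard filters: conflicts of interest and preferred workload.
    eligible = [
        r for r in candidates
        if r.org != proposal.org            # same-organization conflict of interest
        and r.current_load < r.max_load     # respect the preferred workload
    ]

    # Soft relevance score: discipline code match, keyword overlap,
    # and a penalty for frequent decliners.
    def score(r):
        code_match = 1.0 if r.science_code == proposal.science_code else 0.0
        keyword_overlap = len(r.keywords & proposal.keywords) / max(len(proposal.keywords), 1)
        refusal_penalty = 0.1 * r.past_refusals
        return 2.0 * code_match + keyword_overlap - refusal_penalty

    return sorted(eligible, key=score, reverse=True)
```

A real system would draw on far richer profile data, but the structure, hard filters for conflicts and workload followed by a soft relevance score, mirrors the components described above.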

Almost 1,300 proposals were submitted to each of the pilot calls, and each proposal required the assignment of three reviewers. The comparison of manual and automatic assignment is summarized below.

| Parameter | Manual | Automatic |
| --- | --- | --- |
| Number of assignments | 2,512 reviewers | 2,524 reviewers |
| Average time to accept a review invitation | 2.3 days | 1.6 days |
| Rejection rate | 31.3% | 37.5% |
| Average time to complete a review | 13.1 days | 13.1 days |
| Reviews completed within a month | 63.3% | 72.9% |

Computer assignment proved advantageous in the average time taken to accept a review invitation, produced a higher share of reviews completed within a month, and drew on a more diverse pool of specialists from the internal database. As for qualitative characteristics, the distribution of evaluation scores in the two samples turned out to be similar, meaning that computer assignment did not affect the quality of review. At the same time, the polarity of assessments decreased somewhat.

We expected that imprecise discipline codes and keywords in reviewer profiles would lead to more rejected assignments, because applications would be matched with unsuitable reviewers. This did not happen.

With AI assistance, overall rejections rose slightly, but rejections due to conflicts of interest or unavailability decreased. The computer does not know the reviewers' competence as well as a panel chair does, but it is more efficient at thorough checks of affiliations: the algorithms verify whether the reviewer is an applicant in the same call, whether the reviewer is employed by the same organization as the applicant, whether the reviewer has already accepted many proposals, and so on. AI is not yet capable of detecting family ties or the complex relations between different groups of researchers in the same field.
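These formal checks amount to simple rules. A minimal sketch, reusing the illustrative Reviewer and Proposal shapes from the earlier example (the field names and the workload threshold are assumptions, not the Foundation's actual rules):

```python
# Sketch of the formal affiliation and workload checks described above.
def passes_formal_checks(reviewer, proposal, call_applicant_names, accepted_counts,
                         max_accepted=10):
    if reviewer.name in call_applicant_names:        # applicant in the same call
        return False
    if reviewer.org == proposal.org:                 # employed by the applicant's organization
        return False
    if accepted_counts.get(reviewer.name, 0) >= max_accepted:  # already holds many proposals
        return False
    return True
```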

The first results of the AI deployment are promising. The percentage of appeals against reviews submitted by applicants fell from 0.32% to 0.28% in the computer-assisted trial. This suggests that AI use in peer review may help improve the process, boost the quality of reviews, strengthen integrity and save panel members considerable time.

Drawing on the big data of past evaluations, we have further digitalized reviewer portraits. Since 2021, RSF has shared with members of the expert council not only the evaluations themselves but also a behavioral profile of each reviewer. This addresses the problem of predominantly “negative” and “positive” reviewers, who, for various subjective reasons, lean towards one polarity or the other in their assessments. If an application has been assigned to a predominantly “positive” reviewer, our new algorithms will try to find an additional reviewer who is more likely to be critical. In this way we try, as far as possible, to move away from subjectivity and the notorious “human factor” in grant evaluation, towards greater honesty, fairness, impartiality and objectivity. We believe the world needs these good things.
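A minimal sketch of this polarity-balancing idea, assuming a numeric score scale and using a reviewer's historical mean score as the “behavioral profile” (both are assumptions for illustration, not the actual RSF profile):

```python
# If the reviewers already assigned to a proposal lean "positive" (their mean
# historical score sits above the overall average), pick the next reviewer from
# the more critical candidates, and vice versa.
from statistics import mean

def pick_balancing_reviewer(assigned_mean_scores, candidates, overall_mean=3.0):
    """assigned_mean_scores: historical mean scores of already-assigned reviewers.
    candidates: mapping of reviewer name -> historical mean score."""
    if mean(assigned_mean_scores) > overall_mean:
        # Panel so far leans positive: prefer the most critical candidate.
        return min(candidates, key=candidates.get)
    # Otherwise prefer the most generous candidate to balance a critical panel.
    return max(candidates, key=candidates.get)

# Example: two lenient reviewers already assigned, so the strictest candidate is chosen.
print(pick_balancing_reviewer([4.2, 3.8], {"A": 4.5, "B": 2.1, "C": 3.0}))  # -> "B"
```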

In the future, RSF also plans to integrate semantic analysis and machine learning into the review system. The enduring tasks are to keep expanding the reviewer database; to refine, update and clarify the data and keywords in reviewer profiles; to standardize reviewers' interests and keywords; to develop algorithms for analyzing reviewers' work (tracking preferences for particular people or organizations, analyzing their publications, and so on); and to create algorithms for detecting “anomalies” such as outlier scores or suspiciously fast reviews. We must also continue an open, honest and trusting dialogue with the research community, whose work these algorithms directly concern.
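Such anomaly flags could start from simple statistical rules. The sketch below assumes each completed review records its score and the hours spent on it; the thresholds are purely illustrative:

```python
# Flag reviews that were completed suspiciously fast or whose score is a
# statistical outlier relative to the other scores in the sample.
from statistics import mean, stdev

def flag_anomalies(reviews, min_hours=2.0, z_cutoff=2.5):
    """Each review dict is assumed to carry 'reviewer', 'score' and 'hours_spent'."""
    scores = [r["score"] for r in reviews]
    mu = mean(scores)
    sigma = stdev(scores) if len(scores) > 1 else 0.0

    flagged = []
    for r in reviews:
        too_fast = r["hours_spent"] < min_hours                          # suspiciously quick review
        outlier = sigma > 0 and abs(r["score"] - mu) / sigma > z_cutoff  # score far from the rest
        if too_fast or outlier:
            flagged.append(r["reviewer"])
    return flagged
```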

This study was presented and discussed at the 7th World Conference on Research Integrity in Cape Town, South Africa, on 31 May 2022. WCRI is the most significant event on the international research integrity calendar. It fosters the exchange of information and the harmonization of efforts to promote good research practice through discussion of ethical challenges such as authorship, plagiarism, non-peer-reviewed research, duplicate submissions, image alteration, paper mills and predatory journals. Research integrity is increasingly seen as a driver of research excellence and public trust, so that society can trust the outcomes of research and researchers can trust each other enough to build upon existing work.
