Masaryk University
Žerotínovo nám. 617/9, 601 77 Brno, Czech Republic
T: +420 549 49 1111, E: info@muni.cz, W: www.muni.cz

CoARA Action Plan
November 2023

Introduction - Evaluation of Research in the Czech Republic and at Masaryk University

Masaryk University actively participated in developing the Agreement on Reforming Research Assessment (ARRA) and joined the Coalition for Advancing Research Assessment (CoARA) in 2022. Responsible evaluation is an important institutional value for Masaryk University.

Historically, research evaluation in the Czech Republic was based chiefly on a single publication indicator, which converted publication points directly into funding. Since 2017, research in the Czech Republic has been evaluated at the national level according to a methodology that covers outputs as well as inputs, the research environment, and strategies, combining quantitative and qualitative approaches. The role of this methodology is, firstly, to monitor R&D performance annually and, secondly, to have research organisations undergo a robust evaluation by evaluation panels over a five-year cycle. However, neither the procedures nor the results of the national methodology can easily be applied within institutions, as the national methodology has a different mission, tools, and level of detail.

In 2022, Masaryk University implemented an evaluation of its own design (Internal Research and Doctoral Studies Evaluation – IRDE) for the first time, as one of only two universities in the Czech Republic to do so. The development of IRDE took place before and during the drafting of the ARRA document; the concept of responsible evaluation has, however, long been a central value for Masaryk University. IRDE draws on the principles of the Standard Evaluation Protocol (SEP) and on experience with the Research Excellence Framework (REF), and adheres to the principles of DORA and the Leiden Manifesto.
In the final phase of preparation, we also monitored the progress of the ARRA preparations and checked whether the evaluation was in line with the ARRA commitments. Although we prepared the evaluation before the SCOPE framework was established, we also checked compliance with this initiative. As a result, the MU research evaluation system is largely in line with both the CoARA commitments and other initiatives, in particular SCOPE.

Evaluation takes place in five-year cycles. MU faculties and institutes participated in the design process from the beginning. The evaluation is based on the self-evaluation of individual faculties, complemented by interviews with evaluation panels during site visits. Self-evaluation reports are predominantly narrative in nature and differentiated at the level of units and doctoral degree programmes. Faculties and institutes have had the opportunity to adapt the structure of the report to their field-specific needs. These reports mainly cover the mission of the unit, its most important research results, social impact case studies, strategies, and data on the research environment. The supporting bibliometric reports were intended only to provide a panoramic view of the evaluated unit’s publication activity, not to serve as a primary evaluation basis; nor did the format of the self-evaluation report allow bibliometrics to substitute for, or form the basis of, the evaluation of a unit.

The purpose of the evaluation is to provide critical feedback for the development of faculties, institutes, and units. The evaluation has no direct impact on central funding (the level of funding is not derived from the evaluation results). However, faculties can use IRDE to develop research strategies, which are a condition for the provision of the contractual part of the research budget. The implementation of the first run was followed by a phase of evaluating and communicating the IRDE outputs.
We consulted the creators of the SCOPE protocol (the INORMS Research Evaluation Group), organised a conference dedicated to responsible evaluation (Science for Society), and are now preparing a summary report that will critically evaluate the whole IRDE process. The report includes the results of a questionnaire survey among evaluators and members of the MU academic community, together with suggestions for future improvements.

Evaluation activities at Masaryk University serve three main purposes, each with its own processes and tools: research evaluation (Internal Research and Doctoral Studies Evaluation), funding, and monitoring (bibliometrics). The logic of this system is that, even though these purposes are complementary, we use the right tool for each purpose. Research evaluation is separate from funding and is not affected by bibliometrics, which are used to track trends and create profiles for the purposes of annual system analysis and R&D monitoring. We use a set of performance indicators for funding, but they carry only partial weight in the budget, and we communicate transparently that their role is limited to the distribution of part of the funding. In doing so, we aim to create an environment that reduces undesirable research incentives. Research evaluation has a clear objective: to provide robust and valid feedback that is unencumbered by bibliometric parameters and financial incentives. CoARA came at a time when we were gradually introducing tools to complete this triad of evaluation activities, keeping the concept of responsible evaluation in mind at every step and drawing on existing initiatives and case studies (DORA, the Leiden Manifesto, SEP, REF, and others).
As a result, Masaryk University is already fulfilling some of the CoARA commitments to a large extent:
— Commitments 1 and 2: Internal Research and Doctoral Studies Evaluation is based on peer review without the influence of bibliometrics and is designed to recognise the diversity of disciplines (faculties have the opportunity to customise self-evaluation reports and include all relevant information).
— Commitment 3: We do not use metrics for inappropriate purposes. Bibliometrics (including the impact factor and quartile rankings) are used to monitor publication trends and publication strategy and, at least at the central level, do not represent a concept of quality or directly influence the evaluation of research or individuals. We use the metrics to allocate only a small portion of the research budget.
— Commitment 4: University rankings in no way affect the evaluation or funding of research at MU. The rankings are only monitored and, where appropriate, used to promote the university externally, i.e. for the purpose they predominantly serve.
— Commitment 5: Implementation of the CoARA commitments forms part of the work of the Centre for Scientometric Support and Evaluation (CSSE), the central unit dedicated to research evaluation, bibliometrics, and teaching activities.

However, what is ensured at the central level cannot always be guaranteed at lower levels, i.e. in the organisational components and units. At the same time, we are aware that reform can only be achieved when there is change at these smaller levels. During implementation of the reform, we will also pursue the following key objectives:
— Simplify the evaluation system (evaluate only when necessary and remove possible duplications).
— Respect field-specific expectations (unforced and context-sensitive implementation).

The Action Plan contains a set of objectives and actions divided into three main areas corresponding to the individual CoARA commitments.
CoARA Commitment Implementation Plan

1 Analysis of evaluation activities and their impacts

Related CoARA commitments:
— Commitment 6: Review and develop research assessment criteria, tools, and processes.
— Commitment 10: Evaluate practices, criteria, and tools based on solid evidence and the state-of-the-art in research on research, and make data openly available for evidence.

Objectives:
1.1. To develop an overview of evaluation activities at MU in research and for individuals (individual evaluation and the career progression of academics).
1.2. To develop an overview of MU’s evaluation processes in research and for individuals (individual evaluation and academic career progression), distinguishing the main methods and the course of these processes across faculties/institutes in order to identify which of them meet or do not meet the CoARA commitments.
— The mapping of evaluation processes will take the form of personal meetings at preselected locations and a questionnaire sent to faculties (vice-deans, research offices) and the divisions of RMU (the MU Rector’s Office).
— To identify situations where bibliometrics/metrics have a direct impact on the evaluation of individuals’ research and careers, and where they may narrow the range of criteria relevant to the evaluation of research work.
1.3. To identify units and committees at MU that deal with topics related to research evaluation and to establish mutual cooperation.
— The analysis will include the administration of personnel policy (Personnel Management Office – EVAK), scientific integrity (Ethics Board), bibliometrics (libraries and the research departments of faculties), open science (Institute of Computer Science), etc.
— A review of directives that define the scope and roles of scientific boards and committees.
1.4. To compile and publish a summary report on IRDE (“evaluation of the evaluation”).
— To evaluate the process, objectives, and impacts of Internal Research and Doctoral Studies Evaluation (IRDE) against the CoARA commitments and to identify potentially problematic situations and processes.
— To publish the IRDE summary report publicly on the MU website.
1.5. In collaboration with the faculties, to analyse field-specific quality expectations and criteria relevant to the evaluation of research and individuals.
— A long-term goal. Criteria and expectations include, for example, publication patterns, the expected level of research activity among academics in the field, the nature of the research activity, the resources and funding required, typical outputs, application potential, science communication, open science, etc.
— Cooperation will take the form of meetings between MU management and faculties as part of annual evaluation interviews, final IRDE meetings, consultations with vice-deans (see also Objective 3.7), and consultations and analyses involving the Centre for Scientometric Support and Evaluation (a specialist unit at the Research Office of the MU Rector’s Office, hereinafter the “CSSE”).

2 Responsible evaluation, research culture, and motivating environment

Related CoARA commitments:
— Commitment 1: Recognise the diversity of contributions to, and careers in, research in accordance with the needs and nature of the research.
— Commitment 2: Base research assessment primarily on qualitative evaluation for which peer review is central, supported by the responsible use of quantitative indicators.
— Commitment 3: Abandon inappropriate uses of journal- and publication-based metrics in research assessment, in particular inappropriate uses of the Journal Impact Factor (JIF) and the h-index.
— Commitment 4: Avoid the use of rankings of research organisations in research assessment.

Objectives:
2.1.
Based on the conducted analysis (Objective 1.2), to identify evaluation processes at MU that do not comply with the CoARA commitments and to update the action plan with specific objectives tailored to the identified gaps and situations.
— At the MU-wide level, to remove metrics from processes and documents where their presence could narrow the range of criteria observed when evaluating research work.
2.2. Based on the conducted analysis (Objective 1.4), to align the conditions, criteria, and processes of the 2027 IRDE with the CoARA commitments and the principles of responsible evaluation in general, in line with Objective 2.4.
2.3. To develop an updated version of the MU policy for good practice in scientific publishing (including work with bibliometrics) as a prerequisite for applying the principles of responsible evaluation.
— Formulation of MU’s values and research priorities against which IRDE will be conducted in 2027.
2.4. To develop a draft MU policy for science evaluation (“Responsible research evaluation at MU”).
— Once approved by MU management, it will serve as the central policy containing the basic principles for all evaluation activities at MU.
2.5. To introduce a contractual component within the internal system for distributing the core research budget (governmental support for research organisations in the Czech Republic) among MU faculties and institutes; this component will have a stabilising character and keep the influence of metrics on faculty/institute budgets (the performance component) at a reasonable level.
2.6. To promote the consideration of field-specific needs towards inclusivity and diversity in research evaluation.
— To initiate face-to-face meetings with the contact persons responsible for setting up assessment in doctoral studies, in qualification procedures, and in the evaluation of academic staff (EVAK) (derived from Objective 1.3), and to support changes that reflect field-specific needs.
— To consider career length, doctoral degree programme duration, and the mission and focus of individuals and departments in research evaluation. This will be linked to an analysis of the publication patterns of academics at different stages of their careers at MU.
— To design research assessment criteria that value all academic activities in the areas of outcomes/outputs, research processes, teaching, and impact, depending on the field.
2.7. To maintain IRDE as the central method of research evaluation at the MU level.
2.8. To maintain the current use of bibliometrics exclusively for strategic, monitoring, and support purposes.
2.9. To maintain the current zero influence of international university rankings on the evaluation of research and follow-up funding at MU.

3 Platform for CoARA implementation and experience sharing

Related CoARA commitments:
— Commitment 5: Commit resources to reforming research assessment as are needed to achieve the organisational changes committed to.
— Commitment 7: Raise awareness of research assessment reform and provide transparent communication, guidance, and training on assessment criteria and processes as well as their use.
— Commitment 8: Exchange practices and experiences to enable mutual learning within and beyond the Coalition.
— Commitment 9: Communicate progress made on adherence to the Principles and implementation of the Commitments.

Objectives:
3.1. To allocate capacity for institutional reform – to retain this task within the work of the RMU CSSE.
3.2. To allocate capacity to support the national environment and to share experiences (especially within CZARMA – the Czech Association of Research Managers and Administrators).
— To organise a national working group within CZARMA to share experiences of preparing action plans and implementing the CoARA commitments among research organisations in the Czech Republic.
3.3.
To communicate evaluation processes within MU through simple and clear infographics, enhancing the transparency, awareness, and involvement of the wider academic community.
— Areas: responsible evaluation at MU and the implementation of CoARA, research funding at MU, the IRDE process.
— Channels: HR newsletter, MU newsletter, vice-deans’ meetings, faculty flyers, posters.
3.4. To establish a training activity at MU dedicated to research evaluation.
— FRESHERS video lecture, HR4MU e-learning.
3.5. To participate in international networks exploring research evaluation systems, bibliometrics, and scholarly communication (research on research).
3.6. To update the website hodnocenivedy.muni.cz with information about CoARA and to establish an English version.
— The website will become the central information point for research evaluation reform.
3.7. To increase the level of collaboration between the RMU CSSE and other RMU departments and faculties.
— Quality Office, Strategy Office, Personnel Management Office, Office for Academic Qualifications, Internal Evaluation Board, Ethics Board, CERPEK, External Relations and Marketing Office.
— To involve faculty stakeholders (vice-deans for research, R&D administration) in actively shaping the content of the reform.
— To report regularly on the progress of the implementation of the CoARA commitments at the forum of vice-deans for research.