
RethinkACA- v3

WG ACA

Created on August 4, 2025


Transcript

RethinkACA: Tools for academic career assessment reform

Consultation / testing phase


Introduction

This framework was developed by the CoARA Working Group on Reforming Academic Career Assessment to help institutions navigate and implement academic career assessment reform.

RethinkACA is an interactive, conceptual framework designed to support institutions in reforming academic career assessment (ACA). It provides practical tools, real-world examples, and contextual information to help you build fairer, more inclusive, and more transparent evaluation processes. This is not a prescriptive model; it is a flexible, user-oriented resource to support institutions at different stages of reform. 👉 Open and accessible — all resources are freely available online. 👉 Tailored — you can explore tools and resources that align with your goals, context and institutional role.

How to use the framework?

The framework is built around three components:

Institutional role (What is your role within the institution? e.g. leadership, HR, assessment board, academic staff)

Institutional goals (What do you want to do? – choose a reform objective, e.g. improving fairness, recognising diverse career paths)

Tools & Resources (What can help you? – Access practical tools and resources)

Start anywhere. You can begin from any point: by selecting your role, your goal, or by browsing all tools. For each role–goal combination, we highlight recommended tools and resources that are especially relevant.


What you'll find for each tool:

A short explanation of why it matters for your selected role and goal

A brief description of the tool or resource and practical ideas for how it can be implemented

Clicking the tool name will open it in a new browser window.

Example of a tool: NOR-CAM – A toolbox for recognition and rewards in academic careers

Additional information


How were the tools selected?

Learn more about the CoARA Working Group on Reforming Academic Career Assessment

Let us know your feedback on this framework!

Where do you want to start?

What do you want to do?

What is your role?

Institutional goals

Institutional role

Tools & Resources

Share your feedback on the framework

What is your role?

* You may hold several roles at the same time in your institution; please choose the one most closely aligned with your intended use of this framework.

Leadership and governance

Recruitment and assessment boards

Administration and management

Academic staff (as candidates)


What do you want to do?

Align organisational ACA processes with national or international initiatives

Promote transparency and fairness

Diversify and recognise career pathways

Integrate equality, diversity, inclusiveness in ACA processes

Foster engagement of the academic community in ACA reforms

Review assessment criteria



What do you want to do?

Align organisational ACA processes with national or international initiatives

Promote transparency and fairness

Diversify and recognise career pathways

Integrate equality, diversity, inclusiveness in ACA processes

Foster engagement of the academic community in ACA reforms

Review assessment criteria


Diversify and recognise career pathways

1. Broadening the scope of domains and activities considered in ACA processes

2. Defining new career pathways (taking into account different career stages)

1. Broadening the scope of domains and activities considered in ACA processes

DORA Rethinking Research Assessment - Building blocks for impact

CoARA - Agreement on Reforming Research Assessment

HR Excellence in Research

UNESCO Recommendation on Open Science

2. Defining new career pathways (taking into account different career stages)

Career Framework for University Teaching (Advancing Teaching Network)

EU ResearchComp

People and Teams UKRI Action Plan

HR Excellence in Research

NOR-CAM – A toolbox for recognition and rewards in academic careers

Review assessment criteria

1. Balancing the use of qualitative and quantitative indicators

2. Developing new indicators (qualitative and/or quantitative)

3. Avoiding the use of inappropriate metrics for assessing academics

1. Balancing the use of qualitative and quantitative indicators

The Hong Kong Principles for assessing researchers: fostering research integrity

2. Developing new indicators (qualitative and/or quantitative)

CoARA - Agreement on Reforming Research Assessment

3. Avoiding the use of inappropriate metrics for assessing academics

CoARA - Agreement on Reforming Research Assessment

CLACSO-FOLEC Declaration of principles

Publication Forum classification (Finland)

DORA Reformscape

INORMS - Tools for rethinking global university rankings

Promote transparency and fairness

1. Ensuring transparency and fairness in ACA processes

2. Informing candidates on assessment criteria

3. Providing training to assessors and assessees

1. Ensuring transparency and fairness in ACA processes

DORA - SPACE Rubric

SCOPE model

HR Excellence in Research

DORA - Ideas for Action

2. Informing candidates on assessment criteria

HR Excellence in Research

3. Providing training to assessors and assessees

Finland - Guidelines on Research Integrity (Responsible conduct of research)

Recognising and rewarding open research maturity framework

Foster engagement of the academic community in ACA reforms

1. Raising awareness on the need for reform

2. Engaging a variety of institutional actors in ACA reforms

1. Raising awareness on the need for reform

Interview (Netherlands - Rewards and Recognition)

Five good practices (Netherlands - Rewards and Recognition)

DORA Reformscape

DORA - Ideas for Action

2. Engaging a variety of institutional actors in ACA reforms

Open, Transparent and Merit-Based Recruitment of Researchers (OTM-R)

Recognising and rewarding open research maturity framework

Recognition and Rewards Dialogue toolkit

People and Teams UKRI Action Plan

Position Statement and Recommendations on Research Assessment Processes (Science Europe)

Helsinki Initiative on Multilingualism in Scholarly Communication

The Metric Tide Report

Integrate equality, diversity, inclusiveness in ACA processes

CLACSO-FOLEC Declaration of principles

HR Excellence in Research

DORA - Ideas for Action

DORA - SPACE Rubric

YUFE4Postdocs - Evaluation and Selection Procedure

People and Teams UKRI Action Plan

Align organisational ACA processes with national or international initiatives

Finland - Self-evaluation tool for culture of open scholarship services

DORA Ideas for Optimization: Five Things to Consider for Narrative CVs

Policy for Open Scholarship (Finland)

People and Teams UKRI Action Plan

Helsinki Initiative on Multilingualism in Scholarly Communication

UNESCO Recommendation on Open Science

What do you want to do?

Align organisational ACA processes with national or international initiatives

Promote transparency and fairness

Diversify and recognise career pathways

Integrate equality, diversity, inclusiveness in ACA processes

Foster engagement of the academic community in ACA reforms

Review assessment criteria


Diversify and recognise career pathways

1. Broadening the scope of domains and activities considered in ACA processes

2. Defining new career pathways (taking into account different career stages)

1. Broadening the scope of domains and activities considered in ACA processes

The Hong Kong Principles for assessing researchers: fostering research integrity

Recognising and rewarding open research maturity framework

EU ResearchComp

Qualification Portfolio UMC Utrecht

2. Defining new career pathways (taking into account different career stages)

People and Teams UKRI Action Plan

Review assessment criteria

1. Balancing the use of qualitative and quantitative indicators

2. Developing new indicators (qualitative and/or quantitative)

3. Avoiding the use of inappropriate metrics for assessing academics

1. Balancing the use of qualitative and quantitative indicators

Stay tuned for tools and resources here.

2. Developing new indicators (qualitative and/or quantitative)

NOR-CAM – A toolbox for recognition and rewards in academic careers

DORA Rethinking Research Assessment - Building blocks for impact

UNESCO Recommendation on Open Science

Career Framework for University Teaching (Advancing Teaching Network)

3. Avoiding the use of inappropriate metrics for assessing academics

Stay tuned for tools and resources here.

Promote transparency and fairness

1. Ensuring transparency and fairness in ACA processes

2. Informing candidates on assessment criteria

3. Providing training to assessors and assessees

1. Ensuring transparency and fairness in ACA processes

YUFE4Postdocs - Evaluation and Selection Procedure

2. Informing candidates on assessment criteria

Open, Transparent and Merit-Based Recruitment of Researchers (OTM-R)

3. Providing training to assessors and assessees

Position Statement and Recommendations on Research Assessment Processes (Science Europe)

Career Framework for University Teaching (Advancing Teaching Network)

YUFE4Postdocs - Evaluation and Selection Procedure

UNESCO Recommendation on Open Science

UC Berkeley - Rubric to Assess Faculty Candidates

Foster engagement of the academic community in ACA reforms

1. Raising awareness on the need for reform

2. Engaging a variety of institutional actors in ACA reforms

1. Raising awareness on the need for reform

DORA - Unintended Cognitive and Systems Biases

Five good practices (Netherlands - Rewards and Recognition)

DORA - Ideas for Action

2. Engaging a variety of institutional actors in ACA reforms

Recognition and Rewards Dialogue toolkit

Integrate equality, diversity, inclusiveness in ACA processes

DORA Ideas for Optimization: Five Things to Consider for Narrative CVs

UC Berkeley - Rubric to Assess Faculty Candidates

People and Teams UKRI Action Plan

Align organisational ACA processes with national or international initiatives

Stay tuned for tools and resources here.

What do you want to do?

Align organisational ACA processes with national or international initiatives

Promote transparency and fairness

Diversify and recognise career pathways

Integrate equality, diversity, inclusiveness in ACA processes

Foster engagement of the academic community in ACA reforms

Review assessment criteria


Diversify and recognise career pathways

1. Broadening the scope of domains and activities considered in ACA processes

2. Defining new career pathways (taking into account different career stages)

1. Broadening the scope of domains and activities considered in ACA processes

Guide for reviewers/evaluators using impact indicators (UMC Utrecht)

Helsinki Initiative on Multilingualism in Scholarly Communication

The Leiden Manifesto for research metrics

2. Defining new career pathways (taking into account different career stages)

Stay tuned for tools and resources here.

Review assessment criteria

1. Balancing the use of qualitative and quantitative indicators

2. Developing new indicators (qualitative and/or quantitative)

3. Avoiding the use of inappropriate metrics for assessing academics

1. Balancing the use of qualitative and quantitative indicators

NOR-CAM – A toolbox for recognition and rewards in academic careers

CoARA - Agreement on Reforming Research Assessment

Guide for reviewers/evaluators using impact indicators (UMC Utrecht)

DORA Reformscape


DORA Reimagining Academic Career Assessment: Stories of Innovation and Change

Recognising and rewarding open research maturity framework

DORA - SPACE Rubric


Position Statement and Recommendations on Research Assessment Processes (Science Europe)

Open, Transparent and Merit-Based Recruitment of Researchers (OTM-R)

The Leiden Manifesto for research metrics

The Metric Tide Report

UC Berkeley - Rubric to Assess Faculty Candidates

2. Developing new indicators (qualitative and/or quantitative)

Guide for reviewers/evaluators using impact indicators (UMC Utrecht)

Qualification Portfolio UMC Utrecht

3. Avoiding the use of inappropriate metrics for assessing academics

The Hong Kong Principles for assessing researchers: fostering research integrity

Publication Forum classification (Finland)

The Leiden Manifesto for research metrics

The Metric Tide Report

NOR-CAM – A toolbox for recognition and rewards in academic careers

Promote transparency and fairness

1. Ensuring transparency and fairness in ACA processes

2. Informing candidates on assessment criteria

3. Providing training to assessors and assessees

1. Ensuring transparency and fairness in ACA processes

DORA - Debiasing Committee Composition and Deliberative Processes

Open, Transparent and Merit-Based Recruitment of Researchers (OTM-R)

Helsinki Initiative on Multilingualism in Scholarly Communication

The Hong Kong Principles for assessing researchers: fostering research integrity

Position Statement and Recommendations on Research Assessment Processes (Science Europe)

Guide for reviewers/evaluators using impact indicators (UMC Utrecht)

UC Berkeley - Rubric to Assess Faculty Candidates

The Leiden Manifesto for research metrics

2. Informing candidates on assessment criteria

Stay tuned for tools and resources here.

3. Providing training to assessors and assessees

DORA - Debiasing Committee Composition and Deliberative Processes

DORA - Practical Guide Evaluators

ANECA new code of ethics

SCOPE model

Foster engagement of the academic community in ACA reforms

1. Raising awareness on the need for reform

2. Engaging a variety of institutional actors in ACA reforms

1. Raising awareness on the need for reform

DORA - Unintended Cognitive and Systems Biases

DORA - Ideas for Action

2. Engaging a variety of institutional actors in ACA reforms

Stay tuned for tools and resources here.

Integrate equality, diversity, inclusiveness in ACA processes

Stay tuned for tools and resources here.

Align organisational ACA processes with national or international initiatives

Stay tuned for tools and resources here.

What do you want to do?

Align organisational ACA processes with national or international initiatives

Promote transparency and fairness

Diversify and recognise career pathways

Integrate equality, diversity, inclusiveness in ACA processes

Foster engagement of the academic community in ACA reforms

Review assessment criteria


Diversify and recognise career pathways

1. Broadening the scope of domains and activities considered in ACA processes

2. Defining new career pathways (taking into account different career stages)

1. Broadening the scope of domains and activities considered in ACA processes

NOR-CAM – A toolbox for recognition and rewards in academic careers

Career Framework for University Teaching (Advancing Teaching Network)

Qualification Portfolio UMC Utrecht

2. Defining new career pathways (taking into account different career stages)

NOR-CAM – A toolbox for recognition and rewards in academic careers

DORA Rethinking Research Assessment - Building blocks for impact

Review assessment criteria

1. Balancing the use of qualitative and quantitative indicators

2. Developing new indicators (qualitative and/or quantitative)

3. Avoiding the use of inappropriate metrics for assessing academics

1. Balancing the use of qualitative and quantitative indicators

Stay tuned for tools and resources here.

2. Developing new indicators (qualitative and/or quantitative)

Stay tuned for tools and resources here.

3. Avoiding the use of inappropriate metrics for assessing academics

Stay tuned for tools and resources here.

Promote transparency and fairness

1. Ensuring transparency and fairness in ACA processes

2. Informing candidates on assessment criteria

3. Providing training to assessors and assessees

1. Ensuring transparency and fairness in ACA processes

Finland - The researcher’s curriculum vitae (CV)

ANECA narrative CV template

2. Informing candidates on assessment criteria

ANECA new accreditation procedure

SCOPE model

3. Providing training to assessors and assessees

Finland - Guidelines on Research Integrity (Responsible conduct of research)

People and Teams UKRI Action Plan

Foster engagement of the academic community in ACA reforms

1. Raising awareness on the need for reform

2. Engaging a variety of institutional actors in ACA reforms

1. Raising awareness on the need for reform

DORA - Unintended Cognitive and Systems Biases

DORA - Ideas for Action

2. Engaging a variety of institutional actors in ACA reforms

Researchers' views on diversity of career assessment criteria in Finland: a survey report

EU ResearchComp

Recognition and Rewards Dialogue toolkit

Integrate equality, diversity, inclusiveness in ACA processes

Stay tuned for tools and resources here.

Align organisational ACA processes with national or international initiatives

Stay tuned for tools and resources here.

Resources

Practical tools

Tools


Practical tools

Practical, adaptable methods or instruments used to directly evaluate, measure, or improve specific aspects of an academic career. Designed for action and change in assessment practices.

DORA Rethinking Research Assessment - Building blocks for impact

Guide for reviewers/evaluators using impact indicators (UMC Utrecht)

YUFE4Postdocs - Evaluation and Selection Procedure

Recognition and Rewards Dialogue toolkit

Recognising and rewarding open research maturity framework

DORA Ideas for Optimization: Five Things to Consider for Narrative CVs


EU ResearchComp

Finland - Self-evaluation tool for culture of open scholarship services

DORA - SPACE Rubric

SCOPE model


NOR-CAM – A toolbox for recognition and rewards in academic careers

Career Framework for University Teaching (Advancing Teaching Network)

ANECA narrative CV template

Qualification Portfolio UMC Utrecht

DORA - Debiasing Committee Composition and Deliberative Processes

Open, Transparent and Merit-Based Recruitment of Researchers (OTM-R)

Finland - The researcher’s curriculum vitae (CV)

ANECA new accreditation procedure

Resources

Informative materials that provide background, context, or guidance on academic career assessment. Support reflection and help inform the use or development of tools.

CoARA - Agreement on Reforming Research Assessment

Finland - Guidelines on Research Integrity (Responsible conduct of research)

CLACSO-FOLEC Declaration of principles

People and Teams UKRI Action Plan


Five good practices (Netherlands - Rewards and Recognition)

DORA Reformscape

Publication Forum classification (Finland)

INORMS - Tools for rethinking global university rankings

DORA - Ideas for Action

Policy for Open Scholarship (Finland)


DORA - Unintended Cognitive and Systems Biases

Researchers' views on diversity of career assessment criteria in Finland: a survey report

Interview (Netherlands - Rewards and Recognition)

ANECA new code of ethics


DORA Reimagining Academic Career Assessment: Stories of Innovation and Change

The Leiden Manifesto for research metrics

DORA - Practical Guide Evaluators


Position Statement and Recommendations on Research Assessment Processes (Science Europe)

Helsinki Initiative on Multilingualism in Scholarly Communication

HR Excellence in Research

UNESCO Recommendation on Open Science

The Hong Kong Principles for assessing researchers: fostering research integrity

Publication Forum classification (Finland)

The Metric Tide Report

Help us improve this framework:

Share your feedback

Overview

The document builds on a 2019 study by Science Europe and Technopolis Group, examining how research organisations conduct assessment for funding and career progression. Its purpose is to strengthen existing processes—ensuring they are effective, efficient, fair, transparent, and aligned with evolving research practices such as open science and AI. The Statement issues seven core recommendations covering: Transparency; Evaluating robustness; Bias mitigation; Cost/efficiency; Reviewer diversity; Qualitative assessment; and Novel approaches.

More info

Overview

The SPACE rubric is a tool developed by DORA in collaboration with Ruth Schmidt to help academic institutions, at any stage of assessment reform, measure and improve their ability to support the development and implementation of new academic assessment practices and activities. It focuses on five core capabilities: Standards for Scholarship, Process Mechanics and Policies, Accountability, Culture within Institutions, and Evaluative and Iterative Feedback. Within each category, three levels of institutional “maturity” are considered (foundation, expansion and scaling), allowing each organisation to establish its baseline. The rubric can also be used to retroactively analyse how strengths or gaps in institutional conditions may have helped or hindered concrete interventions targeted at specific types of academic assessment activity, such as hiring, promotion, tenure, or even graduate student evaluation.

More info

Balancing the use of qualitative and quantitative indicators

Promoting a more holistic assessment approach that reflects the breadth and depth of academic contributions and integrates a qualitative dimension (e.g. expert judgment, narratives) with metrics used appropriately and responsibly.

Encouraging clarity, consistency, and impartiality in assessment processes to build trust and accountability.

Recruitment and assessment boards

Responsible for conducting academic evaluations by reviewing applications and assessing qualifications, and for making recommendations or decisions on appointment or promotion. Examples of roles: Faculty recruitment committees; Promotion and tenure committees; Ethics & Diversity committees; External peer-reviewers.

Why is this relevant?

The aim of the guideline is to promote good and responsible research practices and to prevent violations of research integrity across all academic disciplines. It is aimed at research organisations, researchers and students in Finnish higher education.

Administration and management

Responsible for operationalising academic assessment processes by implementing procedures and ensuring compliance with policies and regulations. Examples of roles: Deans of faculties/schools; Department chairs/heads; Academic Affairs Office; Human Resources (HR) Department.

Why is this relevant?

Career breaks or variations in the chronological order of CVs should not be penalised, but should instead be regarded as part of the evolution of a career.

Overview

The SCOPE framework is a practical, step-by-step model developed by the International Network of Research Management Societies (INORMS) Research Evaluation Group to guide responsible research evaluation. It aims to bridge the gap between high-level principles (like DORA and the Leiden Manifesto) and their practical implementation in research assessment. SCOPE is a five-stage process:

  1. Start with what you value: Identify and articulate the core values and objectives driving the evaluation.
  2. Context considerations: Understand the specific context, including disciplinary norms and institutional goals.
  3. Options for measuring: Explore various qualitative and quantitative methods suitable for the evaluation.
  4. Probe deeply: Critically assess chosen methods for biases and unintended consequences.
  5. Evaluate your evaluation: Reflect on the evaluation process to ensure it aligns with initial values and objectives.
This framework encourages evaluations that are value-driven, context-sensitive, and methodologically sound.

More info

Raising awareness on the need for reform

Building institutional momentum by engaging academic communities in discussions about the limitations of current systems and the potential benefits of reform.

Overview

The Metric Tide (HEFCE, July 2015), led by Prof. James Wilsdon, provides a comprehensive, evidence-based evaluation of how quantitative metrics—like citation counts, altmetrics, journal impact factors—are used and misused in research assessment and management. It examines the balance between peer review and metrics, exploring historical trends, disciplinary differences, and the unintended consequences of metric-driven cultures, such as “gaming” behaviours and impacts on equality, diversity, and interdisciplinarity.

More info

Encouraging institutions to align their assessment frameworks and practices with broader principles and initiatives at national or international levels.

Overview

The ANECA (National Agency for Quality Assessment and Accreditation of Spain) accreditation framework establishes the evaluation criteria for academic career progression in Spain. It provides a structured approach for assessing teaching, research, and institutional contributions for faculty accreditation at different career levels (Lecturer and University Professor). The framework ensures transparency, consistency, and merit-based assessment in academic hiring and promotion, defining evaluation criteria across four dimensions of merits and competences: A. Research, transfer and exchange of knowledge; B. Teaching; C. Leadership; and D. Professional Activity.

More info

Engaging a variety of institutional actors in ACA reforms

Promoting collective ownership of assessment change processes by involving various institutional actors (e.g. leadership, HR, funders, academics, and support staff) in co-design and implementation.

Focus on expanding the assessment of academic careers to include a wider range of roles, activities, and contributions, emphasising recognition of multiple, flexible career paths that reflect diverse profiles and professional trajectories.


Why is this relevant?

Each researcher can use the Framework as a starting point to assess their competencies and address their own needs. It provides guidance and helps to understand and promote the cultural change needed.

Overview

The Publication Forum (JUFO) classification, developed by the Finnish scientific community, evaluates the average quality of academic publication channels—such as journals, book publishers, and conferences—by assigning them to levels 0 to 3, with level 3 indicating the highest quality. Its primary purpose is to support the assessment of research output quality at the institutional level, particularly within Finland's university funding model. The classification emphasizes responsible research evaluation practices, aligning with international standards like DORA, the Leiden Manifesto, and the Metric Tide report.

More info

Why is this relevant?

This document identifies strategies for including more perspectives and reducing biases in the evaluation processes for hiring, promotion, tenure, and funding decisions.

Overview

The Recognition & Rewards Dialogue Toolkit is a structured guide developed to support Dutch research institutions in fostering meaningful conversations about academic career assessment. It is a central component of the national Recognition & Rewards programme, which aims to broaden the criteria for recognising academic contributions beyond traditional metrics, encompassing areas like teaching, leadership, societal impact, and collaboration. The Toolkit provides a step-by-step framework for organising dialogue sessions within institutions. It guides users through defining goals, selecting participants, designing discussions, and facilitating follow-up actions. The emphasis is on addressing 'adaptive' challenges—complex issues without clear solutions—through iterative, inclusive dialogue.

More info

Why is this relevant?

The CoARA Agreement explicitly calls on institutions to recognise a broader array of academic activities beyond research publications—such as teaching, mentoring, leadership, open science, and societal engagement—as valuable and valid contributions in academic career assessment. Institutional leadership is the primary enabler of structural, cultural, and procedural shifts needed to broaden the scope of academic assessment.

Overview

The Recognition & Rewards Dialogue Toolkit is a structured guide developed to support Dutch research institutions in fostering meaningful conversations about academic career assessment. It is a central component of the national Recognition & Rewards programme, which aims to broaden the criteria for recognising academic contributions beyond traditional metrics, encompassing areas like teaching, leadership, societal impact, and collaboration. The Toolkit provides a step-by-step framework for organising dialogue sessions within institutions. It guides users through defining goals, selecting participants, designing discussions, and facilitating follow-up actions. The emphasis is on addressing 'adaptive' challenges—complex issues without clear solutions—through iterative, inclusive dialogue.

More info

Rethinking what is valued and measured in academic assessment, moving beyond traditional metrics to more holistic evaluation methods, profiles and professional trajectories.

Overview

The Recognition & Rewards Dialogue Toolkit is a structured guide developed to support Dutch research institutions in fostering meaningful conversations about academic career assessment. It is a central component of the national Recognition & Rewards programme, which aims to broaden the criteria for recognising academic contributions beyond traditional metrics, encompassing areas like teaching, leadership, societal impact, and collaboration. The Toolkit provides a step-by-step framework for organising dialogue sessions within institutions. It guides users through defining goals, selecting participants, designing discussions, and facilitating follow-up actions. The emphasis is on addressing 'adaptive' challenges—complex issues without clear solutions—through iterative, inclusive dialogue.

More info

Overview

UC Berkeley’s rubric aims to standardize and increase transparency, equity, and consistency in faculty searches. It provides a structured template where committees define categories (e.g. Research, Teaching, DEI), assign explicit scoring scales (typically 1–5), and clarify the evidence that informs how scores are assigned. A core element is the calibration exercise, where committee members pre-score sample applications, compare outcomes, discuss discrepancies, and recalibrate scoring norms before evaluating the full candidate pool. Sample categories include Research (productivity, alignment, plans) and Teaching & Mentoring (experience, interests, mentoring approach); a separate rubric addresses Contributions to Diversity, Equity, and Inclusion (DEI).

More info

Overview

Rethinking Research Assessment: Ideas for Action is a resource developed by DORA that outlines five common myths about research evaluation, which frequently relies on indicators like Journal Impact Factor (JIF) and similar measures as proxies for quality in research, promotion, and tenure decisions. To counter these myths, the resource offers five design principles to help research-intensive institutions experiment with and develop better research assessment practices: 1. Instill standards and structure into research assessment processes. 2. Foster a sense of personal accountability among faculty and staff. 3. Prioritize equity and transparency in research assessment. 4. Take a big picture or portfolio view toward researcher contributions. 5. Refine research assessment processes through iterative feedback.

More info

Why is this relevant?

The OTM‑R framework urges institutions to involve diverse stakeholders (e.g., international experts, HR, research units) and share best practices across units. Institutional leadership is critical for coordinating these multi-actor, strategic reforms.

Overview

ResearchComp is a European framework outlining the transversal skills that researchers need for successful and interoperable careers in all sectors of society (academia, industry, business, public administration, NGOs etc.). It establishes a common language and a common understanding of researchers’ transversal competences in areas such as knowledge creation, communication, leadership, and impact. The framework aims to align research careers with broader labor market needs, helping institutions, funders, and policymakers establish transparent career progression pathways.

More info

Why is this relevant?

DORA Rethinking Research Assessment: Building Blocks for Impact illustrates the wide variety of academic achievements and outcomes that could be considered “impactful”.

Overview

The Initiative advocates that scholarly communication should not be monolingual. It calls for balanced multilingual dissemination to ensure both the global excellence and the local relevance of research. Core principles: support researchers in sharing results beyond academia in varied languages; safeguard national-language publishing infrastructure and support the transition to Open Access; and integrate language diversity into research evaluation, funding, and metrics.

More info

Providing training to assessors and assessees

Covers initiatives for ongoing capacity-building for evaluators and candidates to enhance understanding and application of new assessment principles, criteria and practices.

Overview

The Finnish 'Policy for Open Scholarship' provides a national framework to support open science practices across research institutions. Its aim is to embed openness into the research culture, fostering transparency, reproducibility, and societal impact. Key pillars include open access to research publications, data, methods, and education. The policy encourages systemic change in research evaluation, rewarding openness in academic careers.

More info

Administration and management

Responsible for operationalising academic assessment processes by implementing procedures and ensuring compliance with policies and regulations. Examples of roles: Deans of faculties/schools; Department chairs/heads; Academic Affairs Office; Human Resources (HR) Department

Providing training to assessors and assessees

Covers initiatives for ongoing capacity-building for evaluators and candidates to enhance understanding and application of new assessment principles, criteria and practices.

Developing new indicators (qualitative and/or quantitative)

Creating and using innovative indicators that capture diverse contributions.

Overview

The Self-Evaluation Tool for Culture of Open Scholarship Services assists research organizations in assessing and developing their open science services. It supports the Policy for Open Scholarship by providing concrete measures and criteria—ranging from minimum to optimal levels—across areas such as evaluation, education, research data, and publications. The tool facilitates alignment with national open science policies and contributes to the national monitoring model for open science maturity.

More info

Why is this relevant?

Case studies show how some institutions have pivoted from purely metric-driven assessment to narrative CVs, peer judgment, and other contextual evaluations. This equips boards with practical alternatives and frameworks that can be adopted.

Help us improve this framework — share your feedback!

Overview

The Norwegian Career Assessment Matrix (NOR-CAM) is a framework developed by Universities Norway (UHR) to enhance the recognition and rewards system in academic careers. It aims to provide a more flexible and holistic approach to evaluating academic performance by emphasizing transparency, breadth, and systematic comprehensive assessments, supplementing the responsible use of quantitative metrics for scientific publications. NOR-CAM encourages assessment across six competency areas: 1) research output, 2) research process, 3) pedagogical competencies, 4) societal impact, 5) academic leadership and 6) other competencies, as well as aligning evaluations with Open Science principles.

More info

Informing candidates on assessment criteria

Importance of providing candidates with clear and accessible information about what is expected, which criteria are used and how assessments are conducted.

Why is this relevant?

The Initiative targets multiple actors—including policy-makers, funders, universities, libraries, and researchers—to collaboratively support multilingual communication. Institutional leadership should engage these stakeholders to co-develop multilingual-friendly assessment practices.

Balancing the use of qualitative and quantitative indicators

Promoting a more holistic assessment approach, reflecting the breadth and depth of academic contributions, that integrates a qualitative dimension (eg. expert judgment, narratives) with metrics used appropriately and responsibly.

Why is this relevant?

Reformscape functions as a learning and advocacy tool for institutions seeking reform. It raises awareness through real-world examples, comparative insights, and benchmarking, which are all critical to building institutional momentum for change.

Engaging a variety of institutional actors in ACA reforms

Promoting collective ownership of assessment change processes by involving various institutional actors (eg. leadership, HR, funders, academics, and support staff) in co-design and implementation.

Overview

The Researcher’s Curriculum Vitae (CV) Template, developed by the Finnish National Board on Research Integrity (TENK) in collaboration with Universities Finland UNIFI, the Rectors’ Conference of Finnish Universities of Applied Sciences Arene, and the Academy of Finland, aims to standardize the presentation of researchers' qualifications and achievements. Its primary purpose is to ensure that CVs comprehensively, truthfully, and comparably reflect an individual's studies, professional career, academic merits, and other accomplishments. This standardization promotes responsible conduct of research by providing clear guidelines on how to document and present one's credentials.

More info

Overview

Rethinking Research Assessment: Ideas for Action is a resource developed by DORA that outlines five common myths about research evaluation, which frequently relies on indicators like Journal Impact Factor (JIF) and similar measures as proxies for quality in research, promotion, and tenure decisions. To counter these myths, the resource offers five design principles to help research-intensive institutions experiment with and develop better research assessment practices: 1. Instill standards and structure into research assessment processes. 2. Foster a sense of personal accountability among faculty and staff. 3. Prioritize equity and transparency in research assessment. 4. Take a big picture or portfolio view toward researcher contributions. 5. Refine research assessment processes through iterative feedback.

More info

Overview

This resource by DORA aims to help identify and mitigate biases in research assessment. It outlines seven personal biases that can affect hiring, promotion, and tenure decisions, as well as four institutional and infrastructural implications of these biases. It also provides strategies to develop new institutional conditions that reduce bias.

More info

Overview

The Finnish Guidelines on Research Integrity (RI Guidelines), issued by the Finnish National Board on Research Integrity (TENK) in 2023, aim to promote good research practices and a responsible research culture across all disciplines. They provide a framework for self-regulation within the research community, outlining procedures for handling alleged violations of research integrity. Key updates include shortened investigation timelines, the introduction of Research Integrity Advisers, alignment with international classification of violations, and the inclusion of severity assessments for violations.

More info

Overview

The SPACE rubric is a tool developed by DORA in collaboration with Ruth Schmidt to help academic institutions at any stage of academic assessment reform measure and improve their institutional ability to support the development and implementation of new academic assessment practices and activities. It focuses on five core capabilities: Standards for Scholarship, Process Mechanics and Policies, Accountability, Culture within Institutions, and Evaluative and Iterative Feedback. Within each category, three levels of institutional “maturity” are considered (foundation, expansion and scaling), allowing each organisation to establish their baseline. The rubric can also be used to retroactively analyze how strengths or gaps in institutional conditions may have impacted the outcomes of concrete interventions targeted to specific types of academic assessment activities—such as hiring, promotion, tenure, or even graduate student evaluation—either helping or hindering progress toward those goals.

More info

Emphasising the need for clarity, consistency, and impartiality in assessment processes to build trust and accountability.

Ensuring transparency and fairness in ACA processes

Clearly defined, openly communicated procedures and criteria that ensure equitable treatment of candidates.

Recognising the importance of involving the wide academic community, at all seniority levels, in reflections, design and implementation of academic career assessment reforms.

Why is this relevant?

The survey is intended for researchers to identify which competencies and skills should be considered in assessments.

Overview

The document builds on a 2019 study by Science Europe and Technopolis Group, examining how research organisations conduct assessment for funding and career progression. Its purpose is to strengthen existing processes—ensuring they are effective, efficient, fair, transparent, and aligned with evolving research practices such as open science and AI. The Statement issues seven core recommendations covering: Transparency; Evaluating robustness; Bias mitigation; Cost/efficiency; Reviewer diversity; Qualitative assessment; and Novel approaches.

More info

Why is this relevant?

Overall, the Plan concerns the development of a strategy on career pathways and career advancement for academic and technical staff. Administration and management can refer to these guidelines to assess their progress on the criteria adopted in administrative and managerial decision-making.

Why is this relevant?

This tool offers clear guidance for developing and evaluating assessment criteria. Recruitment and assessment boards can use it to refine their criteria and to train assessors and assessees.

Overview

The Self-Evaluation Tool for Culture of Open Scholarship Services assists research organizations in assessing and developing their open science services. It supports the Policy for Open Scholarship by providing concrete measures and criteria—ranging from minimum to optimal levels—across areas such as evaluation, education, research data, and publications. The tool facilitates alignment with national open science policies and contributes to the national monitoring model for open science maturity.

More info

Leadership and governance

Responsible for defining the overarching principles, strategy and criteria guiding academic assessment. Endorsing final decisions and overseeing assessment outcomes. Examples of roles: Rector / President / Director; Vice-Rector / Vice-President; Deans of faculties/schools; Department chairs/heads

Overview

The Career Framework for University Teaching offers a structured, evidence-based pathway to recognise and reward teaching contributions alongside research. Developed between 2015 and 2018 by the Royal Academy of Engineering and global partners, it outlines four progressive levels:

  • Effective teacher
  • Skilled & collegial teacher
  • Institutional leader or Scholarly teacher (dual paths)
  • National & global leader

For each level, it defines (a) sphere of impact, (b) promotion criteria, and (c) forms of evidence to demonstrate achievement.

More info

Why is this relevant?

By endorsing the UNESCO Recommendation, institutions align their assessment frameworks with an international standard for open science, supporting multi‑stakeholder collaboration and global consistency. This strategic alignment must be driven by institutional leadership.

More info

Providing training to assessors and assessees

Covers initiatives for ongoing capacity-building for evaluators and candidates to enhance understanding and application of new assessment principles, criteria and practices.

Ensuring transparency and fairness in ACA processes

Clearly defined, openly communicated procedures and criteria that ensure equitable treatment of candidates.

Rethinking what is valued and measured in academic assessment, moving beyond traditional metrics to more holistic evaluation methods, profiles and professional trajectories.

Overview

The Recognising and Rewarding Open Research Maturity Framework and Self-Assessment Tool was developed to support research organisations in evaluating and improving their approaches to recognising and rewarding open research practices. The tool provides a structured self-assessment framework to help institutions understand their current maturity levels and identify areas for enhancement. It promotes cultural and procedural change in research assessment, aligning with principles of transparency, inclusivity, and integrity.

More info

Ensuring transparency and fairness in ACA processes

Clearly defined, openly communicated procedures and criteria that ensure equitable treatment of candidates.

Overview

The Norwegian Career Assessment Matrix (NOR-CAM) is a framework developed by Universities Norway (UHR) to enhance the recognition and rewards system in academic careers. It aims to provide a more flexible and holistic approach to evaluating academic performance by emphasizing transparency, breadth, and systematic comprehensive assessments, supplementing the responsible use of quantitative metrics for scientific publications. NOR-CAM encourages assessment across six competency areas: 1) research output, 2) research process, 3) pedagogical competencies, 4) societal impact, 5) academic leadership and 6) other competencies, as well as aligning evaluations with Open Science principles.

More info

Why is this relevant?

UNESCO encourages institutions to value open science practices—such as open access, FAIR data, multilingual dissemination, and citizen science—as integral scholarly contributions. Institutional leadership must formally recognise and integrate these into criteria guiding career assessment.

Why is this relevant?

ANECA's new Code of Ethics sets out the values and standards of behaviour that those involved in ANECA's activities need to take into account in their tasks and roles.

Engaging a variety of institutional actors in ACA reforms

Promoting collective ownership of assessment change processes by involving various institutional actors (eg. leadership, HR, funders, academics, and support staff) in co-design and implementation.

Balancing the use of qualitative and quantitative indicators

Promoting a more holistic assessment approach, reflecting the breadth and depth of academic contributions, that integrates a qualitative dimension (eg. expert judgment, narratives) with metrics used appropriately and responsibly.

Overview

UC Berkeley’s rubric aims to standardize and increase transparency, equity, and consistency in faculty searches. It provides a structured template where committees define categories (e.g. Research, Teaching, DEI), assign explicit scoring scales (typically 1–5), and clarify the evidence that informs how scores are assigned. A core element is the calibration exercise, where committee members pre-score sample applications, compare outcomes, discuss discrepancies, and recalibrate scoring norms before evaluating the full candidate pool. Sample categories include Research (productivity, alignment, plans) and Teaching & Mentoring (experience, interests, mentoring approach); a separate rubric addresses Contributions to Diversity, Equity, and Inclusion (DEI).

More info

Overview

The HR Excellence in Research (HRS4R) Award is a recognition granted by the European Commission to research institutions committed to improving working conditions for researchers. It supports the implementation of the European Charter for Researchers and the Code of Conduct for the Recruitment of Researchers (Charter & Code), fostering transparent recruitment and professional development opportunities. The tool is voluntary but offers institutions a structured framework to enhance their HR policies in alignment with European best practices.

More info

Raising awareness on the need for reform

Building institutional momentum by engaging academic communities in discussions about the limitations of current systems and the potential benefits of reform.

Overview

The SPACE rubric is a tool developed by DORA in collaboration with Ruth Schmidt to help academic institutions at any stage of academic assessment reform measure and improve their institutional ability to support the development and implementation of new academic assessment practices and activities. It focuses on five core capabilities: Standards for Scholarship, Process Mechanics and Policies, Accountability, Culture within Institutions, and Evaluative and Iterative Feedback. Within each category, three levels of institutional “maturity” are considered (foundation, expansion and scaling), allowing each organisation to establish their baseline. The rubric can also be used to retroactively analyze how strengths or gaps in institutional conditions may have impacted the outcomes of concrete interventions targeted to specific types of academic assessment activities—such as hiring, promotion, tenure, or even graduate student evaluation—either helping or hindering progress toward those goals.

More info

Raising awareness on the need for reform

Building institutional momentum by engaging academic communities in discussions about the limitations of current systems and the potential benefits of reform.

Why is this relevant?

The Framework encourages academics to document and evidence contributions to teaching and learning—such as pedagogical innovation, student outcomes, peer mentoring, and leadership in educational practice—beyond traditional research metrics.

Why is this relevant?

This tool offers principles to evaluate good assessment. Leadership and governance can use this to evaluate and/or reform their assessment criteria.

Overview

The Agencia Nacional de Evaluación de la Calidad y Acreditación (ANECA) introduced a revised Code of Ethics on November 22, 2023, to enhance the integrity and transparency of the public servants involved in its evaluation and accreditation processes. This updated code consolidates previous ethical guidelines into a unified framework, emphasizing principles such as legality, objectivity, independence, transparency, and professional ethics. It aims to ensure that all personnel and collaborators adhere to high ethical standards, fostering trust in ANECA's operations.

More info

Administration and management

Responsible for operationalising academic assessment processes by implementing procedures and ensuring compliance with policies and regulations. Examples of roles: Deans of faculties/schools; Department chairs/heads; Academic Affairs Office; Human Resources (HR) Department

Academic staff

Responsible for presenting evidence of academic achievements and competencies for assessment, in line with institutional criteria. Examples of roles: Early-career researcher (R1); Recognised researcher (R2); Established researcher (R3); Leading researcher (R4).

Why is this relevant?

The HKPs emphasize avoiding narrow publication-based indicators (like impact factors) and instead assessing researchers on their contributions to integrity, openness, and collaboration. Recruitment and assessment boards are directly responsible for implementing these revised evaluation criteria.

Overview

This resource by DORA aims to help identify and mitigate biases in research assessment. It outlines seven personal biases that can affect hiring, promotion, and tenure decisions, as well as four institutional and infrastructural implications of these biases. It also provides strategies to develop new institutional conditions that reduce bias.

More info

Emphasising the need for clarity, consistency, and impartiality in assessment processes to build trust and accountability.

Overview

The Initiative advocates that scholarly communication should not be monolingual. It calls for balanced multilingual dissemination to ensure both the global excellence and the local relevance of research. Core principles: support researchers in sharing results beyond academia in varied languages; safeguard national-language publishing infrastructure and support the transition to Open Access; and integrate language diversity into research evaluation, funding, and metrics.

More info

Why is this relevant?

The Statement emphasizes that assessment processes must be transparent and free of bias at all stages, and that they must proactively guard against bias and discrimination. Part of this responsibility lies with assessment panels, who must design and execute selection processes that are equitable.

Overview

Developed by UMC Utrecht, this guide shifts focus from output-based evaluations to formative assessments that prioritize societal impact. It encourages evaluators to consider the purpose and process of research, asking "why are you doing this research?" instead of "what have you measurably produced?"

More info

Overview

The SPACE rubric is a tool developed by DORA in collaboration with Ruth Schmidt to help academic institutions at any stage of academic assessment reform measure and improve their institutional ability to support the development and implementation of new academic assessment practices and activities. It focuses on five core capabilities: Standards for Scholarship, Process Mechanics and Policies, Accountability, Culture within Institutions, and Evaluative and Iterative Feedback. Within each category, three levels of institutional “maturity” are considered (foundation, expansion and scaling), allowing each organisation to establish their baseline. The rubric can also be used to retroactively analyze how strengths or gaps in institutional conditions may have impacted the outcomes of concrete interventions targeted to specific types of academic assessment activities—such as hiring, promotion, tenure, or even graduate student evaluation—either helping or hindering progress toward those goals.

More info

Overview

ResearchComp is a European framework outlining the transversal skills that researchers need for successful and interoperable careers in all sectors of society (academia, industry, business, public administration, NGOs etc.). It establishes a common language and a common understanding of researchers’ transversal competences in areas such as knowledge creation, communication, leadership, and impact. The framework aims to align research careers with broader labor market needs, helping institutions, funders, and policymakers establish transparent career progression pathways.

More info

Defining new career pathways (taking into account different career stages)

Developing flexible, non-linear, and hybrid career trajectories that reflect individual strengths, institutional needs, movement across sectors and evolving academic roles at all stages of career progression.

Why is this relevant?

NOR-CAM provides a structured framework to broaden academic assessments by recognizing diverse contributions beyond research metrics, supporting transparent, institution-wide implementation of inclusive evaluation practices.

Ensuring that assessment criteria and procedures actively value and accommodate different backgrounds, career paths and life circumstances, reducing biases.

Overview

The Recommendation offers the first international framework defining open science, its values, principles and actionable pathways toward fair and equitable scientific openness. It seeks to reduce knowledge and technology gaps and bolster science as a global public good. It emphasizes inclusiveness, transparency, reproducibility, and engagement with broader society.

More info

Why is this relevant?

The user guide contains the recommendations on the responsible use of the Publication Forum classification system to assist in the evaluation of research output.

Overview

The Coalition for Advancing Research Assessment (CoARA) Agreement aims to reform research assessment practices by shifting away from quantitative metrics (e.g., journal impact factors) towards a more qualitative and holistic approach. It promotes recognition of diverse research contributions, including Open Science, societal impact, and team-based research. The agreement is voluntary and seeks commitment from research institutions, funders, and other stakeholders to implement responsible assessment practices.

More info

Overview

The Hong Kong Principles (HKPs) were formulated at the 6th World Conference on Research Integrity (June 2019) and published in PLOS Biology in July 2020. They aim to shift research assessment from quantity-based metrics to behaviours that underpin trustworthiness—rigor, transparency, and openness. The framework outlines five principles:

  • Responsible research practices
  • Transparent reporting
  • Open science
  • Valuing a diversity of research types
  • Recognizing broader scholarly contributions
Each principle is accompanied by a rationale and real-world examples of where it has been adopted.

More info

Why is this relevant?

The framework focuses heavily on evolving assessment practices to move beyond simplistic metrics (like publication counts) and encourages incorporating narrative CVs, expert judgment, and evidence of open research.

Why is this relevant?

The framework proposes evaluation based on merit, judged through both qualitative competencies and objective measures. Assessment panels must combine narrative judgments (e.g., interviews, statements) with quantitative criteria in assessment processes.

Engaging a variety of institutional actors in ACA reforms

Promoting collective ownership of assessment change processes by involving various institutional actors (eg. leadership, HR, funders, academics, and support staff) in co-design and implementation.

Avoiding the use of inappropriate metrics for assessing academics

Moving away from using journal- and publication-based metrics (e.g. JIF, AIS, h-index) in academic career assessment as a proxy for quality and impact.

Raising awareness on the need for reform

Building institutional momentum by engaging academic communities in discussions about the limitations of current systems and the potential benefits of reform.

Overview

The document builds on a 2019 study by Science Europe and Technopolis Group, examining how research organisations conduct assessment for funding and career progression. Its purpose is to strengthen existing processes—ensuring they are effective, efficient, fair, transparent, and aligned with evolving research practices such as open science and AI. The Statement issues seven core recommendations covering: Transparency; Evaluating robustness; Bias mitigation; Cost/efficiency; Reviewer diversity; Qualitative assessment; and Novel approaches.

More info

Why is this relevant?

The Guide codifies explicit criteria (structure, process, outcomes) and example indicators, enabling evaluators to apply consistent, transparent standards in impact assessment.

Overview

This resource by DORA aims to help identify and mitigate biases in research assessment. It outlines seven personal biases that can affect hiring, promotion, and tenure decisions, as well as four institutional and infrastructural implications of these biases. It also provides strategies to develop new institutional conditions that reduce bias.

More info

Overview

The Initiative advocates that scholarly communication should not be monolingual. It calls for balanced multilingual dissemination to ensure both global excellence and local relevance of research. Its core principles:

  • Support researchers in sharing results beyond academia in varied languages.
  • Safeguard national-language publishing infrastructure and support the transition to Open Access.
  • Integrate language diversity into research evaluation, funding, and metrics.

More info

Why is this relevant?

UNESCO highlights the need for capacity-building—training researchers, assessors, and institutional staff in open science principles, ethics, and infrastructure use. This empowers evaluators and candidates to respect and apply open science values in assessment.

Overview

The People and Teams UKRI Action Plan is designed to enhance the UK’s research and innovation system by fostering a positive and inclusive research culture. It focuses on supporting diverse career pathways, improving recognition and reward systems, and strengthening collaboration across disciplines and sectors.

More info

Help us improve this framework — share your feedback!

Overview

DORA's 'Ideas for Optimization' provides guidance for optimizing, evaluating, and iterating on the use of narrative CVs to ensure comprehensive recognition of diverse research contributions. It focuses on practical implementation and continuous improvement to create fairer assessment mechanisms.

More info

Recognising the importance of involving the wide academic community, at all seniority levels, in reflections, design and implementation of academic career assessment reforms.

Rethinking what is valued and measured in academic assessment, moving beyond traditional metrics to more holistic evaluation methods, profiles and professional trajectories.

Engaging a variety of institutional actors in ACA reforms

Promoting collective ownership of assessment change processes by involving various institutional actors (e.g. leadership, HR, funders, academics, and support staff) in co-design and implementation.

Why is this relevant?

The Leiden Manifesto advises measuring performance against the research missions of the institution, group, or researcher. This encourages recruitment and assessment boards to consider a broader range of domains and activities in their evaluations.

Why is this relevant?

It is recommended that administrative offices lead calibration exercises and train committee members before reviewing candidates, supporting consistent use of rubrics and shared understanding of scoring thresholds and evidence interpretation.

Overview

The Hong Kong Principles (HKPs) were formulated at the 6th World Conference on Research Integrity (June 2019) and published in PLOS Biology in July 2020. They aim to shift research assessment from quantity-based metrics to behaviours that underpin trustworthiness—rigor, transparency, and openness. The framework outlines five principles:

  • Responsible research practices
  • Transparent reporting
  • Open science
  • Valuing a diversity of research types
  • Recognizing broader scholarly contributions
Each principle is accompanied by a rationale and real-world examples of its adoption.

More info

Overview

INORMS has developed specific tools to challenge and rethink global university rankings by promoting more equitable, transparent, and context-sensitive assessments of universities. These include:

  • Rating the Rankings: An evaluative system that scores global university rankings based on principles like transparency, good governance, and relevance.
  • More Than Our Rank: A declaration-based initiative that allows institutions to express their value beyond traditional ranking metrics, encouraging a broader understanding of excellence.

More info

Overview

The article "Good Practices: Career Paths" from Recognition & Rewards Magazine outlines five innovative initiatives by Dutch academic institutions aimed at diversifying and modernizing career trajectories within academia. These practices are part of the broader Recognition & Rewards programme, which seeks to move beyond the traditional emphasis on research performance by acknowledging excellence in teaching, leadership, societal impact, and patient care.

More info

Ensuring that assessment criteria and procedures actively value and accommodate different backgrounds, career paths and life circumstances, reducing biases.

Informing candidates on assessment criteria

Importance of providing candidates with clear and accessible information about what is expected, which criteria are used and how assessments are conducted.

Informing candidates on assessment criteria

Importance of providing candidates with clear and accessible information about what is expected, which criteria are used and how assessments are conducted.

Why is this relevant?

The policy supports, facilitates and promotes the work researchers and organisations do for greater openness in science and research.

Overview

ResearchComp is a European framework outlining the transversal skills that researchers need for successful and interoperable careers in all sectors of society (academia, industry, business, public administration, NGOs, etc.). It establishes a common language and a common understanding of researchers’ transversal competences in areas such as knowledge creation, communication, leadership, and impact. The framework aims to align research careers with broader labor market needs, helping institutions, funders, and policymakers establish transparent career progression pathways.

More info

Overview

The Curriculum Vitae Abreviado (CVA) ANECA is a standardized, abbreviated curriculum vitae template developed by the Agencia Nacional de Evaluación de la Calidad y Acreditación (ANECA) for the accreditation of university teaching staff in Spain. Its primary aim is to provide a concise yet comprehensive overview of an academic's career, emphasizing significant contributions and achievements. The CVA ANECA allows researchers to highlight their most impactful work, facilitating a more qualitative assessment beyond traditional metrics.

More info

Ensuring that assessment criteria and procedures actively value and accommodate different backgrounds, career paths and life circumstances, reducing biases.

Raising awareness on the need for reform

Building institutional momentum by engaging academic communities in discussions about the limitations of current systems and the potential benefits of reform.

Why is this relevant?

Overall, the Plan concerns the development of a strategy on career pathways and career advancement for academic and technical staff. Administration and management can refer to these guidelines to assess their progress in terms of the criteria adopted in administrative and managerial decision-making.

Administration and management

Responsible for operationalising academic assessment processes by implementing procedures and ensuring compliance with policies and regulations. Examples of roles: Deans of faculties/schools; Department chairs/heads; Academic Affairs Office; Human Resources (HR) Department

Overview

The Coalition for Advancing Research Assessment (CoARA) Agreement aims to reform research assessment practices by shifting away from quantitative metrics (e.g., journal impact factors) towards a more qualitative and holistic approach. It promotes recognition of diverse research contributions, including Open Science, societal impact, and team-based research. The agreement is voluntary and seeks commitment from research institutions, funders, and other stakeholders to implement responsible assessment practices.

More info

Why is this relevant?

This resource gives tips on how to prioritize equity and transparency in research assessment processes. This can be relevant for different roles, especially for leadership and governance, as those who define the overarching principles, strategy and criteria guiding academic assessment.

Why is this relevant?

The user guide contains recommendations on the responsible use of the Publication Forum classification system to assist in the evaluation of research output.

Encouraging the need for clarity, consistency, and impartiality in assessment processes to build trust and accountability.

Overview

The Recognising and Rewarding Open Research Maturity Framework and Self-Assessment Tool was developed to support research organisations in evaluating and improving their approaches to recognising and rewarding open research practices. The tool provides a structured self-assessment framework to help institutions understand their current maturity levels and identify areas for enhancement. It promotes cultural and procedural change in research assessment, aligning with principles of transparency, inclusivity, and integrity.

More info

Overview

The article "Good Practices: Career Paths" from Recognition & Rewards Magazine outlines five innovative initiatives by Dutch academic institutions aimed at diversifying and modernizing career trajectories within academia. These practices are part of the broader Recognition & Rewards programme, which seeks to move beyond the traditional emphasis on research performance by acknowledging excellence in teaching, leadership, societal impact, and patient care.

More info

Encouraging institutions to align their assessment frameworks and practices with broader frameworks and principles at national or international levels.

Avoiding the use of inappropriate metrics for assessing academics

Moving away from using journal- and publication-based metrics (e.g. JIF, AIS, h-index) in academic career assessment as a proxy for quality and impact.

Overview

The SCOPE framework is a practical, step-by-step model developed by the International Network of Research Management Societies (INORMS) Research Evaluation Group to guide responsible research evaluation. It aims to bridge the gap between high-level principles (like DORA and the Leiden Manifesto) and their practical implementation in research assessment. SCOPE is a five-stage process:

  1. Start with what you value: Identify and articulate the core values and objectives driving the evaluation.
  2. Context considerations: Understand the specific context, including disciplinary norms and institutional goals.
  3. Options for measuring: Explore various qualitative and quantitative methods suitable for the evaluation.
  4. Probe deeply: Critically assess chosen methods for biases and unintended consequences.
  5. Evaluate your evaluation: Reflect on the evaluation process to ensure it aligns with initial values and objectives.
This framework encourages evaluations that are value-driven, context-sensitive, and methodologically sound.

More info

Overview

The YUFE4Postdocs evaluation and selection procedure is designed to foster a fair, transparent, and competency-based assessment of postdoctoral researchers through a structured CV template and the involvement of non-academic, societal stakeholders in the (final) selection, in what is traditionally an academic procedure. It aligns with the principles of responsible research assessment, emphasizing qualitative over quantitative criteria. The selection process does not rely on quantitative metrics and is fully dependent on peer review.

More info

Why is this relevant?

The rubric guide highlights transparency via calibration exercises, shared definitions, and documented evaluation criteria—ensuring fair comparisons among candidates and reducing subjective bias during committee deliberations.

Why is this relevant?

UNESCO encourages institutions to value open science practices—such as open access, FAIR data, multilingual dissemination, and citizen science—as integral scholarly contributions. Institutional leadership must formally recognise and integrate these into criteria guiding career assessment.

Raising awareness on the need for reform

Building institutional momentum by engaging academic communities in discussions about the limitations of current systems and the potential benefits of reform.

Why is this relevant?

OTM‑R requires organisations to publish clear vacancy information—including criteria, timelines, application stages, and feedback procedures. Administrative offices (HR, Academic Affairs) are responsible for ensuring transparency in these processes.

Overview

Developed by UMC Utrecht, this guide shifts focus from output-based evaluations to formative assessments that prioritize societal impact. It encourages evaluators to consider the purpose and process of research, asking "why are you doing this research?" instead of "what have you measurably produced?"

More info

Academic staff

Responsible for presenting evidence of academic achievements and competencies for assessment, in line with institutional criteria. Examples of roles: Early-career researcher (R1); Recognised researcher (R2); Established researcher (R3); Leading researcher (R4).

Avoiding the use of inappropriate metrics for assessing academics

Moving away from using journal- and publication-based metrics (e.g. JIF, AIS, h-index) in academic career assessment as a proxy for quality and impact.

Focus on expanding the assessment of academic careers to include a wider range of roles, activities, and contributions. Emphasising meriting multiple, flexible career paths that reflect diverse profiles and professional trajectories.

Overview

The portfolio aims to recognize diverse academic contributions by evaluating candidates across five domains: scientific research, teaching, clinical work (if applicable), innovation and impact, and leadership and collaboration. This approach aligns with the principles of Open Science and the Recognition & Rewards initiative, promoting a more inclusive and comprehensive assessment of academic excellence.

More info

Encouraging the need for clarity, consistency, and impartiality in assessment processes to build trust and accountability.

Overview

Developed by UMC Utrecht, this guide shifts focus from output-based evaluations to formative assessments that prioritize societal impact. It encourages evaluators to consider the purpose and process of research, asking "why are you doing this research?" instead of "what have you measurably produced?"

More info

Overview

A concise, one-page tool developed by the Declaration on Research Assessment (DORA) as part of the Tools to Advance Research Assessment (TARA) project. It aims to broaden the understanding of 'impact' in scholarly work by visualizing it across two dimensions: the scale of a contribution's influence and the diversity of audiences reached. It highlights various academic achievements, including open science practices, contributions to institutional policies (e.g., diversity, equity, and inclusion), societal impacts, and industry collaborations.

More info

Why is this relevant?

Reformscape supports the strategic abandonment of JIF/h-index as default measures in policy decisions.

Why is this relevant?

In this document we share five good practices for reforming academic career paths across a range of different universities, not only to give practical insights but also to inspire action.

Overview

The SCOPE framework is a practical, step-by-step model developed by the International Network of Research Management Societies (INORMS) Research Evaluation Group to guide responsible research evaluation. It aims to bridge the gap between high-level principles (like DORA and the Leiden Manifesto) and their practical implementation in research assessment. SCOPE is a five-stage process:

  1. Start with what you value: Identify and articulate the core values and objectives driving the evaluation.
  2. Context considerations: Understand the specific context, including disciplinary norms and institutional goals.
  3. Options for measuring: Explore various qualitative and quantitative methods suitable for the evaluation.
  4. Probe deeply: Critically assess chosen methods for biases and unintended consequences.
  5. Evaluate your evaluation: Reflect on the evaluation process to ensure it aligns with initial values and objectives.
This framework encourages evaluations that are value-driven, context-sensitive, and methodologically sound.

More info

Broadening the scope of domains and activities considered in ACA processes

Recognising and meriting varied academic activities—including research, teaching, mentoring, leadership, societal engagement, open science and collaboration—as valid and valuable in career assessments.

Rethinking what is valued and measured in academic assessment, moving beyond traditional metrics to more holistic evaluation methods, profiles and professional trajectories.

Ensuring that assessment criteria and procedures actively value and accommodate different backgrounds, career paths and life circumstances, reducing biases.

Overview

Debiasing Committee Composition and Deliberative Processes is a one-page brief developed by DORA that identifies strategies for including more perspectives and reducing biases in the evaluation processes for hiring, promotion, tenure, and funding decisions. This tool includes ideas for building trust through transparency, taking a portfolio view to decision-making, fostering a diversity of opinion that invites all viewpoints, and expanding possibilities beyond historical norms.

More info

Why is this relevant?

The report recommends engaging a wide range of institutional actors in the reform of research assessment practices. Leadership and governance bodies are key in driving these reforms and fostering broad institutional engagement.

Overview

This resource by DORA aims to help identify and mitigate biases in research assessment. It outlines seven personal biases that can affect hiring, promotion, and tenure decisions, as well as four institutional and infrastructural implications of these biases. It also provides strategies to develop new institutional conditions that reduce bias.

More info

Overview

The OTM-R “Package”—developed by the ERA Steering Group on HR and Mobility—offers a holistic set of tools to help institutions adopt recruitment practices that are open, transparent, and merit-based. Its key components are:

  • Rationale & Guidelines: Explaining why OTM-R benefits researchers, institutions and the ERA.
  • Self-Assessment Checklist: About 20 questions spanning policy, advertising, selection, and appointment.
  • Toolkit: A step-by-step guide from advertising to appointment, including good-practice examples.

More info

Overview

In an interview featured in Recognition & Rewards Magazine, Rianne Letschert (President of Maastricht University) and Victor Bekkers (Dean of the Erasmus School of Social and Behavioural Sciences) discuss the imperative for diversifying academic career paths within Dutch universities. They emphasize that the traditional model, which expects academics to excel simultaneously in research, teaching, leadership, and societal impact, is unsustainable and often leads to burnout and attrition. This model disproportionately rewards research output, neglecting other vital contributions such as teaching excellence, leadership, and societal engagement.

More info

Why is this relevant?

Overall, the Plan concerns the development of a strategy on career pathways and career advancement for academic and technical staff. Leaders and governors can refer to this plan to align ACA-related processes with national-level initiatives in the UK (Researcher Development Concordat, Technician Commitment).

Overview

A concise, one-page tool developed by the Declaration on Research Assessment (DORA) as part of the Tools to Advance Research Assessment (TARA) project. It aims to broaden the understanding of 'impact' in scholarly work by visualizing it across two dimensions: the scale of a contribution's influence and the diversity of audiences reached. It highlights various academic achievements, including open science practices, contributions to institutional policies (e.g., diversity, equity, and inclusion), societal impacts, and industry collaborations.

More info

Recruitment and assessment boards

Responsible for conducting academic evaluations by reviewing applications and assessing qualifications. Responsible for making recommendations or decisions for appointment or promotion. Examples of roles: Faculty recruitment committees; Promotion and tenure committees; Ethics & Diversity committees; External peer-reviewers.

Overview

A concise, one-page tool developed by the Declaration on Research Assessment (DORA) as part of the Tools to Advance Research Assessment (TARA) project. It aims to broaden the understanding of 'impact' in scholarly work by visualizing it across two dimensions: the scale of a contribution's influence and the diversity of audiences reached. It highlights various academic achievements, including open science practices, contributions to institutional policies (e.g., diversity, equity, and inclusion), societal impacts, and industry collaborations.

More info

Ensuring transparency and fairness in ACA processes

Clearly defined, openly communicated procedures and criteria that ensure equitable treatment of candidates.

Overview

Rethinking Research Assessment: Ideas for Action is a resource developed by DORA that outlines five common myths about research evaluation, which frequently relies on indicators like the Journal Impact Factor (JIF) and similar measures as proxies for quality in research, promotion, and tenure decisions. To counter these myths, the resource offers five design principles to help research-intensive institutions experiment with and develop better research assessment practices:

  1. Instill standards and structure into research assessment processes.
  2. Foster a sense of personal accountability among faculty and staff.
  3. Prioritize equity and transparency in research assessment.
  4. Take a big-picture or portfolio view of researcher contributions.
  5. Refine research assessment processes through iterative feedback.

More info

Ensuring transparency and fairness in ACA processes

Clearly defined, openly communicated procedures and criteria that ensure equitable treatment of candidates.

Overview

The Norwegian Career Assessment Matrix (NOR-CAM) is a framework developed by Universities Norway (UHR) to enhance the recognition and rewards system in academic careers. It aims to provide a more flexible and holistic approach to evaluating academic performance by emphasizing transparency, breadth, and systematic comprehensive assessments, supplementing the responsible use of quantitative metrics of scientific publications. NOR-CAM encourages assessment across six competency areas—1) research output, 2) research process, 3) pedagogical competencies, 4) societal impact, 5) academic leadership and 6) other competencies—as well as aligning evaluations with Open Science principles.

More info

Why is this relevant?

The document outlines five common myths about evaluation and provides five design principles to help institutions experiment with and develop better research assessment practices. This can be very useful for leadership and governance to understand the need for reform, but also for people in other roles within research performing institutions.

Why is this relevant?

In this double interview, two leaders - from Maastricht University and Erasmus University Rotterdam - speak about the need for academic career assessment reform, and the practical steps that their organisations have taken to make reform a reality.

Recognising the importance of involving the wide academic community, at all seniority levels, in reflections, design and implementation of academic career assessment reforms.

Overview

The Publication Forum (JUFO) classification, developed by the Finnish scientific community, evaluates the average quality of academic publication channels—such as journals, book publishers, and conferences—by assigning them to levels 0 to 3, with level 3 indicating the highest quality. Its primary purpose is to support the assessment of research output quality at the institutional level, particularly within Finland's university funding model. The classification emphasizes responsible research evaluation practices, aligning with international standards like DORA, the Leiden Manifesto, and the Metric Tide report.

More info

Academic staff

Responsible for presenting evidence of academic achievements and competencies for assessment, in line with institutional criteria. Examples of roles: Early-career researcher (R1); Recognised researcher (R2); Established researcher (R3); Leading researcher (R4).

Why is this relevant?

The Dialogue Toolkit offers a rich selection of conversational structures and techniques to help start difficult conversations on career assessment reform. It is designed to be used by anyone, but will perhaps be of most use to HR managers or policy staff interested in kick-starting a reform discussion at their institution.

Overview

Debiasing Committee Composition and Deliberative Processes is a one-page brief developed by DORA that identifies strategies for including more perspectives and reducing biases in the evaluation processes for hiring, promotion, tenure, and funding decisions. This tool includes ideas for building trust through transparency, taking a portfolio view to decision-making, fostering a diversity of opinion that invites all viewpoints, and expanding possibilities beyond historical norms.

More info

Overview

Balanced, Broad, Responsible: A Practical Guide for Research Evaluators is a resource that aims to promote a holistic approach to the evaluation of funding proposals by moving beyond traditional quantitative metrics. The tool comprises a concise video and an accompanying one-page brief, offering six practical suggestions (“checklist”) for research funders who are seeking to implement or improve responsible assessment of funding applications.

More info

Why is this relevant?

The report advises against the use of inappropriate metrics, such as journal impact factors, for assessing academic performance. Assessment boards are tasked with ensuring that only appropriate and effective metrics are used in evaluations.

Why is this relevant?

By meriting a broader set of activities, NOR-CAM facilitates flexible, non-linear, and hybrid career trajectories that reflect individual strengths, institutional needs, movement across sectors and evolving academic roles at all stages of career progression.

Why is this relevant?

Institutional leaders can set policies that encourage qualitative assessment methods (e.g., narratives on research integrity) alongside responsible metrics. The HKP promote moving away from over-reliance on citation metrics, which requires leadership-level endorsement.

Overview

The Finnish Guidelines on Research Integrity (RI Guidelines), issued by the Finnish National Board on Research Integrity (TENK) in 2023, aim to promote good research practices and a responsible research culture across all disciplines. They provide a framework for self-regulation within the research community, outlining procedures for handling alleged violations of research integrity. Key updates include shortened investigation timelines, the introduction of Research Integrity Advisers, alignment with international classification of violations, and the inclusion of severity assessments for violations.

More info

Why is this relevant?

The Declaration explicitly promotes multilingualism, and commits to socially relevant, inclusive research aligned with open science. Leadership is critical to institutionalizing inclusive evaluation frameworks that respect linguistic and disciplinary diversity.

Overview

The Manifesto seeks to rectify the misuse of research metrics by promoting evaluations that are transparent, diverse, and reflective of the multifaceted nature of research. Its ten principles emphasize the importance of combining quantitative data with qualitative assessments, tailoring evaluations to specific research missions, and recognizing the value of locally relevant research. The principles also call for openness in data collection and analysis, allowing those evaluated to verify data, accounting for disciplinary differences, and regularly reviewing indicators to mitigate systemic effects.

More info

Balancing the use of qualitative and quantitative indicators

Promoting a more holistic assessment approach, reflecting the breadth and depth of academic contributions, that integrates a qualitative dimension (e.g. expert judgment, narratives) with metrics used appropriately and responsibly.

Overview

The Manifesto seeks to rectify the misuse of research metrics by promoting evaluations that are transparent, diverse, and reflective of the multifaceted nature of research. Its ten principles emphasize the importance of combining quantitative data with qualitative assessments, tailoring evaluations to specific research missions, and recognizing the value of locally relevant research. The principles also call for openness in data collection and analysis, allowing those evaluated to verify data, accounting for disciplinary differences, and regularly reviewing indicators to mitigate systemic effects.

More info

Encouraging institutions to align their assessment frameworks and practices with broader frameworks and principles at national or international levels.

Overview

The HR Excellence in Research (HRS4R) Award is a recognition granted by the European Commission to research institutions committed to improving working conditions for researchers. It supports the implementation of the European Charter for Researchers and the Code of Conduct for the Recruitment of Researchers (Charter & Code), fostering transparent recruitment and professional development opportunities. The tool is voluntary but offers institutions a structured framework to enhance their HR policies in alignment with European best practices.

More info

Why is this relevant?

The document outlines five common myths about evaluation and provides five design principles to help institutions experiment with and develop better research assessment practices. This can be very useful for leadership and governance to understand the need for reform, but also for people in other roles within research performing institutions.

Overview

The Career Framework for University Teaching offers a structured, evidence-based pathway to recognise and reward teaching contributions alongside research. Developed between 2015 and 2018 by the Royal Academy of Engineering and global partners, it outlines four progressive levels:

  1. Effective teacher
  2. Skilled & collegial teacher
  3. Institutional leader or Scholarly teacher (dual paths)
  4. National & global leader
For each level, it defines (a) sphere of impact, (b) promotion criteria, and (c) forms of evidence to demonstrate achievement.

More info

Broadening the scope of domains and activities considered in ACA processes

Recognising and meriting varied academic activities—including research, teaching, mentoring, leadership, societal engagement, open science and collaboration—as valid and valuable in career assessments.

Overview

The Manifesto seeks to rectify the misuse of research metrics by promoting evaluations that are transparent, diverse, and reflective of the multifaceted nature of research. Its ten principles emphasize the importance of combining quantitative data with qualitative assessments, tailoring evaluations to specific research missions, and recognizing the value of locally relevant research. The principles also call for openness in data collection and analysis, allowing those evaluated to verify data, accounting for disciplinary differences, and regularly reviewing indicators to mitigate systemic effects.

More info

Why is this relevant?

The HKPs advocate for transparent, fair, and integrity-focused assessment processes. Assessment boards are key actors in ensuring fairness and openness when evaluating candidates against these principles.

Overview

The Hong Kong Principles (HKPs) were formulated at the 6th World Conference on Research Integrity (June 2019) and published in PLOS Biology in July 2020. They aim to shift research assessment from quantity-based metrics to behaviours that underpin trustworthiness—rigor, transparency, and openness. The framework outlines five principles:

  • Responsible research practices
  • Transparent reporting
  • Open science
  • Valuing a diversity of research types
  • Recognizing broader scholarly contributions
Each principle is accompanied by a rationale and real-world examples where they’ve been adopted.

More info

What is your role?

Leadership and governance

Administration and management

Recruitment and assessment boards

Academic staff (as candidates)

Overview

The portfolio aims to recognize diverse academic contributions by evaluating candidates across five domains: scientific research, teaching, clinical work (if applicable), innovation and impact, and leadership and collaboration. This approach aligns with the principles of Open Science and the Recognition & Rewards initiative, promoting a more inclusive and comprehensive assessment of academic excellence.

More info

Why is this relevant?

The document breaks common myths and gives examples on how to promote equity and transparency of research assessment processes.

Leadership and governance

Responsible for defining the overarching principles, strategy and criteria guiding academic assessment. Endorsing final decisions and overseeing assessment outcomes. Examples of roles: Rector / President / Director; Vice-Rector / Vice-President; Deans of faculties/schools; Department chairs/heads

Why is this relevant?

The document advocates for assessments that prioritize substantive expert review—valuing narrative and qualitative evidence alongside quantitative indicators—to more accurately capture research quality. Assessment panels must therefore combine both forms of evidence in decision-making.

Focus on expanding the assessment of academic careers to include a wider range of roles, activities, and contributions. Emphasising meriting multiple, flexible career paths that reflect diverse profiles and professional trajectories.

Why is this relevant?

The document outlines five common myths about evaluation and provides five design principles to help institutions experiment with and develop better research assessment practices. This can be very useful for leadership and governance to understand the need for reform, but also for people in other roles within research-performing institutions.

Overview

The HR Excellence in Research (HRS4R) Award is a recognition granted by the European Commission to research institutions committed to improving working conditions for researchers. It supports the implementation of the European Charter for Researchers and the Code of Conduct for the Recruitment of Researchers (Charter & Code), fostering transparent recruitment and professional development opportunities. The tool is voluntary but offers institutions a structured framework to enhance their HR policies in alignment with European best practices.

More info

Why is this relevant?

A one-page tool that aims to broaden the understanding of 'impact' in scholarly work by visualizing it across two dimensions: the scale of a contribution's influence and the diversity of audiences reached. It highlights various academic achievements, including open science practices, contributions to institutional policies (e.g., diversity, equity, and inclusion), societal impacts, and industry collaborations.

Engaging a variety of institutional actors in ACA reforms

Promoting collective ownership of assessment change processes by involving various institutional actors (e.g. leadership, HR, funders, academics, and support staff) in co-design and implementation.

Defining new career pathways (taking into account different career stages)

Developing flexible, non-linear, and hybrid career trajectories that reflect individual strengths, institutional needs, movement across sectors and evolving academic roles at all stages of career progression.

Developing new indicators (qualitative and/or quantitative)

Creating and using innovative indicators that capture diverse contributions.

Overview

The portfolio aims to recognize diverse academic contributions by evaluating candidates across five domains: scientific research, teaching, clinical work (if applicable), innovation and impact, and leadership and collaboration. This approach aligns with the principles of Open Science and the Recognition & Rewards initiative, promoting a more inclusive and comprehensive assessment of academic excellence.

More info

Overview

The People and Teams UKRI Action Plan is designed to enhance the UK’s research and innovation system by fostering a positive and inclusive research culture. It focuses on supporting diverse career pathways, improving recognition and reward systems, and strengthening collaboration across disciplines and sectors.

More info

Recruitment and assessment boards

Responsible for conducting the academic evaluations, by reviewing applications, and assessing qualifications. Responsible for making recommendations or decisions for appointment or promotion. Examples of roles: Faculty recruitment committees; Promotion and tenure committees; Ethics & Diversity committees; External peer-reviewers.

Why is this relevant?

The Declaration calls for a transformation away from impact-factor centric assessments. Institutional leaders play a central role in approving policies that explicitly ban JIF‑driven systems and embed socially relevant, qualitative criteria into recruitment and promotion rules.

Why is this relevant?

The one-pager helps institutions implement narrative CVs in line with DORA principles and align them with institutional objectives.

Overview

The Recommendation offers the first international framework defining open science, its values, principles and actionable pathways toward fair and equitable scientific openness. It seeks to reduce knowledge and technology gaps and bolster science as a global public good. It emphasizes inclusiveness, transparency, reproducibility, and engagement with broader society.

More info

Why is this relevant?

The Dialogue Toolkit offers a rich selection of conversational structures and techniques to help start difficult conversations on career assessment reform. It is designed to be used by anyone, but will perhaps be of most use to HR managers or policy staff interested in kickstarting a reform discussion at their institution.

Overview

The Coalition for Advancing Research Assessment (CoARA) Agreement aims to reform research assessment practices by shifting away from quantitative metrics (e.g., journal impact factors) towards a more qualitative and holistic approach. It promotes recognition of diverse research contributions, including Open Science, societal impact, and team-based research. The agreement is voluntary and seeks commitment from research institutions, funders, and other stakeholders to implement responsible assessment practices.

More info

Avoiding the use of inappropriate metrics for assessing academics

Moving away from using journal- and publication-based metrics (e.g. JIF, AIS, h-index) in academic career assessment as a proxy for quality and impact.

Why is this relevant?

The rubric explicitly embeds criteria and guidance around inclusive teaching, mentoring, and service contributions, with administrative units overseeing equity advisor roles, DEIB statements, and compliance with inclusive evaluation policies.

Overview

The Recognising and Rewarding Open Research Maturity Framework and Self-Assessment Tool was developed to support research organisations in evaluating and improving their approaches to recognising and rewarding open research practices. The tool provides a structured self-assessment framework to help institutions understand their current maturity levels and identify areas for enhancement. It promotes cultural and procedural change in research assessment, aligning with principles of transparency, inclusivity, and integrity.

More info

Developing new indicators (qualitative and/or quantitative)

Creating and using innovative indicators that capture diverse contributions.

Providing training to assessors and assessees

Covers initiatives for ongoing capacity-building for evaluators and candidates to enhance understanding and application of new assessment principles, criteria and practices.

Overview

The OTM-R “Package”—developed by the ERA Steering Group on HR and Mobility—offers a holistic set of tools to help institutions adopt recruitment practices that are open, transparent, and merit-based. Its key components are:

  • Rationale & Guidelines: explaining why OTM-R benefits researchers, institutions and the ERA.
  • Self-Assessment Checklist: about 20 questions spanning policy, advertising, selection, and appointment.
  • Toolkit: a step-by-step guide from advertising to appointment, including good-practice examples.

More info

Why is this relevant?

The Metric Tide emphasizes the importance of combining qualitative expert judgment with quantitative metrics in research assessment.

Overview

The OTM-R “Package”—developed by the ERA Steering Group on HR and Mobility—offers a holistic set of tools to help institutions adopt recruitment practices that are open, transparent, and merit-based. Its key components are:

  • Rationale & Guidelines: explaining why OTM-R benefits researchers, institutions and the ERA.
  • Self-Assessment Checklist: about 20 questions spanning policy, advertising, selection, and appointment.
  • Toolkit: a step-by-step guide from advertising to appointment, including good-practice examples.

More info

Why is this relevant?

NOR-CAM highlights the need to develop new indicators to document results and competencies from academic activities beyond publications. To systematically assess merits within e.g. the research process, teaching, innovation and leadership, we need ways of documenting them that can be acknowledged across fields and national boundaries.

Why is this relevant?

The ANECA narrative CV template provides a computational platform to standardize the Curriculum Vitae information of candidates. This platform is fully aligned with ANECA's evaluation and accreditation procedures.

Overview

The People and Teams UKRI Action Plan is designed to enhance the UK’s research and innovation system by fostering a positive and inclusive research culture. It focuses on supporting diverse career pathways, improving recognition and reward systems, and strengthening collaboration across disciplines and sectors.

More info

Leadership and governance

Responsible for defining the overarching principles, strategy and criteria guiding academic assessment. Endorsing final decisions and overseeing assessment outcomes. Examples of roles: Rector / President / Director; Vice-Rector / Vice-President; Deans of faculties/schools; Department chairs/heads

Overview

Debiasing Committee Composition and Deliberative Processes is a one-page brief developed by DORA that identifies strategies for including more perspectives and reducing biases in the evaluation processes for hiring, promotion, tenure, and funding decisions. This tool includes ideas for building trust through transparency, taking a portfolio view to decision-making, fostering a diversity of opinion that invites all viewpoints, and expanding possibilities beyond historical norms.

More info

Avoiding the use of inappropriate metrics for assessing academics

Moving away from using journal- and publication-based metrics (e.g. JIF, AIS, h-index) in academic career assessment as a proxy for quality and impact.

Overview

UC Berkeley’s rubric aims to standardize and increase transparency, equity, and consistency in faculty searches. It provides a structured template where committees define categories (e.g. Research, Teaching, DEI), assign explicit scoring scales (typically 1–5), and clarify the evidence that informs how scores are assigned. A core element is the calibration exercise, where committee members pre-score sample applications, compare outcomes, discuss discrepancies, and recalibrate scoring norms before evaluating the full candidate pool. The framework includes sample categories:

  • Research: productivity, alignment, plans.
  • Teaching & Mentoring: experience, interests, mentoring approach.
A separate rubric addresses Contributions to Diversity, Equity, and Inclusion (DEI).

More info

Why is this relevant?

Administrative units (e.g. HR, Academic Affairs) are responsible for designing job profiles and promotion frameworks. The framework promotes formal recognition of open science, collaboration, data sharing, and citizen science—administrators must embed these into job descriptions and progression routes.

Overview

The 2023 survey by the Federation of Finnish Learned Societies (TSV) explores researchers’ perspectives on the diversity of career assessment criteria in Finland, aiming to inform the development of the Finnish Career Assessment Matrix (FIN-CAM) tool. This initiative aligns with the broader responsible research assessment agenda, including the CoARA Agreement and builds upon frameworks such as OS-CAM and NOR-CAM.

More info

Informing candidates on assessment criteria

Importance of providing candidates with clear and accessible information about what is expected, which criteria are used and how assessments are conducted.

Overview

Developed by UMC Utrecht, this guide shifts focus from output-based evaluations to formative assessments that prioritize societal impact. It encourages evaluators to consider the purpose and process of research, asking "why are you doing this research?" instead of "what have you measurably produced?"

More info

Overview

The Recognising and Rewarding Open Research Maturity Framework and Self-Assessment Tool was developed to support research organisations in evaluating and improving their approaches to recognising and rewarding open research practices. The tool provides a structured self-assessment framework to help institutions understand their current maturity levels and identify areas for enhancement. It promotes cultural and procedural change in research assessment, aligning with principles of transparency, inclusivity, and integrity.

More info

Why is this relevant?

The Leiden Manifesto calls for open, transparent, and simple data collection and analytical processes. Recruitment and assessment boards are responsible for implementing these principles to ensure that evaluation processes are transparent and fair.

Why is this relevant?

The document highlights how organisations should aim for a representative gender balance at all levels of staff, including at supervisory and managerial level.

Overview

The Norwegian Career Assessment Matrix (NOR-CAM) is a framework developed by Universities Norway (UHR) to enhance the recognition and rewards system in academic careers. It aims to provide a more flexible and holistic approach to evaluating academic performance by emphasizing transparency, breadth, and systematic comprehensive assessments, supplementing the responsible use of quantitative metrics of scientific publications. NOR-CAM encourages assessment across six competency areas: 1) research output, 2) research process, 3) pedagogical competencies, 4) societal impact, 5) academic leadership and 6) other competencies, as well as aligning evaluations with Open Science principles.

More info

Overview

The Initiative advocates that scholarly communication should not be monolingual. It calls for balanced multilingual dissemination to ensure both global excellence and local relevance of research. Core principles:

  • Support researchers in sharing results beyond academia in varied languages.
  • Safeguard national-language publishing infrastructure and support the transition to Open Access.
  • Integrate language diversity into research evaluation, funding, and metrics.

More info

Overview

The HR Excellence in Research (HRS4R) Award is a recognition granted by the European Commission to research institutions committed to improving working conditions for researchers. It supports the implementation of the European Charter for Researchers and the Code of Conduct for the Recruitment of Researchers (Charter & Code), fostering transparent recruitment and professional development opportunities. The tool is voluntary but offers institutions a structured framework to enhance their HR policies in alignment with European best practices.

More info

Why is this relevant?

The aim of the template is to provide guidelines for the writer of a CV so that the individual’s merits are presented as comprehensively, truthfully and comparably as possible.

Defining new career pathways (taking into account different career stages)

Developing flexible, non-linear, and hybrid career trajectories that reflect individual strengths, institutional needs, movement across sectors and evolving academic roles at all stages of career progression.

Why is this relevant?

The document breaks common myths and gives examples on how to promote equity and transparency of research assessment processes.

Balancing the use of qualitative and quantitative indicators

Promoting a more holistic assessment approach, reflecting the breadth and depth of academic contributions, that integrates a qualitative dimension (e.g. expert judgment, narratives) with metrics used appropriately and responsibly.

Engaging a variety of institutional actors in ACA reforms

Promoting collective ownership of assessment change processes by involving various institutional actors (e.g. leadership, HR, funders, academics, and support staff) in co-design and implementation.

Balancing the use of qualitative and quantitative indicators

Promoting a more holistic assessment approach, reflecting the breadth and depth of academic contributions, that integrates a qualitative dimension (e.g. expert judgment, narratives) with metrics used appropriately and responsibly.

Overview

Rethinking Research Assessment: Ideas for Action is a resource developed by DORA that outlines five common myths about research evaluation, which frequently relies on indicators like Journal Impact Factor (JIF) and similar measures as proxies for quality in research, promotion, and tenure decisions. To counter these myths, the resource offers five design principles to help research-intensive institutions experiment with and develop better research assessment practices:

  1. Instill standards and structure into research assessment processes.
  2. Foster a sense of personal accountability among faculty and staff.
  3. Prioritize equity and transparency in research assessment.
  4. Take a big-picture or portfolio view toward researcher contributions.
  5. Refine research assessment processes through iterative feedback.

More info

Why is this relevant?

As a framework, it provides an overview of transversal skills that could be considered in ACA processes, along with their categorisation and proficiency levels.

Why is this relevant?

NOR-CAM provides a guide to a comprehensive assessment of results and competencies. Bibliometric indicators should be used with caution and supplemented with other information when making assessments related to appointments, promotions or the allocation of resources. This allows a stronger emphasis on quality, content, academic integrity, creativity and contributions to research and/or society, as well as recognition of the specific profile of the individual academic.

Why is this relevant?

The ANECA accreditation procedure explains the merits and competencies required to accredit an academic as a Lecturer or University Professor in Spain, according to criteria related to research and transfer activities, teaching, leadership and professional activities.

Why is this relevant?

The framework places strong emphasis on inclusive co-design—engaging diverse stakeholders (academics, HR, funders, leadership). Governance bodies are key in initiating and overseeing such institution-wide engagement and in setting participatory processes for policy change.

Overview

The Manifesto seeks to rectify the misuse of research metrics by promoting evaluations that are transparent, diverse, and reflective of the multifaceted nature of research. Its ten principles emphasize the importance of combining quantitative data with qualitative assessments, tailoring evaluations to specific research missions, and recognizing the value of locally relevant research. The principles also call for openness in data collection and analysis, allowing those evaluated to verify data, accounting for disciplinary differences, and regularly reviewing indicators to mitigate systemic effects.

More info

Why is this relevant?

NOR-CAM provides recommendations for assessing academic quality and excellence through a better balance between quantitative and qualitative information. Bibliometric indicators should be used with caution and supplemented with other information.

Why is this relevant?

This practical guide for evaluators includes 6 steps for fostering a more holistic evaluation process.

Why is this relevant?

The document encourages broader and more diverse reviewer pools, as well as mutual learning and cross-organizational collaboration. Responsibility for shaping these inclusive structures and fostering partnerships lies with institutional leaders and senior governance bodies.

Why is this relevant?

The SPACE rubric can help organisations to support the implementation of fair and responsible academic career assessment practices in two ways: a) to establish a baseline for institutional “maturity” or b) to retroactively analyze how institutional conditions may have impacted the outcomes of a specific intervention (e.g. hiring, promotion). It is a comprehensive tool that contains materials and recommendations on how to be used by several actors.

Overview

Developed by UMC Utrecht, this guide shifts focus from output-based evaluations to formative assessments that prioritize societal impact. It encourages evaluators to consider the purpose and process of research, asking "why are you doing this research?" instead of "what have you measurably produced?"

More info

Overview

The OTM-R “Package”—developed by the ERA Steering Group on HR and Mobility—offers a holistic set of tools to help institutions adopt recruitment practices that are open, transparent, and merit-based. Its key components are:

  • Rationale & Guidelines: explaining why OTM-R benefits researchers, institutions and the ERA.
  • Self-Assessment Checklist: about 20 questions spanning policy, advertising, selection, and appointment.
  • Toolkit: a step-by-step guide from advertising to appointment, including good-practice examples.

More info

Recruitment and assessment boards

Responsible for conducting the academic evaluations, by reviewing applications, and assessing qualifications. Responsible for making recommendations or decisions for appointment or promotion. Examples of roles: Faculty recruitment committees; Promotion and tenure committees; Ethics & Diversity committees; External peer-reviewers.

Academic staff

Responsible for presenting evidence of academic achievements and competencies for assessment, in line with institutional criteria. Examples of roles: Early-career researcher (R1); Recognised researcher (R2); Established researcher (R3); Leading researcher (R4).

Balancing the use of qualitative and quantitative indicators

Promoting a more holistic assessment approach, reflecting the breadth and depth of academic contributions, that integrates a qualitative dimension (e.g. expert judgment, narratives) with metrics used appropriately and responsibly.

Why is this relevant?

The Statement recommends that organisations offer updated guidelines and training in order to ensure consistent understanding and application of assessment standards. Such capacity building falls under the remit of administrative teams (HR and Academic Affairs) to design, deliver, and maintain these training programmes.

Overview

INORMS has developed specific tools to challenge and rethink global university rankings by promoting more equitable, transparent, and context-sensitive assessments of universities. These include:

  • Rating the Rankings: an evaluative system that scores global university rankings based on principles like transparency, good governance, and relevance.
  • More Than Our Rank: a declaration-based initiative that allows institutions to express their value beyond traditional ranking metrics, encouraging a broader understanding of excellence.

More info

Why is this relevant?

This initiative effectively opens traditional academic procedures to non-academic assessors by providing specialized training and a clear, structured process, thereby fostering mutual understanding between conventional and non-traditional evaluators.

Overview

The portfolio aims to recognize diverse academic contributions by evaluating candidates across five domains: scientific research, teaching, clinical work (if applicable), innovation and impact, and leadership and collaboration. This approach aligns with the principles of Open Science and the Recognition & Rewards initiative, promoting a more inclusive and comprehensive assessment of academic excellence.

More info

Why is this relevant?

The Framework supports distinct teaching-focused academic career tracks—parallel to research careers—and allows staff to progress to professorship on the basis of teaching excellence. Institutional leadership is responsible for creating and endorsing these alternative career pathways.

Why is this relevant?

Reformscape provides examples for combining qualitative and quantitative indicators responsibly.

Overview

The Recognising and Rewarding Open Research Maturity Framework and Self-Assessment Tool was developed to support research organisations in evaluating and improving their approaches to recognising and rewarding open research practices. The tool provides a structured self-assessment framework to help institutions understand their current maturity levels and identify areas for enhancement. It promotes cultural and procedural change in research assessment, aligning with principles of transparency, inclusivity, and integrity.

More info

Overview

The Researcher’s Curriculum Vitae (CV) Template, developed by the Finnish National Board on Research Integrity (TENK) in collaboration with Universities Finland UNIFI, the Rectors’ Conference of Finnish Universities of Applied Sciences Arene, and the Academy of Finland, aims to standardize the presentation of researchers' qualifications and achievements. Its primary purpose is to ensure that CVs comprehensively, truthfully, and comparably reflect an individual's studies, professional career, academic merits, and other accomplishments. This standardization promotes responsible conduct of research by providing clear guidelines on how to document and present one's credentials.

More info

Why is this relevant?

OTM‑R emphasizes that recruitment must be open and transparent, with clear criteria and equal treatment of candidates. Selection committees are responsible for implementing these fair and uniform assessment procedures across all applicants.

Ensuring transparency and fairness in ACA processes

Clearly defined, openly communicated procedures and criteria that ensure equitable treatment of candidates.

Administration and management

Responsible for operationalising academic assessment processes by implementing procedures and ensuring compliance with policies and regulations. Examples of roles: Deans of faculties/schools; Department chairs/heads; Academic Affairs Office; Human Resources (HR) Department

Why is this relevant?

All researchers engaged in a research career should be recognised as professionals and be treated accordingly. This should commence at the beginning of their careers, namely at postgraduate level, and should include all levels, regardless of their classification at national level. Employers should establish recruitment procedures which are open, efficient, transparent, supportive and internationally comparable, as well as tailored to the type of positions advertised.

Why is this relevant?

Endorsing the Helsinki Initiative shows institutional alignment with an international movement promoting multilingualism; leadership can formally adopt and integrate its principles into institutional policy and evaluation frameworks.

Disclaimers:

The tools and resources summaries were initially produced using AI (ChatGPT) and subsequently reviewed and validated by WG ACA members and, where possible, the tools' authors. This process is still ongoing; therefore, the content of the summaries is still subject to change. The practical steps and suggestions for implementation included in the summaries were generated using AI (ChatGPT) and are intended as a general guide only. The suggested actions are not prescriptive and should be adapted to the specific institutional, cultural, and operational context. They are meant to offer inspiration and highlight key considerations for those looking to implement a tool.

Overview

The YUFE4Postdocs evaluation and selection procedure is designed to foster a fair, transparent, and competency-based assessment of postdoctoral researchers through a structured CV template and the involvement of non-academic, societal stakeholders in the (final) selection, in what is traditionally an academic procedure. It aligns with the principles of responsible research assessment, emphasizing qualitative over quantitative criteria. The selection process does not rely on quantitative metrics and is fully dependent on peer review.

More info

Why is this relevant?

Employers should take the whole range of candidates' experiences into account in selection processes. They should recognise it as wholly legitimate, and indeed desirable, that researchers be represented in the relevant information, consultation and decision-making bodies of the institutions for which they work.

Overview

The Curriculum Vitae Abreviado (CVA) ANECA is a standardized, abbreviated curriculum vitae template developed by the Agencia Nacional de EvaluaciĂłn de la Calidad y AcreditaciĂłn (ANECA) for the accreditation of university teaching staff in Spain. Its primary aim is to provide a concise yet comprehensive overview of an academic's career, emphasizing significant contributions and achievements. The CVA ANECA allows researchers to highlight their most impactful work, facilitating a more qualitative assessment beyond traditional metrics.

More info

Why is this relevant?

Academic staff can use this tool to become aware of new criteria and initiatives relating to their career progression and to funding decision-making.

Overview

The Norwegian Career Assessment Matrix (NOR-CAM) is a framework developed by Universities Norway (UHR) to enhance the recognition and rewards system in academic careers. It aims to provide a more flexible and holistic approach to evaluating academic performance by emphasizing transparency, breadth, and systematic comprehensive assessments, supplementing the responsible use of quantitative metrics of scientific publications. NOR-CAM encourages assessment across six competency areas: 1) research output, 2) research process, 3) pedagogical competencies, 4) societal impact, 5) academic leadership and 6) other competencies, as well as aligning evaluations with Open Science principles.

More info

Why is this relevant?

The document debunks common myths and gives examples of how to promote equity and transparency in research assessment processes.

Providing training to assessors and assessees

Covers initiatives for ongoing capacity-building for evaluators and candidates to enhance understanding and application of new assessment principles, criteria and practices.

Why is this relevant?

Candidates should be informed, prior to the selection, about the recruitment process and the selection criteria, the number of available positions and the career development prospects. They should also be informed after the selection process about the strengths and weaknesses of their applications.

Overview

The YUFE4Postdocs evaluation and selection procedure is designed to foster a fair, transparent, and competency-based assessment of postdoctoral researchers through a structured CV template and the involvement of non-academic, societal stakeholders in the (final) selection, in what is traditionally an academic procedure. It aligns with the principles of responsible research assessment, emphasizing qualitative over quantitative criteria. The selection process does not rely on quantitative metrics and relies fully on peer review.

More info

Overview

Reformscape is an online resource developed by the San Francisco Declaration on Research Assessment (DORA) to support the global academic community in implementing responsible research assessment practices. It offers a comprehensive, searchable collection of policies, action plans, and standards from over 200 academic institutions worldwide, focusing on fairer and more robust criteria for hiring, promotion, and tenure decisions. The tool enables users to explore diverse approaches to academic career assessment, facilitating the adoption of practices that move beyond traditional metrics and foster inclusivity and equity.

More info

Avoiding the use of inappropriate metrics for assessing academics

Moving away from using journal- and publication-based metrics (e.g. JIF, AIS, h-index) in academic career assessment as a proxy for quality and impact.

Overview

Reformscape is an online resource developed by the San Francisco Declaration on Research Assessment (DORA) to support the global academic community in implementing responsible research assessment practices. It offers a comprehensive, searchable collection of policies, action plans, and standards from over 200 academic institutions worldwide, focusing on fairer and more robust criteria for hiring, promotion, and tenure decisions. The tool enables users to explore diverse approaches to academic career assessment, facilitating the adoption of practices that move beyond traditional metrics and foster inclusivity and equity.

More info

Raising awareness on the need for reform

Building institutional momentum by engaging academic communities in discussions about the limitations of current systems and the potential benefits of reform.

Why is this relevant?

The OR4 framework encourages institutions to invest in structured training programs that build capacity around responsible assessment. Senior leadership is accountable for mandating and resourcing this training to ensure consistent, fair, and informed evaluations.

Overview

The CLACSO-FOLEC (Latin American Council of Social Sciences – Latin American Forum for Research Evaluation) initiative proposes a research assessment model that prioritizes social relevance, diversity, and responsible evaluation over traditional bibliometric indicators. The tool aims to reshape evaluation systems by recognizing diverse research outputs, such as local knowledge dissemination, open-access contributions, and public engagement. It advocates for an evaluation framework that aligns research with societal needs and fosters epistemic justice.

More info

Why is this relevant?

The Leiden Manifesto advises measuring performance against the research missions of the institution, group, or researcher. This encourages recruitment and assessment boards to consider a broader range of domains and activities in their evaluations.

Developing new indicators (qualitative and/or quantitative)

Creating and using innovative indicators that capture diverse contributions.

Recruitment and assessment boards

Responsible for conducting academic evaluations by reviewing applications and assessing qualifications, and for making recommendations or decisions on appointment or promotion. Examples of roles: Faculty recruitment committees; Promotion and tenure committees; Ethics & Diversity committees; External peer-reviewers.

Why is this relevant?

This tool offers clear guidance for developing and evaluating assessment criteria. Leadership and governance can employ it to establish processes linked to performance assessment in research.

Why is this relevant?

Within the capabilities described in the SPACE rubric, the use of indicators is included and elaborated across the levels of institutional "maturity". This supports a more holistic approach and can help institutions better support the development and implementation of new academic assessment practices and activities.

Developing new indicators (qualitative and/or quantitative)

Creating and using innovative indicators that capture diverse contributions.

Overview

The ANECA (National Agency for Quality Assessment and Accreditation of Spain) accreditation framework establishes the evaluation criteria for academic career progression in Spain. It provides a structured approach for assessing teaching, research, and institutional contributions for faculty accreditation at different career levels (Lecturer and University Professor). The tool ensures transparency, consistency, and merit-based assessment in academic hiring and promotions. It defines evaluation criteria across four dimensions of merits and competences: A. Research, transfer and exchange of knowledge; B. Teaching; C. Leadership; and D. Professional activity.

More info

Why is this relevant?

The document outlines five common myths about evaluation and provides five design principles to help institutions experiment with and develop better research assessment practices. This can be very useful for leadership and governance to understand the need for reform, but also for people in other roles within research-performing institutions.

Overview

Rethinking Research Assessment: Ideas for Action is a resource developed by DORA that outlines five common myths about research evaluation, which frequently relies on indicators like Journal Impact Factor (JIF) and similar measures as proxies for quality in research, promotion, and tenure decisions. To counter these myths, the resource offers five design principles to help research-intensive institutions experiment with and develop better research assessment practices: 1. Instill standards and structure into research assessment processes. 2. Foster a sense of personal accountability among faculty and staff. 3. Prioritize equity and transparency in research assessment. 4. Take a big picture or portfolio view toward researcher contributions. 5. Refine research assessment processes through iterative feedback.

More info

Overview

"Reimagining Academic Career Assessment: Stories of Innovation and Change" is a comprehensive report and online repository developed collaboratively by the San Francisco Declaration on Research Assessment (DORA), the European University Association (EUA), and SPARC Europe. The initiative compiles case studies from various universities and national consortia, aiming to provide inspiration and practical guidance for institutions seeking to reform their academic career assessment practices. The primary objective is to move away from traditional metrics, such as journal impact factors, and adopt more holistic evaluation methods that recognize a broader spectrum of academic activities, including teaching, open science practices, and societal impact.

More info

Leadership and governance

Responsible for defining the overarching principles, strategy and criteria guiding academic assessment. Endorsing final decisions and overseeing assessment outcomes. Examples of roles: Rector / President / Director; Vice-Rector / Vice-President; Deans of faculties/schools; Department chairs/heads

Why is this relevant?

The Framework provides structured promotion criteria and evidence formats that operationalise qualitative and quantitative indicators of teaching effectiveness (e.g. sphere of impact, forms of evidence, peer review). Administrative teams implement these into policy and appraisals.

Informing candidates on assessment criteria

Importance of providing candidates with clear and accessible information about what is expected, which criteria are used and how assessments are conducted.

Overview

A concise, one-page tool developed by the Declaration on Research Assessment (DORA) as part of the Tools to Advance Research Assessment (TARA) project. It aims to broaden the understanding of 'impact' in scholarly work by visualizing it across two dimensions: the scale of a contribution's influence and the diversity of audiences reached. It highlights various academic achievements, including open science practices, contributions to institutional policies (e.g., diversity, equity, and inclusion), societal impacts, and industry collaborations.

More info

Why is this relevant?

CoARA emphasises moving toward peer-review-led, qualitative evaluation supported by responsible indicators. Assessment boards are the ones operationalising this balance: integrating narratives, expert judgment, and metrics where relevant.

Why is this relevant?

The portfolio moves beyond publication counts to use narrative evaluation, example outputs, and mission‑relevant performance indicators. Assessment boards apply these new qualitative indicators to evaluate candidates.

Overview

The CLACSO-FOLEC (Latin American Council of Social Sciences – Latin American Forum for Research Evaluation) initiative proposes a research assessment model that prioritizes social relevance, diversity, and responsible evaluation over traditional bibliometric indicators. The tool aims to reshape evaluation systems by recognizing diverse research outputs, such as local knowledge dissemination, open-access contributions, and public engagement. It advocates for an evaluation framework that aligns research with societal needs and fosters epistemic justice.

More info

Informing candidates on assessment criteria

Importance of providing candidates with clear and accessible information about what is expected, which criteria are used and how assessments are conducted.

Overview

The Manifesto seeks to rectify the misuse of research metrics by promoting evaluations that are transparent, diverse, and reflective of the multifaceted nature of research. Its ten principles emphasize the importance of combining quantitative data with qualitative assessments, tailoring evaluations to specific research missions, and recognizing the value of locally relevant research. The principles also call for openness in data collection and analysis, allowing those evaluated to verify data, accounting for disciplinary differences, and regularly reviewing indicators to mitigate systemic effects.

More info

Broadening the scope of domains and activities considered in ACA processes

Recognising and meriting varied academic activities—including research, teaching, mentoring, leadership, societal engagement, open science and collaboration—as valid and valuable in career assessments.

Why is this relevant?

The aim of the guideline is to promote good and responsible research practices and to prevent violations of research integrity in all academic disciplines. It is targeted at research organisations, researchers and students in Finnish higher education.

Balancing the use of qualitative and quantitative indicators

Promoting a more holistic assessment approach, reflecting the breadth and depth of academic contributions, that integrates a qualitative dimension (eg. expert judgment, narratives) with metrics used appropriately and responsibly.

Overview

The CLACSO-FOLEC (Latin American Council of Social Sciences – Latin American Forum for Research Evaluation) initiative proposes a research assessment model that prioritizes social relevance, diversity, and responsible evaluation over traditional bibliometric indicators. The tool aims to reshape evaluation systems by recognizing diverse research outputs, such as local knowledge dissemination, open-access contributions, and public engagement. It advocates for an evaluation framework that aligns research with societal needs and fosters epistemic justice.

More info

Overview

The People and Teams UKRI Action Plan is designed to enhance the UK’s research and innovation system by fostering a positive and inclusive research culture. It focuses on supporting diverse career pathways, improving recognition and reward systems, and strengthening collaboration across disciplines and sectors.

More info

Why is this relevant?

By requiring evaluators to assess leadership, stakeholder collaboration, data management, and societal use—as well as scientific products—the Guide ensures panels value a wide spectrum of academic contributions.

Why is this relevant?

The document debunks common myths and gives examples of how to promote equity and transparency in research assessment processes.

Why is this relevant?

Aims to align research careers with broader labour market needs across institutions. Useful for policy development or skill development initiatives.

Rethinking what is valued and measured in academic assessment, moving beyond traditional metrics to more holistic evaluation methods, profiles and professional trajectories.

Administration and management

Responsible for operationalising academic assessment processes by implementing procedures and ensuring compliance with policies and regulations. Examples of roles: Deans of faculties/schools; Department chairs/heads; Academic Affairs Office; Human Resources (HR) Department

Overview

The Career Framework for University Teaching offers a structured, evidence-based pathway to recognise and reward teaching contributions alongside research. Developed between 2015–2018 by the Royal Academy of Engineering and global partners, it outlines four progressive levels: 1) Effective teacher; 2) Skilled & collegial teacher; 3) Institutional leader or Scholarly teacher (dual paths); 4) National & global leader. For each level, it defines (a) sphere of impact, (b) promotion criteria, and (c) forms of evidence to demonstrate achievement.

More info

Overview

Reformscape is an online resource developed by the San Francisco Declaration on Research Assessment (DORA) to support the global academic community in implementing responsible research assessment practices. It offers a comprehensive, searchable collection of policies, action plans, and standards from over 200 academic institutions worldwide, focusing on fairer and more robust criteria for hiring, promotion, and tenure decisions. The tool enables users to explore diverse approaches to academic career assessment, facilitating the adoption of practices that move beyond traditional metrics and foster inclusivity and equity.

More info

Developing new indicators (qualitative and/or quantitative)

Creating and using innovative indicators that capture diverse contributions.

Overview

The YUFE4Postdocs evaluation and selection procedure is designed to foster a fair, transparent, and competency-based assessment of postdoctoral researchers through a structured CV template and the involvement of non-academic, societal stakeholders in the (final) selection, in what is traditionally an academic procedure. It aligns with the principles of responsible research assessment, emphasizing qualitative over quantitative criteria. The selection process does not rely on quantitative metrics and relies fully on peer review.

More info

Encouraging institutions to align their assessment frameworks and practices with broader frameworks and principles at national or international levels.

Why is this relevant?

By requiring evaluators to assess leadership, stakeholder collaboration, data management, and societal use—as well as scientific products—the Guide ensures panels value a wide spectrum of academic contributions.

Ensuring transparency and fairness in ACA processes

Clearly defined, openly communicated procedures and criteria that ensure equitable treatment of candidates.

Overview

The People and Teams UKRI Action Plan is designed to enhance the UK’s research and innovation system by fostering a positive and inclusive research culture. It focuses on supporting diverse career pathways, improving recognition and reward systems, and strengthening collaboration across disciplines and sectors.

More info

Defining new career pathways (taking into account different career stages)

Developing flexible, non-linear, and hybrid career trajectories that reflect individual strengths, institutional needs, movement across sectors and evolving academic roles at all stages of career progression.

Informing candidates on assessment criteria

Importance of providing candidates with clear and accessible information about what is expected, which criteria are used and how assessments are conducted.

Informing candidates on assessment criteria

Importance of providing candidates with clear and accessible information about what is expected, which criteria are used and how assessments are conducted.

Overview

The Publication Forum (JUFO) classification, developed by the Finnish scientific community, evaluates the average quality of academic publication channels—such as journals, book publishers, and conferences—by assigning them to levels 0 to 3, with level 3 indicating the highest quality. Its primary purpose is to support the assessment of research output quality at the institutional level, particularly within Finland's university funding model. The classification emphasizes responsible research evaluation practices, aligning with international standards like DORA, the Leiden Manifesto, and the Metric Tide report.

More info

Overview

The Norwegian Career Assessment Matrix (NOR-CAM) is a framework developed by Universities Norway (UHR) to enhance the recognition and rewards system in academic careers. It aims to provide a more flexible and holistic approach to evaluating academic performance by emphasizing transparency, breadth, and systematic comprehensive assessments, supplementing responsible use of quantitative metrics of scientific publications. NOR-CAM encourages assessment in six competency areas: 1) research output, 2) research process, 3) pedagogical competencies, 4) societal impact, 5) academic leadership and 6) other competencies, as well as aligning evaluations with Open Science principles.

More info

Avoiding the use of inappropriate metrics for assessing academics

Moving away from using journal- and publication-based metrics (e.g. JIF, AIS, h-index) in academic career assessment as a proxy for quality and impact.

Overview

Rethinking Research Assessment: Ideas for Action is a resource developed by DORA that outlines five common myths about research evaluation, which frequently relies on indicators like Journal Impact Factor (JIF) and similar measures as proxies for quality in research, promotion, and tenure decisions. To counter these myths, the resource offers five design principles to help research-intensive institutions experiment with and develop better research assessment practices: 1. Instill standards and structure into research assessment processes. 2. Foster a sense of personal accountability among faculty and staff. 3. Prioritize equity and transparency in research assessment. 4. Take a big picture or portfolio view toward researcher contributions. 5. Refine research assessment processes through iterative feedback.

More info

Recognising the importance of involving the wide academic community, at all seniority levels, in reflections, design and implementation of academic career assessment reforms.

Overview

The Finnish 'Policy for Open Scholarship' provides a national framework to support open science practices across research institutions. Its aim is to embed openness into the research culture, fostering transparency, reproducibility, and societal impact. Key pillars include open access to research publications, data, methods, and education. The policy encourages systemic change in research evaluation, rewarding openness in academic careers.

More info

Broadening the scope of domains and activities considered in ACA processes

Recognising and meriting varied academic activities—including research, teaching, mentoring, leadership, societal engagement, open science and collaboration—as valid and valuable in career assessments.

Overview

ResearchComp is a European framework outlining the transversal skills that researchers need for successful and interoperable careers in all sectors of society (academia, industry, business, public administration, NGOs etc.). It establishes a common language and a common understanding of researchers’ transversal competences in areas such as knowledge creation, communication, leadership, and impact. The framework aims to align research careers with broader labour market needs, helping institutions, funders, and policymakers establish transparent career progression pathways.

More info

Overview

The Coalition for Advancing Research Assessment (CoARA) Agreement aims to reform research assessment practices by shifting away from quantitative metrics (e.g., journal impact factors) towards a more qualitative and holistic approach. It promotes recognition of diverse research contributions, including Open Science, societal impact, and team-based research. The agreement is voluntary and seeks commitment from research institutions, funders, and other stakeholders to implement responsible assessment practices.

More info

Why is this relevant?

This concise, one-page tool can help academics define their career goals.

Overview

The document builds on a 2019 study by Science Europe and Technopolis Group, examining how research organisations conduct assessment for funding and career progression. Its purpose is to strengthen existing processes—ensuring they are effective, efficient, fair, transparent, and aligned with evolving research practices such as open science and AI. The Statement issues seven core recommendations covering: Transparency; Evaluating robustness; Bias mitigation; Cost/efficiency; Reviewer diversity; Qualitative assessment; and Novel approaches.

More info

Leadership and governance

Responsible for defining the overarching principles, strategy and criteria guiding academic assessment. Endorsing final decisions and overseeing assessment outcomes. Examples of roles: Rector / President / Director; Vice-Rector / Vice-President; Deans of faculties/schools; Department chairs/heads

Overview

Reformscape is an online resource developed by the San Francisco Declaration on Research Assessment (DORA) to support the global academic community in implementing responsible research assessment practices. It offers a comprehensive, searchable collection of policies, action plans, and standards from over 200 academic institutions worldwide, focusing on fairer and more robust criteria for hiring, promotion, and tenure decisions. The tool enables users to explore diverse approaches to academic career assessment, facilitating the adoption of practices that move beyond traditional metrics and foster inclusivity and equity.

More info

Recruitment and assessment boards

Responsible for conducting academic evaluations by reviewing applications and assessing qualifications, and for making recommendations or decisions on appointment or promotion. Examples of roles: Faculty recruitment committees; Promotion and tenure committees; Ethics & Diversity committees; External peer-reviewers.

Overview

The People and Teams UKRI Action Plan is designed to enhance the UK’s research and innovation system by fostering a positive and inclusive research culture. It focuses on supporting diverse career pathways, improving recognition and reward systems, and strengthening collaboration across disciplines and sectors.

More info

Why is this relevant?

Overall, the Plan concerns the development of a strategy on career pathways and career advancement for academic and technical staff. Administration and management can refer to these guidelines to assess their progress in terms of the criteria adopted in administrative and managerial decision-making.

Why is this relevant?

NOR-CAM provides a structured framework to broaden academic assessments by recognising and meriting varied academic activities—including research, teaching, mentoring, leadership, societal engagement, open science and collaboration.

Overview

Balanced, Broad, Responsible: A Practical Guide for Research Evaluators is a resource that aims to promote a holistic approach to the evaluation of funding proposals by moving beyond traditional quantitative metrics. The tool comprises a concise video and an accompanying one-page brief, offering six practical suggestions (“checklist”) for research funders who are seeking to implement or improve responsible assessment of funding applications.

More info


Broadening the scope of domains and activities considered in ACA processes

Recognising and meriting varied academic activities—including research, teaching, mentoring, leadership, societal engagement, open science and collaboration—as valid and valuable in career assessments.


Leadership and governance

Responsible for defining the overarching principles, strategy and criteria guiding academic assessment. Endorsing final decisions and overseeing assessment outcomes. Examples of roles: Rector / President / Director; Vice-Rector / Vice-President; Deans of faculties/schools; Department chairs/heads

Why is this relevant?

By meriting a broader set of activities, NOR-CAM facilitates flexible, non-linear, and hybrid career trajectories that reflect individual strengths, institutional needs, movement across sectors and evolving academic roles at all stages of career progression.

Why is this relevant?

The one-pager gives guidance on optimizing, evaluating, and iterating on the use of narrative CVs to ensure comprehensive recognition of diverse research contributions and to create fairer assessment mechanisms.

Overview

The 2023 survey by the Federation of Finnish Learned Societies (TSV) explores researchers’ perspectives on the diversity of career assessment criteria in Finland, aiming to inform the development of the Finnish Career Assessment Matrix (FIN-CAM) tool. This initiative aligns with the broader responsible research assessment agenda, including the CoARA Agreement and builds upon frameworks such as OS-CAM and NOR-CAM.

More info

Why is this relevant?

This document shares five good practices for reforming academic career paths across a range of universities, not only to give practical insight into the steps to take, but also to inspire action.

Overview

The Career Framework for University Teaching offers a structured, evidence-based pathway to recognise and reward teaching contributions alongside research. Developed between 2015–2018 by the Royal Academy of Engineering and global partners, it outlines four progressive levels: 1) Effective teacher; 2) Skilled & collegial teacher; 3) Institutional leader or Scholarly teacher (dual paths); 4) National & global leader. For each level, it defines (a) sphere of impact, (b) promotion criteria, and (c) forms of evidence to demonstrate achievement.

More info

Why is this relevant?

A best-practice example of ensuring equality, diversity and inclusiveness in ACA processes by involving non-academic, societal stakeholders in the (final) selection, in what is traditionally an academic procedure. Implementation requires institutional (leadership) commitment and support.

Defining new career pathways (taking into account different career stages)

Developing flexible, non-linear, and hybrid career trajectories that reflect individual strengths, institutional needs, movement across sectors and evolving academic roles at all stages of career progression.

Why is this relevant?

CoARA encourages institutions to review and develop new assessment criteria, tools, and processes. Leaders and governance structures are the ones who initiate and guide this innovation strategically—choosing which new indicators to adopt, resourcing development, and embedding them in institutional frameworks.

Defining new career pathways (taking into account different career stages)

Developing flexible, non-linear, and hybrid career trajectories that reflect individual strengths, institutional needs, movement across sectors and evolving academic roles at all stages of career progression.

Overview

The OTM-R “Package”—developed by the ERA Steering Group on HR and Mobility—offers a holistic set of tools to help institutions adopt recruitment practices that are open, transparent, and merit-based. Its key components are: - Rationale & Guidelines: Explaining why OTM-R benefits researchers, institutions and the ERA. - Self-Assessment Checklist: About 20 questions spanning policy, advertising, selection, and appointment. - Toolkit: A step-by-step guide from advertising to appointment, including good-practice examples.

More info

Balancing the use of qualitative and quantitative indicators

Promoting a more holistic assessment approach, reflecting the breadth and depth of academic contributions, that integrates a qualitative dimension (eg. expert judgment, narratives) with metrics used appropriately and responsibly.

Encouraging institutions to align their assessment frameworks and practices with broader frameworks and principles at national or international levels.

Selection of tools and resources

The tools and resources included in this framework have been drawn from:
  • The case studies developed by the CoARA Working Group on Reforming Academic Career Assessment (WG ACA)
  • The Agreement on Reforming Academic Career Assessment (see Annex 4)

We will continue to expand the collection based on feedback from: CoARA National Chapters, other CoARA Working Groups, institutions and organisations that test the framework.

Overview

The Initiative advocates that scholarly communication should not be monolingual. It calls for balanced multilingual dissemination to ensure both global excellence and local relevance of research. Core principles: support researchers in sharing results beyond academia in varied languages; safeguard national-language publishing infrastructure and support the transition to Open Access; integrate language diversity into research evaluation, funding, and metrics.

More info

Academic staff

Responsible for presenting evidence of academic achievements and competencies for assessment, in line with institutional criteria. Examples of roles: Early-career researcher (R1); Recognised researcher (R2); Established researcher (R3); Leading researcher (R4).

Focus on expanding the assessment of academic careers to include a wider range of roles, activities, and contributions. Emphasising meriting multiple, flexible career paths that reflect diverse profiles and professional trajectories.

Why is this relevant?

The Dialogue Toolkit offers a rich selection of conversational structures and techniques to help start difficult conversations on career assessment reform. It is designed to be used by anyone, but will perhaps be of most use to HR managers or policy staff interested in kickstarting a reform discussion at their institution.

Recognising the importance of involving the wide academic community, at all seniority levels, in reflections, design and implementation of academic career assessment reforms.

Overview

The Recommendation offers the first international framework defining open science, its values, principles and actionable pathways toward fair and equitable scientific openness. It seeks to reduce knowledge and technology gaps and bolster science as a global public good. It emphasizes inclusiveness, transparency, reproducibility, and engagement with broader society.

More info

Why is this relevant?

This document identifies strategies for including more perspectives and reducing biases in the evaluation processes for hiring, promotion, tenure, and funding decisions.

Overview

In the interview featured in Recognition & Rewards Magazine, Rianne Letschert (President of Maastricht University) and Victor Bekkers (Dean of the Erasmus School of Social and Behavioural Sciences) discuss the imperative for diversifying academic career paths within Dutch universities. They emphasize that the traditional model, which expects academics to excel simultaneously in research, teaching, leadership, and societal impact, is unsustainable and often leads to burnout and attrition. This model disproportionately rewards research output, neglecting other vital contributions such as teaching excellence, leadership, and societal engagement.

More info

Raising awareness on the need for reform

Building institutional momentum by engaging academic communities in discussions about the limitations of current systems and the potential benefits of reform.

Ensuring transparency and fairness in ACA processes

Clearly defined, openly communicated procedures and criteria that ensure equitable treatment of candidates.

Avoiding the use of inappropriate metrics for assessing academics

Moving away from using journal- and publication-based metrics (e.g. JIF, AIS, h-index) in academic career assessment as a proxy for quality and impact.

Engaging a variety of institutional actors in ACA reforms

Promoting collective ownership of assessment change processes by involving various institutional actors (e.g. leadership, HR, funders, academics, and support staff) in co-design and implementation.

Developing new indicators (qualitative and/or quantitative)

Creating and using innovative indicators that capture diverse contributions.

Administration and management

Responsible for operationalising academic assessment processes by implementing procedures and ensuring compliance with policies and regulations. Examples of roles: Deans of faculties/schools; Department chairs/heads; Academic Affairs Office; Human Resources (HR) Department

Why is this relevant?

The Initiative calls for research assessment systems to treat all publication languages equitably, avoiding bias based on language. Recruitment and assessment boards should ensure fair and neutral evaluation of multidisciplinary and multilingual outputs.

Overview

The SCOPE framework is a practical, step-by-step model developed by the International Network of Research Management Societies (INORMS) Research Evaluation Group to guide responsible research evaluation. It aims to bridge the gap between high-level principles (like DORA and the Leiden Manifesto) and their practical implementation in research assessment. SCOPE is a five-stage process:

  1. Start with what you value: Identify and articulate the core values and objectives driving the evaluation.
  2. Context considerations: Understand the specific context, including disciplinary norms and institutional goals.
  3. Options for measuring: Explore various qualitative and quantitative methods suitable for the evaluation.
  4. Probe deeply: Critically assess chosen methods for biases and unintended consequences.
  5. Evaluate your evaluation: Reflect on the evaluation process to ensure it aligns with initial values and objectives.
This framework encourages evaluations that are value-driven, context-sensitive, and methodologically sound.

More info

Overview

The HR Excellence in Research (HRS4R) Award is a recognition granted by the European Commission to research institutions committed to improving working conditions for researchers. It supports the implementation of the European Charter for Researchers and the Code of Conduct for the Recruitment of Researchers (Charter & Code), fostering transparent recruitment and professional development opportunities. The tool is voluntary but offers institutions a structured framework to enhance their HR policies in alignment with European best practices.

More info

Why is this relevant?

UMC Utrecht’s portfolio explicitly requires candidates to present evidence across five mission‑aligned domains (scientific, teaching, clinical, innovation & impact, leadership & collaboration), operationalized through administrative teams (HR, Academic Affairs) that design and oversee the portfolio templates, guidance, and submission processes.

Providing training to assessors and assessees

Covers initiatives for ongoing capacity-building for evaluators and candidates to enhance understanding and application of new assessment principles, criteria and practices.

Broadening the scope of domains and activities considered in ACA processes

Recognising and meriting varied academic activities—including research, teaching, mentoring, leadership, societal engagement, open science and collaboration—as valid and valuable in career assessments.

Why is this relevant?

The Leiden Manifesto advises against using inappropriate metrics and advocates for regular scrutiny of indicators. Recruitment and assessment boards are tasked with ensuring that only appropriate and effective metrics are used in evaluations.

Overview

The Norwegian Career Assessment Matrix (NOR-CAM) is a framework developed by Universities Norway (UHR) to enhance the recognition and rewards system in academic careers. It aims to provide a more flexible and holistic approach to evaluating academic performance by emphasizing transparency, breadth, and systematic comprehensive assessments, supplementing responsible use of quantitative metrics of scientific publications. NOR-CAM encourages assessment across six competency areas: 1) research output, 2) research process, 3) pedagogical competencies, 4) societal impact, 5) academic leadership, and 6) other competencies, as well as aligning evaluations with Open Science principles.

More info

Broadening the scope of domains and activities considered in ACA processes

Recognising and meriting varied academic activities—including research, teaching, mentoring, leadership, societal engagement, open science and collaboration—as valid and valuable in career assessments.

Why is this relevant?

Academic staff must actively document and narrate contributions across five mission-aligned domains—scientific research, teaching, clinical practice, innovation & impact, and leadership—thereby directly operationalising the goal of broadening valued academic activities.

Focus on expanding the assessment of academic careers to include a wider range of roles, activities, and contributions, and on recognising and rewarding multiple, flexible career paths that reflect diverse profiles and professional trajectories.

Why is this relevant?

The CoARA Agreement explicitly calls for ending the inappropriate uses of the JIF and h-index in assessment. The Agreement, together with the signatories' Action Plans, provides leaders with documented practices to guide reform.

Overview

The article "Good Practices: Career Paths" from Recognition & Rewards Magazine outlines five innovative initiatives by Dutch academic institutions aimed at diversifying and modernizing career trajectories within academia. These practices are part of the broader Recognition & Rewards programme, which seeks to move beyond the traditional emphasis on research performance by acknowledging excellence in teaching, leadership, societal impact, and patient care.

More info

Ensuring that assessment criteria and procedures actively value and accommodate different backgrounds, career paths and life circumstances, reducing biases.

Help us improve this framework — share your feedback!

Overview

Developed by UMC Utrecht, this guide shifts focus from output-based evaluations to formative assessments that prioritize societal impact. It encourages evaluators to consider the purpose and process of research, asking "why are you doing this research?" instead of "what have you measurably produced?"

More info

What do you want to do?

Diversify and recognise career pathways

Align organisational ACA processes with national or international initiatives

Integrate equality, diversity, inclusiveness in ACA processes

Review assessment criteria

Promote transparency and fairness

Foster engagement of the academic community in ACA reforms

Overview

The Publication Forum (JUFO) classification, developed by the Finnish scientific community, evaluates the average quality of academic publication channels—such as journals, book publishers, and conferences—by assigning them to levels 0 to 3, with level 3 indicating the highest quality. Its primary purpose is to support the assessment of research output quality at the institutional level, particularly within Finland's university funding model. The classification emphasizes responsible research evaluation practices, aligning with international standards like DORA, the Leiden Manifesto, and the Metric Tide report.

More info

Providing training to assessors and assessees

Covers initiatives for ongoing capacity-building for evaluators and candidates to enhance understanding and application of new assessment principles, criteria and practices.

Why is this relevant?

The Helsinki Initiative proposes that academic evaluations must value research outputs published in any language, treating local-language publications as legitimately meritorious. Assessment boards should consider merits beyond English-language outputs when evaluating candidates.

Why is this relevant?

Recognizing activities like data sharing, mentoring, and reproducibility aligns with the HKP's principle of valuing diverse contributions. Administrative teams operationalize these broader criteria in institutional procedures.

Why is this relevant?

This tool offers clear guidance on developing and evaluating assessment criteria. Academic staff can use it to assess their progress and to create a transparent record of their achievements.

Overview

The Career Framework for University Teaching offers a structured, evidence-based pathway to recognise and reward teaching contributions alongside research. Developed between 2015 and 2018 by the Royal Academy of Engineering and global partners, it outlines four progressive levels:
  1. Effective teacher
  2. Skilled & collegial teacher
  3. Institutional leader or Scholarly teacher (dual paths)
  4. National & global leader
For each level, it defines (a) the sphere of impact, (b) promotion criteria, and (c) forms of evidence to demonstrate achievement.

More info

Overview

The Metric Tide (HEFCE, July 2015), led by Prof. James Wilsdon, provides a comprehensive, evidence-based evaluation of how quantitative metrics—like citation counts, altmetrics, journal impact factors—are used and misused in research assessment and management. It balances peer review with metrics, exploring historical trends, disciplinary differences, and the unintended consequences of metric-driven cultures, such as “gaming” behaviours and impacts on equality, diversity, and interdisciplinarity.

More info

Why is this relevant?

The purpose of the tool is to assist research organisations in self-evaluating and developing their services and making them openly available. Measures promoting openness in evaluation, learning, research data and publishing are made concrete with minimum and ideal criteria. The tool also contains a checklist for the responsible evaluation of researchers.

Why is this relevant?

This initiative effectively opens traditional academic procedures to non-academic assessors by providing specialized training and a clear, structured process, thereby fostering mutual understanding between conventional and non-traditional evaluators.

Overview

The Coalition for Advancing Research Assessment (CoARA) Agreement aims to reform research assessment practices by shifting away from quantitative metrics (e.g., journal impact factors) towards a more qualitative and holistic approach. It promotes recognition of diverse research contributions, including Open Science, societal impact, and team-based research. The agreement is voluntary and seeks commitment from research institutions, funders, and other stakeholders to implement responsible assessment practices.

More info

Overview

The Finnish Guidelines on Research Integrity (RI Guidelines), issued by the Finnish National Board on Research Integrity (TENK) in 2023, aim to promote good research practices and a responsible research culture across all disciplines. They provide a framework for self-regulation within the research community, outlining procedures for handling alleged violations of research integrity. Key updates include shortened investigation timelines, the introduction of Research Integrity Advisers, alignment with international classification of violations, and the inclusion of severity assessments for violations.

More info

Broadening the scope of domains and activities considered in ACA processes

Recognising and meriting varied academic activities—including research, teaching, mentoring, leadership, societal engagement, open science and collaboration—as valid and valuable in career assessments.

Overview

The HR Excellence in Research (HRS4R) Award is a recognition granted by the European Commission to research institutions committed to improving working conditions for researchers. It supports the implementation of the European Charter for Researchers and the Code of Conduct for the Recruitment of Researchers (Charter & Code), fostering transparent recruitment and professional development opportunities. The tool is voluntary but offers institutions a structured framework to enhance their HR policies in alignment with European best practices.

More info

Overview

The Agencia Nacional de Evaluación de la Calidad y Acreditación (ANECA) introduced a revised Code of Ethics on November 22, 2023, to enhance the integrity and transparency of the public servants involved in its evaluation and accreditation processes. This updated code consolidates previous ethical guidelines into a unified framework, emphasizing principles such as legality, objectivity, independence, transparency, and professional ethics. It aims to ensure that all personnel and collaborators adhere to high ethical standards, fostering trust in ANECA's operations.

More info

Why is this relevant?

UC Berkeley’s rubric framework mandates numerical scoring alongside qualitative discussion (e.g. research quality, alignment, teaching philosophy, mentoring), with calibrated consensus-building to ensure consistent interpretation—placing the responsibility on assessment panels to integrate both forms of evidence effectively.

Overview

DORA's 'Ideas for Optimization' provides guidance for optimizing, evaluating, and iterating on the use of narrative CVs to ensure comprehensive recognition of diverse research contributions. It focuses on practical implementation and continuous improvement to create fairer assessment mechanisms.

More info

Recruitment and assessment boards

Responsible for conducting academic evaluations by reviewing applications and assessing qualifications, and for making recommendations or decisions on appointment or promotion. Examples of roles: Faculty recruitment committees; Promotion and tenure committees; Ethics & Diversity committees; External peer-reviewers.

Overview

The document builds on a 2019 study by Science Europe and Technopolis Group, examining how research organisations conduct assessment for funding and career progression. Its purpose is to strengthen existing processes—ensuring they are effective, efficient, fair, transparent, and aligned with evolving research practices such as open science and AI. The Statement issues seven core recommendations covering: Transparency; Evaluating robustness; Bias mitigation; Cost/efficiency; Reviewer diversity; Qualitative assessment; and Novel approaches.

More info

Academic staff

Responsible for presenting evidence of academic achievements and competencies for assessment, in line with institutional criteria. Examples of roles: Early-career researcher (R1); Recognised researcher (R2); Established researcher (R3); Leading researcher (R4).

Why is this relevant?

Overall, the Plan concerns the development of a strategy on career pathways and career advancement for academic and technical staff. Leaders and Governors can refer to this plan to initiate a discussion of areas in need of reform within the institution.

Why is this relevant?

The Recommendation urges responsible bodies to develop policy frameworks and indicators that capture open science contributions—such as open data sharing, public engagement, and reproducibility. Administrative teams are responsible for operationalizing these into actionable assessment tools and processes.

Overview

The People and Teams UKRI Action Plan is designed to enhance the UK’s research and innovation system by fostering a positive and inclusive research culture. It focuses on supporting diverse career pathways, improving recognition and reward systems, and strengthening collaboration across disciplines and sectors.

More info

Overview

Rethinking Research Assessment: Ideas for Action is a resource developed by DORA that outlines five common myths about research evaluation, which frequently relies on indicators like the Journal Impact Factor (JIF) and similar measures as proxies for quality in research, promotion, and tenure decisions. To counter these myths, the resource offers five design principles to help research-intensive institutions experiment with and develop better research assessment practices:
  1. Instill standards and structure into research assessment processes.
  2. Foster a sense of personal accountability among faculty and staff.
  3. Prioritize equity and transparency in research assessment.
  4. Take a big-picture or portfolio view toward researcher contributions.
  5. Refine research assessment processes through iterative feedback.

More info

Practical, adaptable methods or instruments used to directly evaluate, measure, or improve specific aspects of an academic career. Designed for action and change in assessment practices.

Tools

Resources

Informative materials that provide background, context, or guidance on academic career assessment. Support reflection and help inform the use or development of tools.

Examples of tools:

  • NOR-CAM – A toolbox for recognition and rewards in academic careers
  • ANECA narrative CV template
  • Self-evaluation tool for culture of open scholarship services (Finland)
  • DORA Ideas for Optimization: Five Things to Consider for Narrative CVs
  • Recognition and rewards - Dialogue Toolkit
  • EU ResearchComp
  • Recognising and rewarding open research maturity framework

Examples of resources:

  • CoARA - Agreement on reforming research assessment
  • CLACSO-FOLEC Declaration of principles
  • INORMS - Tools for rethinking global university rankings
  • DORA Reformscape
  • Recognition and rewards - Interview on the development of career paths
  • Researchers' views on diversity of career assessment criteria in Finland: a survey report

&

Go to resources

Go to tools

Help us improve this framework — share your feedback!

Why is this relevant?

Overall, the Plan concerns the development of a strategy on career pathways and career advancement for academic and technical staff. Administration and management can refer to these guidelines to assess their progress in terms of criteria adopted in administrative and managerial decision making.

Overview

The Recommendation offers the first international framework defining open science, its values, principles and actionable pathways toward fair and equitable scientific openness. It seeks to reduce knowledge and technology gaps and bolster science as a global public good. It emphasizes inclusiveness, transparency, reproducibility, and engagement with broader society.

More info

Defining new career pathways (taking into account different career stages)

Developing flexible, non-linear, and hybrid career trajectories that reflect individual strengths, institutional needs, movement across sectors and evolving academic roles at all stages of career progression.

Why is this relevant?

The Guide provides structure–process–outcome categories with bespoke indicators (e.g., stakeholder involvement, infrastructure, open data) that evaluators should apply in reviews.

Overview

UC Berkeley’s rubric aims to standardize and increase transparency, equity, and consistency in faculty searches. It provides a structured template where committees define categories (e.g. Research, Teaching, DEI), assign explicit scoring scales (typically 1–5), and clarify evidence that informs how scores are assigned. A core element is the calibration exercise, where committee members pre-score sample applications, compare outcomes, discuss discrepancies, and recalibrate scoring norms before evaluating the full candidate pool. The framework includes sample categories:
  • Research: productivity, alignment, plans.
  • Teaching & Mentoring: experience, interests, mentoring approach.
A separate rubric addresses Contributions to Diversity, Equity, and Inclusion (DEI).

More info

Leadership and governance

Responsible for defining the overarching principles, strategy and criteria guiding academic assessment. Endorsing final decisions and overseeing assessment outcomes. Examples of roles: Rector / President / Director; Vice-Rector / Vice-President; Deans of faculties/schools; Department chairs/heads

Overview

The Metric Tide (HEFCE, July 2015), led by Prof. James Wilsdon, provides a comprehensive, evidence-based evaluation of how quantitative metrics—like citation counts, altmetrics, journal impact factors—are used and misused in research assessment and management. It balances peer review with metrics, exploring historical trends, disciplinary differences, and the unintended consequences of metric-driven cultures, such as “gaming” behaviours and impacts on equality, diversity, and interdisciplinarity.

More info

Defining new career pathways (taking into account different career stages)

Developing flexible, non-linear, and hybrid career trajectories that reflect individual strengths, institutional needs, movement across sectors and evolving academic roles at all stages of career progression.

Overview

The Norwegian Career Assessment Matrix (NOR-CAM) is a framework developed by Universities Norway (UHR) to enhance the recognition and rewards system in academic careers. It aims to provide a more flexible and holistic approach to evaluating academic performance by emphasizing transparency, breadth, and systematic comprehensive assessments, supplementing responsible use of quantitative metrics of scientific publications. NOR-CAM encourages assessment across six competency areas: 1) research output, 2) research process, 3) pedagogical competencies, 4) societal impact, 5) academic leadership, and 6) other competencies, as well as aligning evaluations with Open Science principles.

More info

Overview

The Hong Kong Principles (HKPs) were formulated at the 6th World Conference on Research Integrity (June 2019) and published in PLOS Biology in July 2020. They aim to shift research assessment from quantity-based metrics to behaviours that underpin trustworthiness—rigor, transparency, and openness. The framework outlines five principles:

  • Responsible research practices
  • Transparent reporting
  • Open science
  • Valuing a diversity of research types
  • Recognizing broader scholarly contributions
Each principle is accompanied by a rationale and real-world examples where they’ve been adopted.

More info

Why is this relevant?

Adoption of the Framework involves institutional training—through guidance documents, workshops, and appraisal support—to help academics and evaluators apply the teaching career criteria consistently. HR or Academic Affairs often lead these capacity-building efforts.

Providing training to assessors and assessees

Covers initiatives for ongoing capacity-building for evaluators and candidates to enhance understanding and application of new assessment principles, criteria and practices.

Why is this relevant?

The SPACE rubric can help organizations to ensure that equality, diversity and inclusiveness are integrated in institutional ACA processes.

Overview

The Hong Kong Principles (HKPs) were formulated at the 6th World Conference on Research Integrity (June 2019) and published in PLOS Biology in July 2020. They aim to shift research assessment from quantity-based metrics to behaviours that underpin trustworthiness—rigor, transparency, and openness. The framework outlines five principles:

  • Responsible research practices
  • Transparent reporting
  • Open science
  • Valuing a diversity of research types
  • Recognizing broader scholarly contributions
Each principle is accompanied by a rationale and real-world examples where they’ve been adopted.

More info

Overview

DORA's 'Ideas for Optimization' provides guidance for optimizing, evaluating, and iterating on the use of narrative CVs to ensure comprehensive recognition of diverse research contributions. It focuses on practical implementation and continuous improvement to create fairer assessment mechanisms.

More info

Developing new indicators (qualitative and/or quantitative)

Creating and using innovative indicators that capture diverse contributions.

Academic staff

Responsible for presenting evidence of academic achievements and competencies for assessment, in line with institutional criteria. Examples of roles: Early-career researcher (R1); Recognised researcher (R2); Established researcher (R3); Leading researcher (R4).