OPUS News

GraspOS Webinar to Explore OPUS Framework for Reforming Researcher Assessment

The sixth webinar in the GraspOS series will address reforming researcher assessment practices on Wednesday, 29 October 2025, from 15:30 to 16:30 CET. This online event will focus on how the OPUS project is transforming the way universities and research funders evaluate researchers and reward Open Science contributions.

Traditional research assessment has long concentrated on publications in high-impact factor journals and bibliometric indicators, overlooking the diverse range of activities researchers perform in their daily work. Moreover, the recent shift towards Open Science has been largely ignored in assessment processes. The OPUS (Open and Universal Science) project addresses these shortcomings by developing a comprehensive Researcher Assessment Framework (RAF) that offers universities and research funders a suite of indicators to transform how they assess researchers and incentivise Open Science practices.

Framework Structure and Approach

The OPUS Researcher Assessment Framework encompasses four key domains of researcher activities: research, education, leadership, and valorisation. The research domain covers proposal writing, methods development, data management, software development, publications, research materials creation, and peer review. The education domain includes course development, educational resource creation, teaching, student supervision, and professional skills development. Leadership focuses on personnel and project management, organisational roles, and recognition for expert contributions. The valorisation domain encompasses science communication, academic and intersectoral collaboration, and entrepreneurship.

The RAF provides organisations with flexibility to select indicators that align with their strategic interests and needs. It divides indicators into processes, outputs, and outcomes, recognising that researchers may be at different stages of activities during assessment. The framework encourages a combination of qualitative and responsible quantitative approaches to evaluation.

Expert Presentation

The webinar will feature Gareth O’Neill from Technopolis Group and the OPUS project as the main speaker. O’Neill serves as Principal Consultant on Open Science at Technopolis Group Belgium, where his work focuses on implementing Open Science across Europe, with particular emphasis on sustainable careers for researchers, research assessment reform, and the European Open Science Cloud development.

Zenia Xenou from OpenAIRE will moderate the session.

Context and Significance

GraspOS, a Horizon Europe project, aims to transform research assessment by embedding Open Science principles into evaluation practices. It is developing a federated infrastructure that provides tools, services, and guidance to support responsible research assessment at every level, from individual researchers to national systems.

The OPUS project has been testing its framework through pilot implementations at multiple institutions, tailoring assessment models to different organisational contexts. The framework aligns with broader European policy initiatives, including the Coalition for Advancing Research Assessment (COARA) and the ERA Policy Agenda 2025-2027.

Participants will learn how organisations can use the OPUS Researcher Assessment Framework to reform researcher evaluation at their institutions and incentivise Open Science practices. The virtual event will be hosted via Zoom with support from OpenAIRE.

A Review of Open Science Literature – Mapping the Academic Landscape for Transformational Change

The advancement of open science represents one of the most significant transformations in contemporary research culture, yet the path towards meaningful implementation remains fraught with complex challenges. The Open and Universal Science (OPUS) project recognised that understanding the existing academic discourse was fundamental to developing effective interventions and metrics for open science adoption. Through its Work Package 1, the project undertook a systematic review of open science literature, building upon initial landscaping efforts to capture the evolving academic conversation around this critical domain.

The literature review identified emerging trends, persistent challenges, and new evidence across five critical dimensions of open science implementation

The OPUS literature review, documented in Deliverable 1.4, served as a foundation for the project’s broader mission to reform research assessment systems and incentivise open science practices. This academic analysis was designed not merely as an intellectual exercise, but as a strategic tool to inform the development of practical interventions and measurable indicators for open science implementation.

The primary purpose of this literature review was to update and expand upon the initial landscaping conducted at the project’s outset, systematically examining academic publications released since 2023 to identify emerging trends, persistent challenges, and new evidence across five critical dimensions of open science implementation. The review was structured to provide direct input to the project’s subsequent work packages, particularly the identification of interventions for open science rewards and incentives (WP2) and the development of metrics and indicators (WP3).

A rigorous methodological approach maintained consistency with the project’s earlier landscaping efforts whilst adapting to the reduced volume of recent literature. The team employed the SCOPUS database as the primary search tool, complemented by Google Scholar searches and analysis of grey literature to ensure appropriate coverage. This systematic approach enabled the project to trace the evolution of academic thinking across five interconnected domains: research assessment and incentives, career precarity, gender equality, industry practices, and trust in science.

The review’s significance extended beyond mere documentation. By maintaining a consistent methodological framework throughout the project’s duration, the team was able to identify not only what new knowledge had emerged, but also where critical gaps persisted in the academic understanding of open science implementation. This gap analysis proved instrumental in shaping the project’s practical outputs and highlighting areas requiring further research attention.

1. Incentives and Rewards for Open Science and Research Assessment

The examination of incentives and rewards for open science, intrinsically linked to the broader question of research assessment reform, represented the most robust area of academic discourse identified in the literature review. This focus reflected the central importance of assessment systems in driving researcher behaviour and the growing recognition that traditional metrics-based approaches were insufficient for capturing the full value of open science practices.

The research team conducted a systematic search of academic literature published since 2023, employing carefully constructed search terms that combined concepts of research careers, incentives, rewards, and assessment with open science terminology. The search strategy utilised multiple combinations of terms to ensure in-depth coverage, including variations such as “research* career*”, “incentiv*”, “reward*”, and “research* assess*” paired with “Open Science” and “Open Research”.

This approach yielded ten new articles, seven of which contributed directly to the research question: whether literature published since the first OPUS landscaping offered further evidence on existing incentives and rewards for encouraging open science adoption. The team supplemented the SCOPUS database search with Google Scholar investigations and reference-list analysis to ensure no significant publications were overlooked.

Key Findings

The updated literature review confirmed and expanded upon the project’s initial findings regarding the urgent need for research assessment reform. The academic discourse consistently demonstrated that current assessment systems remain overly focused on traditional bibliometric indicators, particularly peer-reviewed publications and citations in high-impact journals. This persistent emphasis on quantitative measures continued to undervalue the transparency, reproducibility, and collaborative elements central to open science practices.

Particularly significant was a global study involving 230 researchers that revealed striking insights into the research community’s preferences for open science recognition. The study found that researchers favoured the inclusion of open science indicators in research evaluation and career progression processes above all other potential incentives. This was followed by tangible funding incentives specifically earmarked for open science activities. Remarkably, the study indicated that researchers preferred qualitative assessments supplemented by quantitative measures, rather than purely metric-driven approaches.

The literature also revealed concerning conceptual challenges, with one 2024 article warning against uncritical incentivisation of open research within existing frameworks of academic capitalism. This analysis suggested that without significant systemic reform, open science initiatives might be appropriated as tools of institutional control, potentially increasing bureaucratic burden and workload for researchers without delivering meaningful change.

International perspectives enriched the analysis, with studies from Ukraine, Korea, and South Africa all confirming the global nature of assessment system inadequacies. Korean research identified traditional evaluation systems and insufficient funder incentives as the most serious barriers to open science implementation, whilst South African analysis called for fundamental realignment of values, policies, and assessment processes to support open access and open science advancement.

Literature Related to Rewards, Incentives, and Research Assessment

The following publications formed the evidence base for this analysis:

  • Mfengu, A. (2025) – “Socially Just Research Impact Assessment as a Foundation for the Advancement of Open Access and Open Science: A Case of a Research-Intensive African University” – This South African case study explored how higher education institutions could advance open science through equitable performance evaluation, emphasising the need to realign institutional values and assessment processes.
  • Feenstra, R.A., Carretero García, C. & Gómez Nicolau, E. (2025) – “Perception of Research Misconduct in a Spanish University” – While not explicitly focused on open science, this empirical study provided insights into how evaluation system pressures contribute to research misconduct, highlighting the role of bibliometric indicators and publication pressure.
  • Liu, Z., Wang, X. T., Wang, Z., Yan, W., & Hu, M. (2025) – “Registered reports in psychology across scholarly citations and public dissemination: A comparative metaevaluation of more than a decade of practice” – This analysis examined how registered reports, a key open science practice, affected academic recognition and public attention, revealing concerning citation disadvantages that could deter researcher participation.
  • Grattarola F. et al. (2024) – “Gaps between Open Science activities and actual recognition systems: Insights from an international survey” – This global survey of 230 researchers provided empirical evidence about researcher preferences for open science recognition, revealing strong support for policy-based incentives over purely financial rewards.
  • Thomas J. Hostler (2024) – “Open Research Reforms and the Capitalist University: Areas of Opposition and Alignment” – Using academic capitalism theory, this article provided a critical analysis of potential socio-political consequences of open research reforms, warning against implementation within existing neoliberal frameworks.
  • Izarova, I., Bediukh, O., Hartman, Y., & Baklazhenko, Y. (2024) – “From Quantity to Quality: Evaluating Scientific Work of Tertiary School Teachers and Researchers: (Case Study of European Countries and Ukraine)” – This comparative analysis examined modern approaches to researcher evaluation, advocating for qualitative indicators as alternatives to traditional quantitative metrics.
  • Kullmann, S., Weimer, V. (2024) – “Teaching as part of open scholarship: developing a scientometric framework for Open Educational Resources” – This innovative work proposed frameworks for recognising teaching contributions through openly available educational materials, addressing the frequent undervaluation of teaching in academic assessment.
  • Malgorzata Lagisz et al. (2024) – “‘Best Paper’ awards lack transparency, inclusivity, and support for Open Science” – This analysis critically examined academic award systems, demonstrating how current recognition mechanisms fail to encourage transparency and equitable access to academic recognition.
  • Mabile, L. et al. (2024) – “Recommendations on Open Science Rewards and Incentives: Guidance for multiple stakeholders in Research” – Developed by the RDA-SHARC interest group, this publication provided recommendations for implementing open science reward schemes at multiple organisational levels.
  • Shmagun H. et al. (2024) – “Survey-Based Evaluation of Open Science Factors in Korea’s R&D Ecosystem” – This Korean case study identified traditional research evaluation systems and inadequate funder policies as primary barriers to open science implementation.
  • Shmagun, H. et al. (2023) – “Identifying key factors and actions: Initial steps in the Open Science Policy Design and Implementation Process” – This foundational work outlined systematic approaches to open science policy development and implementation processes.

2. Precarity of Research Careers and Open Science: The Missing Connection

The exploration of relationships between research career precarity and open science adoption represented one of the most challenging aspects of the literature review, revealing significant gaps in academic understanding of these interconnected issues. Despite the theoretical importance of this relationship, the review uncovered remarkably limited empirical evidence addressing whether open science practices influence career security or whether precarious employment conditions affect open science adoption.

The research team employed the same systematic methodology used in the original OPUS landscaping to ensure full coherence and comparability of results. Search combinations included variations of “precarity”, “precarious”, and “precariousness” paired with research-related terms and combined with open science terminology. The team also consulted with the SECURE project (Sustainable Careers for Researcher Empowerment), a sister initiative focused specifically on research careers and researcher empowerment.

The search strategy proved particularly challenging, yielding only three articles published since the 2023 cutoff date. Upon detailed analysis, none of these publications demonstrated a direct or indirect focus on the relationship between precarity and open science, highlighting the persistent gap in academic attention to this intersection.

Key Findings

The updated literature review unfortunately confirmed the initial OPUS finding that academic discourse lacks substantive evidence addressing the bidirectional relationship between career precarity and open science adoption. This absence of evidence proved particularly significant given the theoretical importance of the relationship and its potential implications for open science policy development.

The SECURE project’s literature review, whilst providing valuable context on researcher precarity more broadly, did not specifically address open science dimensions due to different project focuses. However, their work confirmed the persistent nature of academic career insecurity across EU countries, documenting how many PhD candidates lack proper employment contracts and social protection, and how early-career researchers frequently move between fixed-term positions without clear prospects for permanent employment.

The limited available literature suggested several factors contributing to research career precarity, including high intrinsic motivation among researchers, extended training periods, insufficient research funding, poor economic conditions, problematic career structures within higher education systems, competitive environments, and mismatches between researcher supply and institutional demand. Notably, open science practices were not explicitly mentioned among these contributing factors.

One significant finding emerged from the examination of responsibility dynamics in open science implementation. A 2025 study explored how researchers frame ethical considerations when implementing open science policies, identifying internal, social/societal, and collective responsibility orientations. However, this work did not extend to questions of career security or precarity.

The literature review identified a potential indirect connection through the broader discussion of academic culture and assessment systems. Current academic priorities that favour traditional research outputs over open science principles of transparency and reproducibility could theoretically disadvantage researchers who invest time in open science practices when competing for secure positions or tenure track appointments. However, no studies provided empirical evidence to support or refute this hypothesis.

Literature Related to Precarity of Research Careers and Open Science

The following publications informed this analysis:

  • Erika Lilja (2025) – “Responsibility dynamics in open science: exploring researchers’ framing of ethical considerations in policy implementation” – This study examined how researchers conceptualise responsibility when implementing open science policies, identifying different ethical orientations but not addressing career security implications.
  • Janger, Jürgen, et al. (2025) – “SECURE, Deliverable 5.3: Policy Brief Promoting SECURE Research Career Framework” – This policy brief from the SECURE project documented persistent challenges with precarious research careers across EU countries, confirming widespread employment insecurity issues.
  • O’Keefe T., Courtois A. (2024) – “Holding Ourselves to Account: The Precarity Dividend and the Ethics of Researching Academic Precarity” – This methodological reflection examined ethical considerations in researching academic precarity without specifically addressing open science dimensions.
  • Siler K. (2024) – “Gerontocracy, Labor Market Bottlenecks, and Generational Crises in Modern Science” – This analysis of generational dynamics in academic careers provided context for understanding career progression challenges but did not connect these issues to open science practices.
  • Lebhardt, Fleur, et al. (2023) – “SECURE Deliverable 1.1: State-of-the-Art on Research Career Frameworks” – This review of research career frameworks provided foundational understanding of career progression models without specific open science focus.
  • Day, Emma, et al. (2023) – “SECURE Deliverable 1.2: State-of-the-Art on Tenure Track-Like Models” – This analysis of tenure track models across different contexts informed understanding of career security mechanisms.
  • Veronika P., Katalin T. (2023) – “‘Why Can’t Researchers Sign a Contract of Indefinite Duration?’ Career Insecurity of Female Researchers in STEM Fields” – This study examined gendered dimensions of career precarity in STEM fields, adding intersectional analysis to precarity discussions.
  • Jürgen Janger, et al. (2022) – “Precarious Careers in Research. Analysis and Policy Options” – This WIFO Studies report mapped employment contracts and career models to identify which researcher groups face the greatest career insecurity.

3. Gender Equality and Open Science: Complex Intersections and Persistent Questions

The examination of relationships between gender equality and open science revealed nuanced and context-dependent connections that resist simple categorisation. This area of inquiry demanded careful attention to intersectional factors, cultural variations, and the complex ways in which open science initiatives might either advance or inadvertently perpetuate existing inequalities within research systems.

The literature review employed a search strategy that encompassed multiple dimensions of gender and equality terminology. Search combinations included variations of “gender”, “sex”, “gender equality”, “gendered”, “engendered”, “women/men”, “male/female”, and “masculine/feminine” paired with research, academic, and scientific terms. This approach aimed to capture the full spectrum of gender-related discourse within open science literature.

The search yielded eleven articles, though detailed analysis revealed that only one demonstrated direct relevance to open science questions. The team supplemented this limited SCOPUS result through Google Scholar investigations and drew upon extensive work conducted during the OPUS project, particularly the inclusion of gender as a horizontal intervention across all pilot sites in Work Package 4.

Key Findings

The updated literature review confirmed the original OPUS finding that whilst open science and gender equality are clearly interrelated, the nature of this relationship remains complex, contextual, and often non-causal. The evidence base continued to resist simple conclusions about whether open science practices predictably advance gender equality or whether gender equality initiatives necessarily promote open science adoption.

A significant 2024 Colombian study involving 2,331 researchers (61% men, 39% women) provided valuable empirical insights into researcher perceptions of open science and gender equity relationships. The majority of respondents recognised that openness in scientific processes would significantly contribute to equity, and the study found no significant gender differences in evaluating open science importance or utilisation across research stages. However, women respondents cited lack of training as the greatest obstacle to adopting open science practices, suggesting that capacity-building approaches must consider gendered barriers to participation.

The literature confirmed the continuing relevance of sociocultural context in shaping gender-open science relationships. As noted in the Colombian study, “sociocultural context and economic and personal circumstances cloud the picture”, making it difficult to isolate the relationship between gender and open science from other contributing or obstructing factors. This finding aligned with OPUS pilot organisation experiences, which revealed that discussions of gender and open science inevitably encompassed broader issues including work-life balance, training and capacity building, awareness of unconscious bias, and development of intersectional policies.

Feminist research perspectives provided critical insights into potential tensions within open science movements. A 2023 analysis examining knowledge equity and open science from feminist viewpoints raised important concerns about whether feminist and qualitative research methods might be undermined by open science movements that prioritise quantitative methods and data reproducibility. The authors highlighted particular barriers facing feminist early-career researchers, including academic precarity and male-dominated community structures.

Despite these concerns, the feminist analysis identified a growing number of early-career feminist researchers who actively claim participation in the open science movement, viewing it as providing “openness, ideology, and impact for changes in academia and science”. This suggests that whilst tensions exist, there are also opportunities for feminist researchers to shape open science development in more inclusive directions.

The OPUS project’s practical work with pilot organisations reinforced the literature’s emphasis on intersectionality and context-sensitivity. Pilot partners highlighted the need for more granular, disaggregated data collection on gender and open science practices, extending beyond binary gender categories to include dimensions such as researcher position, race, and geographic location. Such data would enable more precise identification of intersectional inequalities and support the development of targeted interventions to address bias in research assessment and monitoring.

Literature Related to Gender Equality and Open Science

The following publications informed this analysis:

  • Vallejo Sierra RH (2024) – “Open Science and Gender Equity: What Colombian Researchers Say” – This empirical study surveyed 2,331 Colombian researchers to examine perceptions of relationships between open science and gender equity, providing valuable quantitative insights into researcher attitudes and identifying training as a key barrier for women.
  • Ferguson, J., Littman, R., Christensen, G. et al. (2023) – “Survey of open science practices and attitudes in the social sciences” – This survey explicitly excluded gender from analysis, with authors stating they lacked evidence that gender would be a major predictor of open science involvement, highlighting gaps in research assumptions.
  • Bertram, Michael G. et al. (2023) – “Open Science” – This Current Biology article acknowledged that barriers to open science are interconnected and can depend on researcher career stage, stability, gender, and ethnic identity, though without developing these concepts in detail.
  • Kruschick, F.; Schoch, K. (2023) – “Knowledge equity and open science: An attempt to outline the field from a feminist research perspective” – This theoretical analysis examined intersectional feminist research principles in relation to open science, raising concerns about potential undermining of feminist methods while advocating for feminist engagement with open science movements.

4. Industry Practices and Open Science: Bridging Academic and Commercial Worlds

The exploration of industry practices in relation to open science revealed a complex landscape where traditional academic openness intersects with commercial imperatives, intellectual property concerns, and competitive dynamics. This domain required careful attention to the distinctions between open science and open innovation, whilst examining how industry engagement with openness varies significantly across sectors, organisational contexts, and strategic objectives.

The literature review employed search strategies that encompassed both “Open Science” and “Open Innovation” terminology, recognising that whilst open science is predominantly associated with academia, its principles are increasingly explored in industry, particularly in research-intensive sectors such as pharmaceuticals, biotechnology, and technology. The team used multiple search combinations pairing openness concepts with industry-related terms including “industry”, “business”, “SMEs”, “enterprise”, “corporate”, “firm”, and “commerce”.

The SCOPUS search yielded five articles, of which four provided relevant content. The team supplemented this with Google Scholar investigations and drew extensively upon OPUS project work, including common interventions implemented across pilot organisations and analysis conducted by project partners working directly with industry networks.

Key Findings

The updated literature review revealed an evolving understanding of how industry engagement with open science and open innovation is developing, characterised by selective adoption, strategic implementation, and careful balance between openness and competitive protection. The analysis confirmed that industry adoption of open practices remains largely project-specific or experimental, often managed by particular departments or teams rather than representing organisation-wide commitments.

A 2023 study examining university-business collaboration under open science conditions provided important insights into the transformation of traditional knowledge transfer processes. The research identified intellectual property management and goal alignment between academic and commercial partners as persistent challenges, whilst highlighting the role of government policy in facilitating or hindering effective collaboration. The study emphasised that resolving process-related problems around knowledge exchange and intellectual rights could significantly improve trust levels and facilitate more effective knowledge sharing.

Innovative approaches to open innovation emerged from 2025 research examining data sharing in large, research-intensive firms. This work proposed that sharing unstructured data could represent a new pathway for competitive advantage through open innovation, challenging traditional assumptions about data as purely proprietary assets. The study developed a three-stage business model to support data sharing across different phases of open innovation, though the authors acknowledged limitations in terms of sample size and sector-specificity.

The concept of technology centres as intermediary organisations gained attention as a potential model for facilitating industry-academia collaboration. A 2024 Spanish analysis explored how technology centres could serve as cooperative partners within the triple helix model of university-industry-government interaction, suggesting that such intermediaries might be crucial for scaling open science practices in business contexts.

The OPUS project’s practical work with industry networks revealed several key patterns in industry engagement with open science and open innovation. Sectors with high research intensity, those addressing global challenges, and industries subject to regulatory encouragement showed the greatest adoption of open practices. Companies such as AstraZeneca, Novartis, and Johnson & Johnson in pharmaceuticals, Google DeepMind, Microsoft, and IBM in technology, and Enel, Siemens, and Ørsted in energy emerged as leaders in transparency, data sharing, and open collaboration.

The analysis identified both opportunities and challenges for industry engagement with open science. Opportunities included access to external expertise and resources, faster breakthrough innovations, improved market fit, risk reduction, talent attraction and retention, enhanced brand reputation, first-mover advantages, and knowledge transfer benefits. Challenges encompassed knowledge barriers related to intellectual property protection, data sensitivity and security concerns, collaboration barriers including trust issues and cultural differences, organisational resistance to change, financial and strategic implementation barriers, and difficulties in measuring return on investment.

Literature Related to Industry Practices and Open Science

The following publications informed this analysis:

  • Nikolai Kazantsev, et al. (2025) – “Leveraging Unstructured Data Sharing in Open Innovation: A Business Model for Large Research-Intensive Firms” – This innovative study examined how large firms could gain competitive advantage through data sharing in open innovation contexts, proposing new business models for balancing openness with commercial objectives.
  • María Núñez-Romero, et al. (2024) – “Cooperación en innovación con centros tecnológicos. Enfoque teórico” (“Cooperation in innovation with technology centres: a theoretical approach”) – This Spanish-language analysis explored innovation cooperation with technology centres, examining how intermediary organisations could facilitate university-industry collaboration within triple helix models.
  • Lukianenko DH, et al. (2023) – “University Competitiveness in the Knowledge Economy: A Kohonen Map Approach” – This conference paper examined factors of university competitiveness, proposing that open science, open innovation, and open education should guide innovation management within universities.
  • Zhanna V. Gornostaeva, et al. (2023) – “The Model of the Behaviour of Market Players in the Collaboration of Universities and Business Structures for the Support of the Digital Economy” – This analysis compared traditional and open science approaches to university-business collaboration, identifying key challenges around intellectual property and goal alignment.
  • Gao, Y., and M. Janssen (2023) – “The Open Data Canvas – Analysing Value Creation From Open Data” – This work provided frameworks for analysing value creation through open data initiatives, offering practical tools for business implementation.
  • Widder, David Gray, et al. (2023) – “Open (For Business): Big Tech, Concentrated Power, and the Political Economy of Open AI” – This critical analysis examined the complexity of open AI landscapes, warning about potential corporate co-optation of openness rhetoric whilst highlighting both opportunities and challenges for democratic access to AI systems.

5. Trust and Open Science: Building Confidence in Scientific Enterprise

The examination of trust in relation to open science revealed both the fundamental importance of trust for scientific credibility and the complex ways in which openness initiatives might enhance or challenge existing trust relationships. This analysis required attention to multiple dimensions of trust: public trust in science and scientists, trust among researchers and institutions, and trust in the processes and infrastructure supporting open science implementation.

The literature review employed targeted search strategies focusing on explicit connections between trust concepts and open science terminology. Search combinations included “open research” and “open science” paired with “trust in science”, “mistrust in science”, and “conflict of interest” terms. The team also examined broader combinations of “research” and “science” with trust-related terminology to capture relevant discourse that might not explicitly use open science language.

The Scopus search yielded nine articles, though only three provided content directly relevant to OPUS objectives, with the others focusing primarily on trust and vaccines rather than broader open science questions. The team supplemented this with Google Scholar investigations and conducted searches through open-access book resources to access additional relevant literature.

Key Findings

The updated literature review provided valuable new evidence supporting the hypothesis that public trust can be enhanced through scientific openness, whilst continuing to reveal limited research on trust dynamics among scientists in relation to open science practices. The evidence base suggested that whilst theoretical connections between trust and openness are intuitive, empirical research demonstrating these relationships remains limited.

A significant 2024 international assessment examining trust in scientists across 67 countries provided insights into public perceptions of scientific trustworthiness. The study found that whilst most people in most countries trusted scientists and supported greater scientist engagement in society and policymaking, public perceptions of scientific openness were comparatively lower than perceptions of competence, benevolence, and integrity. This finding suggested that scientists seeking to enhance public trust should focus particularly on transparency regarding funding sources, data availability, and communication practices.

Multiple studies supported the proposition that open science practices could address trust challenges across different domains. A 2023 analysis of open science for the 21st century found that open practices “have the potential to address many of the problems of inequity, inaccuracy, and misconduct that plague research, as well as to build public trust”. Similarly, 2024 public health research demonstrated that science transparency and community engagement could address dual challenges of misinformation and eroded public trust, crucial for effective public health responses.

Disciplinary applications provided concrete examples of how open science might enhance trust relationships. A 2024 language testing study argued that open science could support trust by providing society with evidence from peer-reviewed validation studies demonstrating that test scores are useful, accurate, and fair. The analysis suggested that open science could foster equity by making research on language learning and testing available to all researchers regardless of ability to pay.

Critical analysis of academic publishing systems revealed how current practices might undermine trust relationships. A 2025 study examining peer review processes argued that confidentiality associated with traditional peer review creates “an opaque smoke-screen that prevents direct scrutiny of peer reviewers or editorial handling, and may be a contributing factor to the current crisis of mistrust in science and academic publishing”. The authors advocated for more open and transparent peer review processes as mechanisms for rebuilding trust.

Science communication emerged as a significant domain for understanding trust-openness relationships. A 2025 analysis of science communication and trust argued that “amid ongoing global challenges such as health pandemics, climate change and environmental degradation, a nuanced discussion of the relational and contextual dynamics of trust has arguably never been more critical”. The work emphasised that trust must be “continually earned and sustained through credible, ethical and responsive communication practices”.

The OPUS project’s practical exploration of trust concepts through pilot organisation interventions revealed important insights into researcher perceptions of open science trustworthiness. The interventions confirmed concerns about open data protocols identified in the literature, but also revealed unexpected concerns about institutional open science policy frameworks, which researchers perceived as “too abstract, too top-down, and too often disconnected from the reality of scientific activity at the researcher level”.

Literature Related to Trust and Open Science

The following publications informed this analysis:

  • Fage-Butler A., Ledderer L., Nielsen KH (Editors) (2025) – “Science Communication and Trust” – This edited volume examined relational and contextual dynamics of trust in science communication, emphasising the need for continual trust-building through credible, ethical, and responsive communication practices.
  • Teixeira da Silva JA, Daly T (2025) – “No Reward without Responsibility: Focus on Peer Review Reports” – This analysis critiqued confidentiality in traditional peer review processes, arguing that opacity contributes to mistrust in science and academic publishing whilst advocating for more transparent review mechanisms.
  • Bakhareva Y, et al. (2024) – “Issues of Trust and Motivation in Biobanking: A Literature Review” – This systematic review examined trust dynamics in biobanking contexts, providing insights into how trust operates in data-intensive research environments.
  • Cologna V., et al. (2024) – “Trust in scientists and their role in society across 67 countries” – This major international study assessed public trust in scientists globally, finding generally high trust levels but identifying scientific openness as an area requiring improvement for trust enhancement.
  • Kazantsev N., Islam N., et al. (2023) – “Data sharing for business model innovation in platform ecosystems: From private data to public good” – This analysis examined how data-sharing in platform ecosystems affects business model innovation, exploring trust dynamics in commercial open data contexts.
  • Widder DG, et al. (2023) – “Open (For Business): Big Tech, Concentrated Power, and the Political Economy of Open AI” – This critical examination of open AI systems explored trust implications of corporate openness claims, warning about potential co-optation of openness rhetoric for competitive advantage.

Emerging Trends and Persistent Gaps in Open Science Assessment Reform

The overarching aim of the OPUS literature review was to systematically examine academic publications released since the project’s initial landscaping exercise to identify new evidence, emerging trends, and persistent gaps that could inform the project’s central questions about reforming research assessment systems to incentivise and reward open science practices. This objective encompassed both the project’s primary focus on research assessment transformation and its investigation of five interconnected dimensions that influence open science implementation.

The review sought to answer whether recent literature provided updated input to several critical questions being addressed within the OPUS project. For research assessment and incentives, the central question concerned what incentives and rewards exist or are needed to encourage open science adoption. For precarity, the inquiry focused on whether open science practices positively or negatively impact career security and whether precarious employment conditions facilitate or hinder open science adoption. Gender equality investigations examined whether recent evidence demonstrated clearer relationships between open science and gender equity advancement. Industry practice analysis explored benefits, challenges, and implementation strategies for open science in commercial contexts. Trust investigations sought evidence of connections between open science and various dimensions of trust in scientific enterprise.

The literature review revealed varying degrees of academic attention across these domains. Research assessment and incentives generated substantial new scholarship, confirming persistent problems with traditional metrics-based systems and providing empirical evidence about researcher preferences for qualitative assessment approaches supplemented by quantitative measures. This area produced actionable insights for policy development, including the finding that researchers favour the inclusion of open science indicators in evaluation processes over purely financial incentives.

In contrast, the precarity-open science relationship remained virtually unexplored in recent academic literature, with the review identifying critical gaps in understanding how career insecurity and open science practices interact. This absence of evidence proved particularly significant given the theoretical importance of the relationship and its potential implications for open science policy effectiveness.

Gender equality analysis revealed continued complexity and context-dependency in open science-gender relationships, with evidence suggesting that whilst open science principles align with equity values, implementation contexts significantly influence whether these principles translate into gender equality advancement. The literature confirmed the need for intersectional approaches and careful attention to power dynamics within open science initiatives.

Industry practices generated moderate new scholarship focused primarily on specific sectors and collaborative mechanisms, revealing selective and strategic adoption patterns rather than wholesale industry transformation. Trust analysis provided valuable new evidence supporting public trust enhancement through scientific openness, though research on trust among scientists remained limited.

Research Assessment Reform Holds Centre Stage in Global Academic and Policy Debates

The OPUS Work Package 1 literature review provided insights into the evolving academic discourse around open science implementation, revealing both areas of scholarly attention and persistent gaps requiring further research investment. The analysis across five interconnected domains demonstrated the complexity of open science transformation whilst identifying concrete evidence to support policy development and practical intervention design.

The review confirmed that research assessment reform remains at the forefront of academic and policy debate, with substantial new evidence supporting the need for systems that better recognise and reward open science practices. The finding that researchers prefer qualitative assessment approaches supplemented by quantitative measures provides important guidance for institutions seeking to implement assessment reforms. Similarly, the identification of researcher preferences for policy-based incentives over purely financial rewards offers valuable insights for funding organisations and institutional leaders.

However, the literature review also highlighted significant gaps in academic understanding, particularly around the relationships between career precarity and open science adoption. This absence of evidence represents a critical vulnerability in open science policy development, given the potential importance of career security considerations in researcher decision-making about open science participation. The limited attention to these questions suggests an urgent need for longitudinal studies examining how career insecurity influences open science adoption and whether open science practices affect career advancement prospects.

Gender equality analysis revealed the continuing complexity of intersectional relationships within open science contexts. Whilst the literature confirmed theoretical alignments between open science principles and gender equity values, the evidence demonstrated that implementation contexts, power dynamics, and cultural factors significantly influence whether these principles translate into practical equality advancement. This finding emphasises the need for careful, context-sensitive approaches to gender equality integration within open science initiatives.

Industry engagement with open science showed selective adoption patterns characterised by strategic implementation rather than wholesale transformation. The identification of leading sectors and organisations provides valuable models for broader industry engagement, whilst the analysis of opportunities and challenges offers practical guidance for organisations considering open science adoption. The emergence of data sharing and open innovation models suggests evolving approaches to balancing openness with competitive imperatives.

Trust analysis provided encouraging evidence that open science practices can enhance public confidence in scientific enterprise, whilst revealing limited research on trust dynamics among researchers themselves. The finding that public perceptions of scientific openness lag behind other trustworthiness dimensions suggests specific areas for improvement in scientific communication and transparency practices.

The OPUS literature review ultimately demonstrated that whilst academic discourse around open science continues to evolve, significant research gaps persist in understanding the complex interactions between open science implementation and broader systemic factors affecting research culture. These gaps highlight opportunities for future research investment whilst providing a foundation for the practical interventions and metrics development that formed the subsequent phases of the OPUS project.

The systematic approach employed throughout this literature review ensured consistency with earlier project work whilst adapting to the evolving academic landscape. This methodological rigour enabled the project to trace academic discourse development over time, identifying both emerging consensus around key issues and persistent areas of uncertainty requiring further investigation. The resulting evidence base provided input to the project’s practical outputs whilst highlighting directions for future research and policy development in the critical domain of open science implementation.

Mapping the Future of Open Science (OS): The Complex Landscape of OS Initiatives

The transformation of scientific research towards open practices represents one of the most significant shifts in academic culture of our time. The Open and Universal Science (OPUS) project, a three-year initiative funded under the Horizon Europe Programme, has systematically mapped and analysed the evolving landscape of open science initiatives across Europe and beyond. This article explores the project’s extensive findings, revealing how open science principles are becoming increasingly intertwined with the ongoing reform of research assessment systems.

Understanding OPUS

The OPUS project, officially titled “Open and Universal Science,” ran from September 2022 to August 2025 with a clear mission: to contribute to a more inclusive, transparent, and impactful system for open science and research assessment. The project’s fundamental purpose was to move beyond traditional reliance on metrics such as journal impact factors and instead promote systems that reward open practices, enhance reproducibility, and ensure research becomes more accessible and relevant to society.

At its core, OPUS sought to bridge the gap between open science advocacy and practical implementation. The project recognised that whilst the principles of open science had gained widespread acceptance, there remained significant challenges in translating these principles into concrete rewards and incentives within research institutions and funding organisations. This challenge formed the backbone of the project’s comprehensive landscaping exercise, which aimed to identify, analyse, and connect the various stakeholders, initiatives, and frameworks operating in this space.

The project’s strategic approach was built around several key work packages, with Deliverable 1.3 representing an updated state-of-the-art analysis that built upon earlier foundational work. This deliverable served multiple purposes: documenting the evolution of the open science landscape since the project’s inception, identifying new developments and emerging trends, and providing practical input for subsequent project activities.

The Landscaping Methodology: A Systematic Approach to Mapping Open Science

OPUS employed a rigorous and systematic methodology to map the complex landscape of open science initiatives. The project’s approach was multi-faceted, focusing on four key areas: framework projects funded by European programmes, individual experts working in open science, networks and organisations promoting open science practices, and specific schemes designed to incentivise open science adoption.

The initial landscaping exercise, completed in Deliverable 1.1, established the foundation for ongoing analysis. This involved extensive desk research, direct contact with project coordinators and network leaders, and careful analysis of project deliverables and publications. The methodology was particularly thorough in its approach to project selection, focusing on initiatives funded through Horizon 2020 and Horizon Europe programmes that demonstrated direct relevance to open science and research assessment reform.

For the updated analysis presented in Deliverable 1.3, the project adopted a dual approach. First, it tracked the progress and outputs of previously identified projects, analysing new deliverables and developments. Second, it identified and analysed new projects funded since the original landscaping exercise. This approach ensured that the analysis remained current and comprehensive, capturing the dynamic nature of the open science ecosystem.

Framework Projects: The Building Blocks of Open Science Infrastructure

The OPUS landscaping exercise identified and analysed a substantial number of framework projects, each contributing unique elements to the broader open science ecosystem. These projects can be broadly categorised into several thematic areas, each addressing different aspects of the open science challenge.

Widening Participation and Strengthening the European Research Area

The PathOS Project (Open Science Impact Pathways) represents a crucial initiative in quantifying and understanding the impact of open science practices. This project has produced significant outputs including comprehensive scoping reviews of models, evidence, correlations, and causalities in open science impact. PathOS has developed a handbook of open science indicators and created methodologies for reproducibility indicators, providing essential tools for measuring the effectiveness of open science practices. The project’s work on case studies for evaluation of open science impact has been particularly valuable in demonstrating real-world applications of open science principles.

The WorldFAIR Project (Global cooperation on FAIR data policy and practice) focuses on the critical area of data management and sharing. This initiative has produced demonstrations of FAIR data exchange with independent partners and has worked extensively on cross-national social sciences survey FAIR implementation case studies. The project’s policy briefs have provided valuable insights for policymakers seeking to implement FAIR data principles at national and international levels.

The DIAMAS Project (Developing Institutional open Access publishing Models to Advance Scholarly communication) addresses the crucial area of open access publishing infrastructure. This project has conducted comprehensive landscape analyses of Institutional Publishing Service Providers (IPSPs) across Europe, developing taxonomies and best practices for institutional publishing. The project’s work on the Extensible Quality Standard in Institutional Publishing (EQSIP) represents a significant contribution to ensuring quality in institutional publishing services.

The IANUS Project (INspiring and ANchoring TrUst in Science) tackles the fundamental issue of societal trust in science and research. Through comprehensive analysis of research and policy landscapes, IANUS has produced digital handouts with best practices and recommendations for aligning research efforts with societal needs and concerns. The project’s policy briefs for policy recommendations provide practical guidance for enhancing public trust in scientific endeavours.

The VERITY Project (“deVEloping scientific Research with ethIcs and integrity”) focuses specifically on the ethical dimensions of open science. Through extensive literature reviews on public trust in science and analysis of strategies and tools to tackle societal mistrust, VERITY has contributed significantly to understanding the relationship between research integrity and open science practices.

Research Infrastructure and EOSC Development

The Skills4EOSC Project represents a comprehensive approach to building digital skills for the European Open Science Cloud. This project has developed extensive training materials, including curricula for data stewards, recognition frameworks for trainers, and learning paths for various scientific communities. The project’s work on FAIR-by-design training resources and its development of competence centre support materials demonstrate practical approaches to building open science capacity.

The GraspOS Project (next Generation Research Assessment to Promote Open Science) directly addresses the intersection of open science and research assessment. Through landscape analyses of open science-aware research assessment approaches and infrastructure architecture development, GraspOS has provided crucial insights into how assessment systems can better recognise and reward open science practices.

The FAIRCORE4EOSC Project focuses on core components supporting FAIR data principles within EOSC. The project’s development of compliance assessment toolkits provides practical tools for institutions seeking to implement FAIR data principles effectively.

The FAIR-IMPACT Project (Expanding FAIR Solutions across EOSC) works on enabling discovery and interoperability of federated research objects. Through targeted landscape analyses and development of metrics for automated FAIR assessment, this project provides essential infrastructure for scaling FAIR practices across scientific communities.

Networks and Organisations: The Collaborative Foundation

The OPUS analysis identified numerous networks and organisations that serve as crucial nodes in the open science ecosystem. These entities provide coordination, advocacy, and practical support for open science implementation.

International and Global Networks

UNESCO’s Working Group on the Open Science Monitoring Framework represents perhaps the most significant global initiative in open science coordination. Following the adoption of the UNESCO Recommendation on Open Science in November 2021, this working group has continued to develop frameworks for monitoring open science implementation globally. The group’s work on draft principles for open science monitoring, which underwent global consultation in 2024, represents a significant step towards establishing common approaches to measuring open science progress.

The Coalition for Advancing Research Assessment (CoARA) has emerged as a central force in research assessment reform. Since its establishment, CoARA has grown significantly, with member organisations developing and implementing action plans based on core commitments to reform research assessment practices. The coalition’s work on the Blueprint for Open, Connected, and Responsible Research Assessment provides a comprehensive framework for institutions seeking to transform their assessment practices.

European Networks and Initiatives

Science Europe’s Open Science Working Group continues to serve as a crucial voice in European open science policy. The organisation’s dual focus on open science and research assessment reform, coordinated through their Research Culture working group, demonstrates the interconnected nature of these areas. Their 2024 report on “Strategic Approaches to and Research Assessment of Open Science” provides valuable insights into how European funding and performing organisations are navigating the transition to open science-aware assessment practices.

The European University Association (EUA) Working Group on Science 2.0 and Open Science has maintained its commitment to supporting open science through the development of comprehensive policy frameworks. The EUA Open Science Agenda 2025 identifies three key priority areas: universal and perpetual open access to scholarly outputs, FAIR research data, and institutional approaches to research assessment, clearly demonstrating the integrated approach needed for effective open science implementation.

The EOSC Association’s Research Careers, Recognition and Credit Task Force addresses one of the most challenging aspects of open science implementation: ensuring that researchers are properly recognised and rewarded for open science practices. Their publications, including position papers on EOSC as an enabler of research assessment reform, provide crucial insights into how infrastructure development can support cultural change in research assessment.

Specialised Networks and Working Groups

The Guild Research Careers & Assessment Working Group brings together senior representatives from leading European universities to discuss career and assessment frameworks. Their open letters and statements on European Commission policy initiatives demonstrate active engagement with policy development processes.

CERN’s Open Science Working Group, whilst formally dissolved in 2023 following completion of its mandate to establish a CERN Open Science Policy, represents an important model for institutional open science governance. CERN’s early signing of the CoARA agreement and development of an institutional action plan demonstrates how major research organisations can lead by example in assessment reform.

OpenAIRE and National Open Access Desks continue to provide crucial infrastructure and support for open science implementation across Europe. OpenAIRE’s action plan supporting CoARA principles and their comprehensive strategy for 2023-2025 demonstrate sustained commitment to advancing research assessment practices aligned with open science values.

Schemes and Initiatives: Practical Implementation Tools

The OPUS analysis also identified various schemes and initiatives designed to provide practical tools and frameworks for implementing open science practices. These range from policy frameworks to technical standards and assessment tools.

The Barcelona Declaration on Open Research Information represents a significant commitment by research organisations to transform how research information is produced and used. With 163 signatories representing universities, funding organisations, and infrastructure providers, this declaration commits organisations to making openness of research information the default, working with systems that support open research information, and collaborating to achieve the transition from closed to open research information systems.

The Open Science Monitoring Initiative (OSMI) provides a coordinated approach to monitoring open science implementation globally. Through working groups covering various aspects of open science monitoring, from scoping needs to technical specifications, OSMI is developing shared resources and infrastructure for analysing scholarly outputs and building capacity for open science monitoring.

The OPUS Expert Group: A Global Community of Practice

One of OPUS’s most significant achievements was the establishment and growth of a comprehensive expert group that provides insights into the global open science community. Starting with 46 participants at the time of Deliverable 1.1, the expert group grew to 142 members by the project’s conclusion, representing a diverse and highly experienced community of open science practitioners.

Geographic and Institutional Diversity

The expert group demonstrates remarkable global reach, with participants representing 36 different countries. While 61% of participants work in EU member states, spanning 16 different countries (with the Netherlands particularly well represented), the group also includes substantial representation from the UK and from countries across other continents. This geographic diversity ensures that the group’s insights reflect varied institutional contexts and policy environments.

The institutional composition of the expert group is heavily weighted towards public research-performing organisations, which represent 70% of participants. This concentration reflects the primary focus on institutional change within universities and research institutes, though the group also includes representatives from funding organisations, government bodies, and service providers.

Expertise and Experience Levels

The expert group demonstrates exceptionally high levels of expertise, with 66% of researcher participants classified as either R3 (Established Researcher) or R4 (Leading Researcher) levels. This senior representation ensures that the group’s insights are grounded in substantial experience and institutional influence.

Perhaps most significantly, the group shows remarkable depth of experience in open science, with only 17% having worked in the field for fewer than two years, while 29% have over ten years of experience. This distribution indicates that the expert group represents not just early adopters but seasoned practitioners who have witnessed and shaped the evolution of open science practices over time.

Specialisation Areas and Research Assessment

The specialisation patterns within the expert group reveal important insights about the open science community. Open access to publications remains the most common specialisation, selected by 86 participants, followed by FAIR data management and open data management. However, it is noteworthy that very few participants selected research assessment reform as their primary specialisation, despite this being a central focus of the OPUS project. This finding suggests that research assessment reform remains a specialised area that requires dedicated expertise and attention.

The expert group served multiple functions throughout the project, including providing feedback on project developments, participating in stakeholder workshops, and contributing to consultations on the OPUS Research Assessment Framework. Their continued engagement demonstrates the value of building and maintaining expert communities around open science initiatives.

The Interconnection Between Open Science and Research Assessment Reform

One of OPUS’s most significant contributions lies in documenting and analysing the increasingly close relationship between open science advancement and research assessment reform. This relationship operates on multiple levels and represents a fundamental shift in how research value is conceptualised and measured.

Policy Integration

At the policy level, European initiatives consistently link open science and research assessment reform as complementary objectives. The European Research Area Policy Agenda for 2025-2027 explicitly identifies both open science and research assessment reform as structural policies requiring long-term commitment. This integration reflects recognition that sustainable open science implementation requires fundamental changes to how researchers and institutions are evaluated and rewarded.

The Council Recommendation on research careers similarly emphasises the need for assessment reforms that ensure quality and equality while supporting open science practices. This policy alignment creates a framework where open science implementation and assessment reform can reinforce each other rather than operating as separate initiatives.

Institutional Implementation Challenges

OPUS’s analysis reveals that whilst policy support for linking open science and assessment reform is strong, institutional implementation remains challenging. The varying approaches documented across European University Alliances demonstrate that institutions interpret and implement these connections differently, often reflecting different organisational cultures, resource allocations, and strategic priorities.

The CoARA action plans provide valuable insights into how institutions are attempting to operationalise these connections. These plans reveal diverse approaches to embedding open science considerations in assessment procedures, from developing new criteria for open science activities to implementing qualitative assessment methods that can better capture open science contributions.

Technical and Infrastructure Dimensions

The relationship between open science and assessment reform also operates at technical and infrastructure levels. Projects like GraspOS demonstrate how new technical infrastructure can support assessment practices that better recognise open science contributions. Similarly, initiatives like OpenAIRE provide essential infrastructure for making research outputs discoverable and assessable in ways that support both open science goals and improved assessment practices.

The development of new metrics and indicators, documented through projects like PathOS, provides tools that can support assessment systems in recognising and rewarding open science practices. However, OPUS’s analysis also reveals ongoing challenges in developing metrics that genuinely capture the value of open science contributions without creating new forms of gaming or inappropriate quantification.

Implications for Current and Future Projects

The OPUS deliverable provides valuable insights and resources that can inform both current and future projects working at the intersection of open science and research assessment. These implications operate across multiple dimensions and stakeholder groups.

For Project Coordinators and Research Teams

Project coordinators can use the OPUS landscape analysis to identify potential collaborators, understand the broader ecosystem of initiatives, and position their work within established frameworks. The detailed project descriptions and analysis of outputs provide practical insights into what has worked well and where challenges remain.

The expert group database represents a valuable resource for projects seeking experienced advisors or partners. The geographic and expertise diversity of this group makes it particularly valuable for projects requiring international perspectives or specialised knowledge in specific open science domains.

For Funding Organisations

Funding organisations can use the OPUS analysis to understand the maturation of the open science field and identify areas where continued investment is needed. The documentation of project outputs and their impacts provides evidence for the value of sustained funding in this area.

The analysis of networks and organisations also reveals the importance of supporting coordination and community-building activities alongside specific research projects. The growth and impact of initiatives like CoARA demonstrate how relatively modest investments in coordination can generate significant multiplier effects across the research system.

For Institutions Implementing Assessment Reform

Perhaps most importantly, the OPUS deliverable provides practical resources for institutions seeking to implement assessment reforms that better recognise open science contributions. The detailed analysis of CoARA action plans, institutional case studies, and best practices offers concrete examples that institutions can adapt to their specific contexts.

The documentation of challenges and solutions across different institutional types and national contexts helps institutions understand both the opportunities and potential pitfalls of assessment reform initiatives. This knowledge can inform more effective change management strategies and help institutions avoid common implementation difficulties.

For Policy Makers

Policy makers can use the OPUS analysis to understand the current state of open science implementation and identify areas where policy intervention could be most effective. The documentation of both successes and challenges provides evidence for policy decisions about resource allocation and regulatory frameworks.

The international scope of the analysis also provides insights into how different policy approaches are working across various national contexts, offering lessons for policy makers seeking to learn from international experience.

Future Directions and Sustainability

The OPUS project’s comprehensive mapping reveals both the remarkable progress achieved in open science implementation and the significant challenges that remain. The analysis suggests several key areas where continued attention and investment will be crucial.

One of the key challenges identified through the OPUS analysis is sustaining momentum for open science implementation beyond project funding periods. Many of the initiatives documented show impressive outputs and impacts during their funded phases, but questions remain about long-term sustainability and continued development.

The growth of networks like CoARA suggests that creating sustainable governance structures and community-driven initiatives may be more effective for long-term impact than relying solely on time-limited projects. The OPUS analysis supports investment in community-building and coordination activities as complements to specific research and development projects.

Scaling Successful Approaches

The OPUS documentation of successful approaches provides a foundation for scaling effective practices across broader communities. However, the analysis also reveals the importance of adaptation and contextualisation rather than simple replication. Different institutional types, disciplinary communities, and national contexts require different approaches to implementing open science and assessment reform.

Addressing Remaining Gaps

Despite the comprehensive coverage achieved through European funding programmes, the OPUS analysis identifies several areas where significant gaps remain. These include limited attention to certain disciplinary communities, insufficient integration of industry perspectives, and ongoing challenges in developing assessment approaches that effectively recognise collaborative and interdisciplinary contributions.

A Foundation for Transformation

The OPUS project’s landscaping exercise represents far more than a catalogue of initiatives and organisations. It provides a detailed map of an ecosystem in transformation, documenting both the remarkable progress achieved and the challenges that remain in building more open, inclusive, and effective systems for research assessment and reward.

The project’s findings demonstrate that the integration of open science principles into research assessment is not merely a policy aspiration but an active area of innovation and implementation across Europe and beyond. From major institutional commitments through CoARA to technical infrastructure development through EOSC initiatives, the foundations for transformation are being systematically constructed.

Perhaps most significantly, the OPUS analysis reveals that this transformation is being driven by a committed and experienced community of practitioners, as evidenced by the expert group’s composition and engagement. This community provides both the expertise and the institutional connections necessary to sustain change beyond individual project funding periods.

For current and future projects working in this space, the OPUS deliverable provides both a comprehensive resource and a challenge. The resource lies in the detailed documentation of what has been achieved, who is involved, and how different approaches have succeeded or struggled. The challenge lies in building upon this foundation to address the remaining gaps and ensure that the promise of open science transformation is fully realised across the research system.

The close links between open science and research assessment reform, documented throughout the OPUS analysis, suggest that sustainable progress requires integrated approaches that address both cultural change and practical implementation challenges. Projects that recognise and build upon these interconnections are more likely to achieve lasting impact than those that treat open science and assessment reform as separate domains.

As the research landscape continues to evolve, the OPUS project’s mapping provides an essential baseline for understanding progress and identifying priorities for continued development. The project’s legacy lies not just in its documentation of the current state but in its demonstration of the collaborative, systematic approach needed to support continued transformation of research practices and values.

University of Rijeka Showcases Future-Oriented Research Assessment at PUBMET2025

ZAGREB, 11 September 2025 – At the PUBMET2025 conference in Zagreb, representatives of the University of Rijeka (UNIRI), Nataša Jakominić Marot and Saša Zelenika, presented their paper Research Assessment at the European University of the Future, outlining the institution’s comprehensive reform in research assessment and its vision for academia’s role in shaping European higher education.

As part of its 2021–2025 strategy, UNIRI has placed its community of teachers, researchers, students and citizens at the centre of its development, embedding engagement, accountability, equity, efficiency, and sustainability within its academic framework. The Croatian university has long stood at the forefront of progressive reforms, becoming the first in the country to adopt the European Commission’s Human Resources Strategy for Researchers (HRS4R) in 2010, and retaining its HR Excellence in Research designation through renewals in 2019 and 2024.

Reforming Research Assessment

UNIRI’s leadership has been vocal in moving away from conventional, metrics-driven research assessment models, which it argues disadvantage early and mid-career researchers. In 2022 the university became an early signatory of the Agreement on Reforming Research Assessment (RRA) under the Coalition for Advancing Research Assessment (CoARA). Since then, it has actively contributed to the coalition’s working groups, especially those focusing on career assessment and the challenges faced by emerging researchers.

Within this framework, UNIRI has implemented a detailed action plan with clear timeframes and responsibilities, and was more recently awarded a Horizon Europe CoARA Boost Teaming project to further strengthen its commitment.

Collaborations and Alliances

The university has drawn extensively on its membership in European alliances such as the Young Universities for the Future of Europe (YUFE) and Young European Research Universities Network (YERUN). Through projects including YUFERING and DIOSI, UNIRI has co-created tools such as a researcher competence framework, an innovative doctoral training model, and narrative CV models for postdoctoral selections.

Through YERUN, UNIRI has also been part of policy-level initiatives promoting recognition and reward mechanisms that extend beyond publication metrics, valuing the diversity of researchers’ contributions across academia, education, leadership, and impact on society.

Contributions to EU Projects

UNIRI’s commitment extends to pilot roles in major EU-funded projects. In the Horizon Europe project SECURE, the university helped develop an innovative Research Career Framework addressing stability, skills, mobility, assessment, and professional pathways. Its successful implementation of 14 targeted measures has contributed to a comprehensive toolbox comprising 80 actions designed for research-performing and funding organisations across Europe.

Within OPUS, where the consortium is working to reshape researcher assessment around open science principles, UNIRI has piloted measures in research, education, and valorisation. These included interventions in areas such as policies, repositories, training, and awareness-raising, reinforcing transparency and mutual learning among partner organisations.

Open Science and Institutional Change

Open Science (OS) remains a pillar of UNIRI’s strategy. The university issued its Declaration on European OS in 2019, adopted its OS policy in 2021 (since updated in 2025), and signed the San Francisco Declaration on Research Assessment (DORA). Alongside EU partners, UNIRI has supported initiatives such as the YUFE OS calendar and the YERUN OS awards, while also promoting ethical AI use through its 2024 AI Tools Usage Policy, shaped by its Centre for AI and Cybersecurity.

The university’s Science Outreach Centre further boosts its citizen science initiatives, pairing researchers with professional science journalists to enhance science communication and community engagement.

Driving Systemic Change

UNIRI has played an influential role nationally as well, advocating for qualitative assessment criteria and integrating open science into Croatia’s evolving framework for academic recruitment and promotion. At the institutional level, it has adopted new regulations and guidelines that redefine criteria for academic staff selection, aiming to nurture a cultural shift among researchers in values, attitudes, and expectations.

Despite challenges and the continued importance of national-level support, UNIRI’s practices highlight the transformative potential of institutional initiatives. Through transparency, inclusivity, strong leadership, and collaboration at the European level, the University of Rijeka demonstrates that universities can act as catalysts for reform, raising both academic quality and international visibility.

UNIRI’s presentation in Zagreb underscored that reforming research assessment is not only about improving metrics but about fostering an environment where diverse contributions are recognised, where careers are sustainable, and where universities are empowered to truly embody the concept of a European university of the future.

References

Coalition for Advancing Research Assessment (CoARA). (2022). The Agreement on Reforming Research Assessment. https://coara.eu/app/uploads/2022/09/2022_07_19_rra_agreement_final.pdf

University of Rijeka. (2021). University of Rijeka Strategy 2021–2025: European University of the Future. https://uniri.hr/wp-content/uploads/2021/04/University_of_Rijeka_Strategy_2021-2025.pdf

University of Rijeka. (2023a). Guidelines for Additional Criteria for the Selection to Scientific-Teaching, Artistic-Teaching, Teaching, Associate, and Professional Positions of Academic Staff at the University of Rijeka and its Constituents. https://zenodo.org/records/12800551

University of Rijeka. (2023b). Rules and Regulations on Scientific, Artistic, and Innovation Activities at UNIRI. https://zenodo.org/records/13383045

University of Rijeka. (2023c). University of Rijeka Reform of Research Assessment – CoARA Action Plan 2024-2027. https://zenodo.org/records/10634416

University of Rijeka. (2024a). Artificial Intelligence Usage Policy at UNIRI. https://zenodo.org/records/11080236

University of Rijeka. (2024b). University of Rijeka Human Resources Strategy for Researchers – Strategic Priority Areas and Action Plan for their Implementation. https://uniri.hr/wp-content/uploads/2024/03/UNIRI_HRS4R_2023-AP-1.pdf

University of Rijeka. (2025). Politika otvorene znanosti Sveučilišta u Rijeci [Open Science Policy of the University of Rijeka]. https://uniri.hr/wp-content/uploads/2025/02/Politika-otvorene-znanosti_UNIRI_revizija-2025.pdf

Short Biographies of Speakers

Saša Zelenika is a tenured full professor at the University of Rijeka Faculty of Engineering. He served as Assistant Minister for Science and Deputy Minister at the Croatian Ministry of Science, Education and Sports. He has authored more than 220 scientific and professional publications, two textbooks and a patent, together with 62 popular-science newspaper articles, 122 outreach talks and 18 science-popularisation radio shows, and has participated in 11 EU Framework Programme projects. As UNIRI’s Vice-Rector for Strategic Projects he was responsible for innovation and knowledge-valorisation activities, headed the EDIH Adria European Digital Innovation Hub and the Innovation Council of the Horizon Europe INNO2MARE Excellence Hub, fostered the University’s activities on open science and AI policies, and led UNIRI’s work on the reform of research(er) assessment. He is currently Rector’s Advisor ad interim for Strategic Projects at the University of Rijeka.

Nataša Jakominić Marot is Head of the University of Rijeka Centre for Research and Innovation and an experienced leader in research and innovation. Over two decades in the sector she has developed expertise in EU project preparation, implementation and supervision, managing a portfolio exceeding EUR 100 million. In the last ten years she has contributed to six Horizon Europe/Horizon 2020 projects in various roles. She is a trainer in project and research management, EU funding and leadership, an expert in EU higher education and R&I policy, and an international peer reviewer and trainer in her fields of expertise. She is an active member of EARMA, the YERUN Policy Platform and two CoARA working groups, and has been at the forefront of efforts to reform research assessment in Europe, not least by strongly promoting the principles of Open Science.

OPUS impact through 3 years of project activities

From the outset, OPUS set out to reform researcher assessment by expanding beyond traditional publication metrics and recognising the full spectrum of scholarly contributions, especially those rooted in open science principles. The main pillar of this effort has been the OPUS Researcher Assessment Framework, a set of qualitative and quantitative indicators co-designed and tested by three research-performing organisations and two research-funding organisations. By engaging these pilots in iterative mutual-learning exercises and action planning, OPUS ensured that its framework was not a theoretical construct but a practical tool tailored to institutional realities.

Parallel to the framework development, OPUS introduced the Open Science Career Assessment Matrix 2, an instrument designed to reward practices such as early data sharing, transparent peer review and much more, enabling researchers to see their open science efforts reflected in career progression decisions. Through workshops on gender equality in assessment, data management plan training, and industry-academic collaboration seminars, OPUS equipped over 1,000 participants with the knowledge and resources needed to adopt these novel assessment criteria.

Key exploitable results, six in total, were identified and published under Creative Commons licences on Zenodo and the Open Research Europe platform. These include the Researcher Assessment Framework itself, the interventions guide for implementing the framework, final action plans from pilot institutions, the outcomes of mutual-learning sessions, a comprehensive policy brief on embedding open science in institutional strategy, and the refined assessment matrix. Collectively these resources have amassed over 1,500 views and downloads, ensuring that they are readily available for any organisation seeking to modernise its evaluation processes.

Researchers stand to benefit from OPUS in several ways. Firstly, the framework and matrix provide clear, validated criteria for open science activities, allowing individuals to demonstrate their contributions beyond publications and citation counts. Secondly, the pilot institutions now possess tailored action plans and institutional buy-in, meaning that early adopters can draw on lived experiences and case studies when advocating for change in their own organisations. Thirdly, the wealth of training materials, recorded interviews and policy briefs serve as ready-to-use resources for research managers, funders and policymakers looking to embed open science incentives into evaluation systems.

Moreover, OPUS’s extensive network of over 150 open science experts, 36 Horizon Europe projects and 34 related networks represents a vibrant community ready to share best practices, troubleshoot implementation challenges and foster interdisciplinary collaboration. The project website continues to offer researchers ongoing support as they navigate the evolving landscape of open science and responsible assessment.

The OPUS project has achieved much more than simply meeting its targets across dissemination, communication and exploitation activities. Over the course of three years, OPUS organised seven major events instead of the six originally planned, produced twenty publications where ten had been expected, and delivered over thirty presentations against a target of nine. Its digital footprint expanded dramatically: the project website attracted an average of 2,420 page views per month rather than the projected 750, its combined LinkedIn and Twitter following soared to 2,900 against an initial goal of 500, and its newsletters reached 348 readers per issue rather than 200. The training and public engagement outcomes were equally impressive, with fifty-four workshops delivered in place of nine, sixteen public meetings held instead of five, and twenty-nine interviews recorded rather than ten. These figures alone illustrate a project that has excelled quantitatively, but OPUS’s real success lies in the depth of its impact on open science practices and researcher evaluation across Europe.

In short, OPUS has laid a durable foundation for research assessment reform. Its practical tools, verified by real-world pilots, and the extensive guidance it offers, empower individual researchers and institutions alike to embrace openness, transparency and equity in scholarly evaluation. As the project’s six key exploitable results remain freely accessible, the legacy of OPUS will continue to influence research culture long after its formal conclusion, ensuring that open science practices are recognised, rewarded and sustained across Europe and beyond.

VIDEO: The Impact of OPUS Project in Action

The Open and Universal Science project was established to support the research and innovation community by changing the way researchers are assessed, and by incentivising and rewarding Open Science within research organisations.

This EU-funded project was delivered by a consortium of eighteen organisations, led by PLOCAN, and ran from 1 September 2022 to 31 August 2025 under an agreement with the European Research Executive Agency (REA). Structured around seven work packages, OPUS has not only sought to reward Open Science but also to foster a research culture where knowledge sharing is central to everything we do.

Within Work Package 1, Open Science State of the Art, we set out the broader context of the OPUS project in relation to the wider Open Science landscape, mapping more than 40 European projects, 30 global and European initiatives, and a substantial body of literature, while engaging with over 150 experts.

A key component of OPUS was the development of a Researcher Assessment Framework, a set of indicators and interventions designed to motivate and incentivise researchers to adopt Open Science practices, and to enable their assessment not only in project evaluations but also in career progression. These indicators cover activities such as research, education, leadership, and valorisation, forming a flexible ‘toolkit’ to help both research-performing and research-funding organisations reward their researchers more effectively. The aim is to move beyond recognition based solely on publications, or even more narrowly on metrics such as the H-index, impact factor, or the number of publications.

We also designed a practical framework of interventions, categorised as policy, resources, awareness, and training, to help pilot institutions implement the Researcher Assessment Framework (RA Framework) and OSCAM 2. OPUS selected five pilot organisations, three research-performing and two research-funding, to test these interventions over an 18-month period. The pilots achieved excellent results, often surpassing their targets. While the RA Framework requires further testing and additional piloting, the lessons from OPUS provide a strong foundation for other organisations seeking to advance Open Science practices and responsible research assessment.

Our work on policy within Work Package 5 is highly relevant not only to the European Research Area but also to UNESCO. The 2017 UNESCO Recommendation on Science and Scientific Researchers emphasises the need to open science to society in order to respond more effectively to societal challenges. It also highlights the importance of increasing diversity within the research community, ensuring that everyone has the right to participate in and benefit from science, in line with Article 27 of the 1948 Universal Declaration of Human Rights. The insights gained from OPUS, together with case studies of how institutions implement Open Science action plans, will provide valuable policy support and advice for member states, not only across Europe but also globally, in implementing the Recommendation on Science and Scientific Researchers and the 2021 Recommendation on Open Science.

As one of its main achievements, OPUS is proud to present the revised Open Science Career Assessment Matrix 2, a key component of the Researcher Assessment Framework. This version places particular emphasis on indicators that incentivise and reward Open Science, encouraging researchers not only to be assessed more fairly but also to actively practise Open Science.

How do we translate open science monitoring policies into real, sustainable institutional practices?

On 8 July 2025, as part of the Open Science: Monitoring Progress, Assessing Impact conference at UNESCO Headquarters in Paris, UNESCO, OPUS, OpenAIRE, and OSMI organised a joint session bringing together leading international initiatives to address one of the most pressing questions in research policy: how do we translate open science monitoring policies into real, sustainable institutional practices?

About the Session: The Role of RPOs and RFOs in Open Science Monitoring: Connecting Institutional Practices to Policy Aggregation

Date: Tuesday, 8 July 2025
Organised by: UNESCO, OPUS, OpenAIRE, and OSMI

This roundtable session explored the practical realities behind policy implementation, addressing the mounting pressures facing research-performing organisations (RPOs) and research-funding organisations (RFOs) as they navigate rising demands for accountability, transparency, and alignment with evolving open science frameworks.

Experiences and Barriers that Institutions Face While Implementing Monitoring Systems

As open science frameworks proliferate across Europe and beyond, a critical gap has emerged between policy vision and operational capacity. The session moved beyond theoretical discussions to surface lived experiences, assumptions, and barriers that institutions face when implementing monitoring systems. Key topics included institutional-level engagement with EU monitoring recommendations, feasibility and cost concerns for implementation, researcher engagement and resistance, and gaps between policy vision and operational capacity.

Session Participants

Moderator:

  • Gareth O’Neill, Principal Consultant on Open Science, Technopolis Group Belgium (OPUS consortium partner)

Roundtable Contributors:

  • Natalia Manola, CEO, OpenAIRE AMKE
  • Laetitia Bracco, OSMI Co-coordinator, Université de Lorraine
  • Alina Irimia, Open Science Projects Coordinator, UEFISCDI-Romania (OPUS pilot RFO organisation)
  • Nataša Jakominić Marot, Head of Research and Innovation, University of Rijeka, Croatia (OPUS pilot RPO organisation)
  • Manuel Ricardo Galindo Moreno, Science, Technology, and Innovation Policy, UNESCO

Natalia Manola, CEO, OpenAIRE AMKE 

OpenAIRE is a European infrastructure for open science and scholarly communication. Naturally, one of our major interests is how to monitor open science.

Over the years, we have realised that, as an infrastructure builder and provider, it is essential for us, when creating infrastructure, to ensure that from day one it is capable of gathering the evidence needed to measure policies on open science. We understand that monitoring can happen at various levels — for example, at the institutional level, where decisions need to be taken quickly in ways that influence organisational policies and researchers.

This means a strong focus on quantitative indicators. As we move up to higher levels, however, it also becomes a question of strategic alignment. Within OpenAIRE, we have a working group on monitoring open science policies, led by Eloy Rodrigues from the University of Minho. Given that we are a member-based organisation, this group is extremely valuable: members share the types of monitoring they carry out, and we can see that there is a great deal of overlap between indicators.

One of the major challenges we see, from the perspective of infrastructures, is that overlapping indicators make evidence collection resource-intensive and highly demanding for both organisations and funders. Multiple monitoring initiatives often request similar or identical data, which forces organisations to populate various systems, duplicating effort.

This is why infrastructures must be ready to support evidence collection and help organisations, funders, and institutions to extract this evidence from their own systems, so it can feed into the different monitoring frameworks. One way to reduce overlap is through better coordination between monitoring initiatives, something we can see happening, for instance, through OSMI, which is very encouraging and important.

Secondly, infrastructures should provide the evidence required by the various monitoring initiatives — and this needs to be well-coordinated and improved. Thirdly, institutions themselves need to better understand where their evidence can be most effectively used for monitoring purposes, and to take a more strategic approach in deciding which indicators they track internally versus those they report externally.

I don’t think this is a matter of “research washing” or “greenwashing”; monitoring open science is critically important. It can not only help us measure progress, but also identify gaps in implementation and guide the development of targeted support interventions. In other words, it’s not just about measurement; it’s about informed action and improvement.

Gareth O’Neill (Moderator)

The Open Science Monitoring initiative currently involves nearly 200 people and more than 100 research institutions. It was established to promote transparency, inclusiveness, and adaptability in tracking open science progress at a global level.

Laetitia Bracco, OSMI Co-coordinator, Université de Lorraine 

OSMI stands for the Open Science Monitoring Initiative. So, what is it about? As you may have guessed, it’s about open science monitoring.

At present, OSMI brings together nearly 200 people and more than 100 research institutions. It was established to promote transparency, inclusiveness, and adaptability in tracking open science progress globally. OSMI is organised through several working groups that support and advance its activities.

It operates as a grassroots initiative, engaging in consultation cycles with the community. Through this approach, it develops principles and guidelines for open science monitoring, ensuring that monitoring efforts are coordinated, comprehensive, and aligned with the diverse needs of stakeholders. In essence, the goal is to strengthen coordination and alignment across the many different initiatives already under way.

Let me also briefly outline OSMI’s core goals. These are:

  • To support the development and implementation of transparent, accountable, and sustainable monitoring systems that advance open science at local, national, and international levels.
  • To foster dialogue and knowledge sharing among researchers, policymakers, funders, and other stakeholders involved in open science monitoring — building a strong and connected community around these efforts.
  • And finally, to provide resources and practical guidance to help organisations implement effective monitoring practices — ensuring that we share the same understanding of what monitoring means and apply similar, well-informed practices across contexts.

So, in short, OSMI is about building principles, providing resources, and helping the community move forward together, speaking the same language when it comes to open science monitoring.

Alina Irimia, UEFISCDI-Romania, OPUS pilot RFO organisation

Beyond my work on the funding side of the OPUS project, in Romania I also wear the hat of national reporter for UNESCO, responsible for reporting on the implementation of the Recommendation on Open Science. This means I have visibility over monitoring both at the funding organisation level and at the national level.

From the perspective of the funding organisation, we have several types of monitoring activities. One is compliance monitoring, which is about ensuring that funded projects meet open science requirements: for example, making sure that publications resulting from funded research are openly accessible, or that research data is shared according to the FAIR principles, where applicable.

We also carry out performance monitoring, which assesses how effectively open science practices are being adopted and the extent of their impact on research outcomes. This could involve examining citation rates of open access publications, download statistics for shared datasets, or mapping the collaboration networks that emerge from openly conducted projects.

The third type is strategic monitoring, which evaluates the broader impact of our open science policies and how well they align with national and international goals. This helps us determine whether our funding programmes are effectively promoting open science and identify where adjustments may be needed.

From a national perspective, the monitoring process becomes even more complex. We need to aggregate data from multiple sources, such as funding organisations like ours, research institutions, libraries, and other stakeholders. We look at indicators that measure progress towards the country’s open science goals, as well as Romania’s performance in comparison to other nations.

However, there are notable challenges. One is data collection and harmonisation: different organisations may work with different metrics or definitions, making it difficult to develop a coherent national picture. We also face resource constraints: effective monitoring requires dedicated personnel and robust technical infrastructure, which can be especially challenging for smaller organisations.

Another challenge is engaging researchers and institutions. Many perceive monitoring as an additional administrative burden rather than a tool for improvement. We therefore need to communicate its value more clearly and show how it can help institutions and researchers enhance their practices.

Looking ahead, I see opportunities for better coordination between monitoring initiatives, as Natalia mentioned, and for leveraging technology to automate data collection, reducing the effort required from researchers and institutions. There is also considerable potential to move beyond simple compliance metrics towards more meaningful indicators of impact.

In my view, we need to work collaboratively to develop monitoring systems that are genuinely useful, efficient, and effective: systems that actively help us to advance open science, rather than merely measure it.

Nataša Jakominić Marot, Head of Research and Innovation, University of Rijeka, Croatia

At the University of Rijeka, what we did was select from the assessment framework in OPUS, which, I have to say, was like a menu, and a very comprehensive one at that. It contained eight dimensions, each with several practices and indicators, so, really, quite a big menu to choose from.

From this, we selected around 15 practices that we felt best suited our current situation and strategic priorities. We didn’t try to implement everything at once; instead, we chose what made the most sense for us at this stage. Some of the practices we selected included:

  • open access publishing policies,
  • research data management,
  • open educational resources, and
  • transparent evaluation criteria for hiring and promotion.

What was interesting was that, when we began implementing these, we found that some things we had assumed would be easy turned out to be quite complex, and, conversely, some things we thought would be difficult were actually relatively straightforward. For example, we expected implementing open access policies to be simple, but we encountered resistance from some researchers who were concerned about publication costs and journal quality. On the other hand, introducing research data management practices was easier than expected, because many researchers were already keen to better organise their data.

One of the clearest lessons we learned was the importance of communication and training. We couldn’t simply announce new policies and expect them to be adopted. We had to explain why these practices matter, provide training on how to apply them, and offer ongoing support. We organised workshops, developed guidance documents, and set up help desks where researchers could bring their questions.

Another lesson was about timing and sequencing. We found it far better to implement practices gradually, making sure each one was embedded and working effectively before moving to the next. We also realised that some practices are interconnected; for example, open access publishing and research data management tend to go hand in hand, so it can be sensible to address them together.

From a monitoring perspective, we discovered that it is vital to have both quantitative and qualitative indicators. Numbers can tell you what is happening, but they don’t always explain why. We therefore combined metrics, such as the percentage of open access publications, with qualitative feedback from researchers about their experiences and challenges.

We also concluded that monitoring should be used not just for checking compliance, but as a tool to identify where further support is needed. If we saw researchers struggling with a particular practice, that was a signal for us to improve our training or support services.

One challenge we faced was balancing institutional priorities with the needs of individual researchers. Some were very enthusiastic about open science practices, while others were sceptical or worried about the extra workload. We addressed this by focusing on the benefits of these practices, rather than simply presenting them as requirements.

Another challenge was aligning our institutional practices with national and international monitoring requirements. At times, the indicators that made perfect sense for us internally did not align fully with what national or EU-level systems were asking for. This created an additional administrative burden in terms of data collection and reporting.

Looking at the bigger picture, I believe there is a clear need for better coordination between different levels of monitoring. If institutional, national, and international systems were better aligned, the administrative burden would be reduced, and the data collected would be more meaningful.

I also think there is great value in sharing experiences between institutions. What works well at one university may not work at another, but there are always lessons to be learned. Having platforms where institutions can share best practices, and also challenges, would be extremely useful.

Finally, I want to emphasise that monitoring should ultimately serve the goal of improving open science practices. It is not just about collecting data or meeting reporting requirements; it is about creating a culture in which open science practices are genuinely valued and supported.

Manuel Ricardo Galindo Moreno, UNESCO

From UNESCO’s perspective, we are well aware of the principles proposed for monitoring by OSMI. We also have the UNESCO Recommendation on Open Science, which we have been implementing and tracking through a global survey.

The first survey, conducted in 2023 with our Member States, received 118 responses, each providing insights into the level of implementation of the Recommendation.

What we found is that there are significant variations in how different countries approach open science monitoring. Some have well-developed national strategies and sophisticated monitoring systems in place. Others are only at the very beginning of developing their approaches. This reflects, of course, differing levels of resources, infrastructure, and political commitment to advancing open science.

One of the key challenges we identified is the urgent need for capacity building, particularly in developing countries. Many nations aspire to make progress in open science but lack either the technical expertise or the institutional infrastructure to design and implement effective monitoring systems. This is precisely where international cooperation and knowledge sharing become so critical.

We also observed that there is often a disconnect between national policy commitments and what actually happens at the institutional level. A country may have strong, well-written policies on paper, but translating these into practice within universities, research institutes, and other organisations requires significant effort, resources, and coordination. In this context, initiatives like OPUS play an essential role, helping to bridge that gap between policy and practice.

Another important finding was the role of stakeholder engagement. Countries that achieved more success in implementing open science were typically those that actively engaged a wide range of stakeholders (researchers, institutions, funders, publishers, and civil society) in shaping their national approaches. This highlights the need for monitoring systems to be developed through genuinely inclusive processes.

From a global perspective, we believe monitoring systems must be able to capture both progress and the challenges along the way. It is not enough simply to measure what is working; there is equal value in identifying the barriers, understanding why they exist, and finding ways to address them. This requires going beyond simple quantitative metrics to indicators that can reflect the complexity and nuance of the open science transition.

We are also very interested in how monitoring can directly support evidence-based policymaking. The data collected should not sit idle. It should inform decisions about where to invest resources, what kinds of support to provide, and how best to adapt policies based on clear evidence of what is effective and what is not.

Looking ahead, we see real opportunities for greater coordination between the many different monitoring initiatives currently operating. Instead of having parallel systems that operate in isolation, we could be working towards more integrated approaches: systems that serve multiple purposes, are interoperable, and reduce the reporting burden on both institutions and researchers.

Gareth O’Neill (Moderator)

I’m wondering, at which level should alignment happen? Is it at the national level, considering there are different monitoring systems even within Europe? For instance, we have the European Research Area monitoring, the EOSC monitoring, and various other systems. So, the question is: at what level should this alignment take place? And how can we ensure that institutions do not have to report to multiple systems with differing requirements? 

Laetitia Bracco, OSMI Co-coordinator, Université de Lorraine

I think that OSMI is aligning in three different ways.

Firstly, through the principles. As I’ve said, these are the result of extensive consultation with the community, including a range of different monitoring initiatives. These principles reflect a shared consensus on what good monitoring should look like, regardless of the specific context or level. They emphasise values such as transparency, inclusiveness, adaptability, and proportionality. These are relevant whether you’re monitoring at the institutional, national, or international level, and they provide a common framework that different initiatives can use to align their approaches.

Secondly, there is the community aspect. OSMI brings together people from different monitoring initiatives, different institutions, and different countries. This creates real opportunities for dialogue and mutual learning. When people are talking to one another, they are far more likely to spot opportunities to align their work and to avoid unnecessary duplication.

And thirdly, we are working on practical resources and guidance: things like indicator frameworks, data collection templates, and best practice guides. The idea is to provide ready-to-use tools which different initiatives can adapt for their needs, rather than having everyone reinvent the wheel.

But you raise an important point about the multiplicity of monitoring systems. This is indeed a challenge. From an institutional perspective, having to report to multiple systems, each with different requirements, can be both burdensome and inefficient. From a policy perspective, fragmented monitoring makes it much harder to build a coherent picture of progress.

I think there are several steps that could help here.

One is better coordination at the planning stage. If monitoring initiatives talked to one another while designing their systems, they could identify ways to align and find opportunities for shared data collection.

Another is the development of common data standards and formats. If different systems use the same definitions and formats, it becomes much easier to share data and reduce duplication. This doesn’t mean all systems have to be identical — but they should be interoperable.

We also need to consider the user experience: what it is like for institutions and researchers who are providing the data. If we can design systems that minimise the reporting burden while still meeting the needs of different users, everyone benefits.

I think technology also has a role to play here. We need systems that can automatically extract and format data from institutional sources, rather than requiring manual entry for every initiative. We also need dashboards that can present the same underlying data in different ways, depending on the purpose.

But ultimately, alignment is not just a technical challenge; it’s also a political and organisational one. Different monitoring initiatives exist because they serve different stakeholder needs. Funders require different information than policymakers, and policymakers often need different information than research institutions. The challenge is to meet these diverse needs while avoiding duplication and reducing burden.

This is where I believe initiatives like OSMI can add real value, not by trying to replace existing monitoring systems, but by helping them work together more effectively. We can provide a forum for dialogue, develop shared principles and standards, and create resources that all can use.

Of course, this requires buy-in from every stakeholder. Monitoring initiatives need to recognise the value of coordination and be willing to adapt their approaches. Institutions need to feel confident that coordination will actually reduce their workload, rather than adding another layer of complexity.

I think we are making progress, but there is still much work to do. The fact that we’re having this conversation, and that projects like OPUS are actively testing these approaches on the ground, is very encouraging. But we need to keep pushing for stronger coordination and alignment.

One point I’d like to highlight is the importance of learning from what doesn’t work. We should be honest about where our monitoring systems are falling short, and be willing to adapt. That means creating feedback loops where those providing data can share their experiences and suggest improvements.

It also means being willing to stop doing things that add no value. Sometimes, the most effective way to reduce burden is to eliminate redundant or low-priority monitoring activities, and that requires honest conversations about priorities and trade-offs.

We can also learn from other sectors that have faced similar coordination challenges: for example, healthcare, environmental monitoring, or quality assurance in manufacturing. These fields have developed methods for aligning multiple monitoring systems while still preserving their specific purposes.

Finally, I want to emphasise that alignment should not mean full standardisation. Different contexts need different approaches. The objective is to reduce unnecessary burden while still allowing for diversity and innovation in how monitoring is done.

So, in summary: alignment can take place at multiple levels, but it requires deliberate effort, coordination between initiatives, the development of shared standards and tools, and keeping the user experience front and centre. OSMI is working to contribute to this, but ultimately, it has to be a collective effort involving all stakeholders.

Pil Maria Saugmann, European Council of Doctoral Candidates and Junior Researchers

I am the current President of Eurodoc, the European Council of Doctoral Candidates and Junior Researchers, and we represent around 35 member organisations across Europe.

From the perspective of early career researchers, we often see a disconnect between the high-level policy discussions on open science monitoring and the reality of what researchers are experiencing on the ground. Many of our members tell us that they fully support open science in principle, but they face very real, practical barriers that make it difficult to put into practice.

For example, many early career researchers worry about the career implications of engaging in open science practices. If their institution tells them that it values open science, but promotion and evaluation criteria still prioritise traditional metrics such as the journal impact factor, they receive mixed messages. This creates a tension between doing what they genuinely believe is the right thing for science and doing what they believe will advance their careers.

There is also the matter of resources and support. Many early career researchers want to practise open science, but they simply do not know how to start. They need training in areas such as research data management, open access publishing, and collaborative and transparent research practices. However, this training is not always available, or it is poorly integrated into doctoral and early career development programmes.

Another challenge is the additional workload that open science practices can create, especially at the beginning. Learning new tools, adopting proper data management protocols, and navigating open access requirements all take time. For researchers already stretched thin by research, teaching, and administrative responsibilities, this can feel overwhelming.

From a monitoring perspective, we believe it is important to capture not just what researchers are doing, but also the support they are, or are not, receiving. Are institutions providing adequate training? Are career advancement criteria genuinely aligned with open science values? Is there sufficient infrastructure and resourcing to enable the wider adoption of open practices?

It is also worth noting that early career researchers are often the most enthusiastic advocates of open science, but we are also the most vulnerable to the potential risks. We may be more willing to share our data or publish in open access journals, but we are also more acutely aware of the possible negative implications for our career progression.

This suggests that monitoring systems need to take into account both generational differences and career-stage effects. What might appear to be slow institutional progress could in fact reflect the understandable caution of more established researchers when it comes to changing long-held practices.

We would also like to see more participatory approaches to monitoring. Instead of just measuring researcher behaviour from the outside, why not actively involve researchers, especially early career ones, in designing these systems? We often have valuable insights into which indicators are truly meaningful and which data collection methods are practical and realistic.

Finally, I want to stress that monitoring should exist to serve researchers, not the other way around. The ultimate goal should be to create systems that help us do better science and have greater impact, rather than simply ticking administrative boxes.

There are, fortunately, some positive developments. More and more funders require open science practices, which is creating incentives for institutions to strengthen their support. More institutions are also revising their evaluation criteria to include open science indicators. Still, there remains a great deal of work to align the different levels of the system so that policies, funding, and institutional practice work in harmony.

I believe initiatives like OPUS and OSMI are particularly valuable because they bring together diverse stakeholders to address these alignment challenges. But it is essential that early career researchers have a place at the table in these conversations, because we are the ones who will ultimately be implementing these practices throughout our careers.

So, I would encourage everyone involved in designing monitoring systems to think from the perspective of the researcher, and to actively involve us in the process. After all, if monitoring systems do not work for researchers, they are not going to work at all.

Global Leaders Converge in Paris to Chart the Future of Open Science Monitoring

The OPUS team participated at the Open Science: Monitoring Progress, Assessing Impact conference, which took place on 7 & 8 July 2025, at UNESCO Headquarters in Paris. 

Over two days at UNESCO Headquarters and online, more than 500 participants from across the globe gathered for the Open Science: Monitoring Progress, Assessing Impact conference, an event uniting policymakers, researchers, and open science practitioners in defining how to track, evaluate, and advance the global shift towards more transparent and equitable research.

Organised by UNESCO, OpenAIRE, PathOS, EOSC Track, the French Ministry of Higher Education and Research, Inria, and OSMI, the conference marked the culmination of the EU-funded PathOS project and showcased the launch of new tools and frameworks set to shape open science policy worldwide.

A Global Call to Action from UNESCO

Opening the event, Lidia Brito, UNESCO’s Assistant Director-General for Natural Sciences, underscored the urgency of building robust, inclusive monitoring systems:

“Open Science is not a choice, it is a necessity for the equitable and sustainable future we seek. Measuring impact is the key to ensuring our efforts truly benefit science, society, and the planet.”

Lidia Brito, UNESCO’s Assistant Director-General for Natural Sciences

Her address set the tone for two intense days blending visionary discussions with concrete tools for policymakers, research institutions, and funders.

PathOS: Evidence, Methods, and Impact Pathways

PathOS, a three-year EU-funded project, took centre stage in multiple sessions. Ioanna Grypari (ATHENA RC & OpenAIRE) introduced the project’s innovative evidence base, methodological approaches, and hands-on tools, including real-world case studies revealing both the tangible benefits and persistent blind spots in open science adoption.

Tony Ross-Hellauer (Graz University of Technology) expanded on the project’s strategic lessons, inviting open dialogue on how PathOS findings can translate into actionable policies and clearer pathways for impact.

Turning Indicators into Insights

One of the conference’s key contributions came from Vincent Traag, Jessica Catalano, and Louis Colnot, who presented the Open Science Impact Indicator Handbook alongside a Cost-Benefit Analysis Framework. These resources aim to help institutions and governments measure not only the academic but also the societal and economic value of open science, moving debates beyond citation counts to real-world impact.

Setting the Standards: OSMI Principles

In a milestone for global policy coherence, Nicolas Fressengeas and Laetitia Bracco formally launched the final version of the OSMI Principles of Open Science Monitoring, a comprehensive, internationally consulted framework promoting transparency, inclusiveness, and adaptability. The principles are designed to help policymakers, funders, and infrastructures align monitoring systems with shared values while remaining adaptable to local contexts.

Scaling Up: EOSC Observatory 2.0

A standout moment came with the launch of the second phase of the EOSC Open Science Observatory by Stefan Liebler and Tereza Szybisty. This next-generation monitoring platform offers accessible, harmonised data to track open science uptake across Europe, supporting evidence-based policy at both national and continental levels.

UNESCO’s Global Vision and Regional Realities

Rania Sabo presented UNESCO’s first comprehensive global assessment under the Recommendation on Open Science, offering a panoramic view of international progress. The discussion revealed striking variations in national readiness, as well as shared barriers such as resourcing, policy alignment, and researcher incentives.

Connecting Policy to Practice: OPUS Collaboration

A joint session with the Open and Universal Science (OPUS) project brought the conversation to the institutional level. Representatives of research-performing organisations (RPOs) and research-funding organisations (RFOs) examined the tangible challenges of translating policy frameworks into everyday research practice, grappling with data gaps, compliance burdens, and cultural resistance.

See the full report here.

Closing with a Vision for the Future

In closing remarks, Ioanna Grypari, Nicolas Fressengeas, and Shaofeng Hu (UNESCO) reiterated that the real challenge ahead lies not only in measuring open science, but in ensuring that these measurements guide meaningful action.

“Monitoring must be a tool for transformation,” Hu emphasised, “not a mere accounting exercise.”

Key Takeaways from Paris

  • Evidence matters — Data-driven insights are essential to justify and guide open science policies.
  • Principles and flexibility — Monitoring frameworks must balance common standards with local adaptability.
  • Beyond academia — Measuring societal and economic benefit is as crucial as scholarly impact.
  • Integration across scales — National, institutional, and global efforts must be aligned to avoid duplication and maximise relevance.

With the PathOS legacy cemented, the EOSC Observatory upgraded, and UNESCO’s global vision charted, the Paris conference left participants with both the tools and the mandate to make open science impact assessment a standard — and transformative — part of the research system.

Lessons learnt on how to better support OS policies at institutional level: OPUS Policy Brief

Supporting Open Science at Institutional Level

During the OPUS Final Conference held in Paris in July 2025, Juliana Chaves Chaparro, Senior Consultant at UNESCO’s Management of Social Transformations Programme (MOST), presented the OPUS Policy Brief on lessons learnt for supporting Open Science policies at institutional level. Dr Chaves Chaparro brings nearly 25 years of experience in promoting the science-policy-society interface, particularly in the Global South, and holds an advanced degree in Environmental Sciences alongside an ongoing PhD in Sociology.

Her presentation emphasised the importance of aligning policies across local, institutional, national, and European levels whilst integrating gender equality and diversity measures into Open Science initiatives. The policy brief concluded with recommendations for transforming barriers into policy actions, including developing coherent policies, securing resources, and supporting early career researchers and diversity.

Juliana Chaves Chaparro (UNESCO): Recommendations for transforming barriers into policy actions

Juliana Chaves Chaparro of UNESCO has been articulating a path forward for open science, calling on institutions to embrace their responsibility in breaking down barriers and turning them into real policy change. The 2021 UNESCO Recommendation on Open Science serves as a guiding vision, insisting on equity and accessibility as the very foundation of scientific practice. This commitment is echoed in the recent OPUS Policy Brief, which explores how academic and research institutions might successfully transition to open science, using lessons learned from five OPUS case studies as its compass.

The brief doesn’t shy away from the complex realities faced by institutions. It highlights that cultural, structural, and policy-related challenges continue to impede progress, and root causes must be confronted if open science is to flourish anywhere beyond declarations and good intentions. One striking challenge lies in fragmentation—policies and working groups within many institutions operate in silos, with little coordination. Some institutions, especially those participating as pilots, found themselves constrained by the mere absence of robust national policy or by the narrow mandates handed down by government ministries.

No less daunting are the resource constraints: both funding and skilled personnel are often in short supply, stalling or even derailing earnest efforts to embed open science. Resistance to change, both conscious and unconscious, is never far away. This resistance is frequently traced back to traditional markers of academic success, the familiar metrics and the comfort of the status quo.

Early career researchers, and women in particular, face an atmosphere of precarity. High turnover, job insecurity, and fragile retention undermine the long-term commitment so essential to effecting lasting transformation. Further, those who do contribute to open science often find their work unrewarded in the context of career progression. This lack of recognition not only demotivates but also adds to their unpaid workload.

The challenges extend into technical and capacity gaps too, especially in the social sciences and humanities, where the ability to engage meaningfully with society is often hampered by patchy expertise and limited support. Inequities—be they structural or cultural—continue across institutions, with disparities in access, persistent gender issues, and a lack of diversity all creating further hurdles.

The OPUS case studies, however, offer a beacon of practical hope. They have shown that involving researchers, managers, and librarians from the outset, and making the process participatory and inclusive, is transformative. Policies designed with input from a diverse range of voices—across career stages, backgrounds, genders, and traditionally underrepresented groups—are demonstrably more relevant and effective.

Leadership, when visible and committed, makes a measurable difference. The allocation of real resources and open recognition of everyone’s efforts can set the groundwork for change. In some institutional contexts, the creation of open science ambassadors or champions—like those at the University of Cyprus—has ignited broader community engagement and kept open science firmly on the agenda.

In the absence of comprehensive national frameworks, interim policies developed at the institutional level have been invaluable in keeping open science alive and evolving within their own walls. Meanwhile, international exchanges and collaborations have infused these efforts with new ideas and motivation, fostering a spirit of peer support and shared endeavour.

Embedding open science practices into existing research assessment frameworks, alongside consistent capacity building and investment in critical infrastructure, marks another key lesson from the OPUS experience.

But there are still challenges that linger. Far too often, open science is reduced to just open access or a handful of extra initiatives, rather than being embraced holistically, as UNESCO recommends. Three years—the span of the OPUS project—was simply not enough to catalyse the deep-rooted transformation needed. Misalignment between national and institutional policies continues to disrupt momentum, with the risk that researchers may eventually lose interest if support and mandates are not made clear and coherent.

Gender and diversity challenges persist stubbornly as well. The lack of disaggregated data, ongoing gender bias, and inadequate support for caregivers and underrepresented groups are all formidable obstacles.

So, what does the Policy Brief urge? First, it calls for the alignment of policies from the local to the global level, ensuring coherence and clarity throughout. Adequate resourcing—funding and dedicated staff—is non-negotiable for real progress. Open science must be woven into researcher assessment, so its value is embedded in career progression and not simply an afterthought. Investment in training, infrastructure, and ongoing community engagement is essential.

Specific measures must also address the needs of early-career researchers, caregivers, and those from marginalised backgrounds. Gender equality and diversity should no longer be side issues, but mainstreamed within open science strategies and everyday practice. Centralised infrastructure for data management and monitoring is vital, supporting the accessibility and longevity of the transition. The spirit of collaboration should extend beyond borders, with international working groups and knowledge exchange becoming routine features of institutional activity.

Sustainable transformation, as Juliana Chaves Chaparro and UNESCO make clear, is not achieved through piecemeal or tokenistic efforts. Only comprehensive, coordinated action, rooted in inclusivity, gender equality, and diversity, can truly advance open science and rebuild the public’s trust in research. Institutions are therefore encouraged to answer open calls for action, to support national monitoring initiatives, and to participate fully in evolving working groups on financing, incentives, policy, and strategy. Only by advancing together, as a genuine community, does meaningful change become not just possible, but inevitable.

Reflecting on 25 Years of the European Research Area: Insights from Rinske Van Den Berg

As the European Research Area (ERA) marks its 25th anniversary, it’s clear that the journey is far from over. At the OPUS Final Conference held in Paris in July 2025, Rinske Van Den Berg, Project Officer at the Research Executive Agency, shared her reflections on the evolving landscape of European research and the challenges that still lie ahead.

“The ERA agenda is not static. It evolves with the wishes of the Member States, who seek to further develop research, universities, research-performing organisations, and research funding organisations,” she explained. While some elements from previous ERA agendas remain, Van Den Berg emphasised that “these activities still require further implementation or alignment within the Member States. This is essential to facilitate researchers and higher education in Europe.”

Among the priorities, she highlighted two key areas: “Enabling open science and reforming research assessment remain central. We also continue to collaborate on connecting research infrastructures.” There’s a strong focus on gender equality and making research careers more attractive across the continent. “We are striving to connect activities from the European University Alliances with research and innovation, aiming to develop a fully aligned process within Europe. This involves linking local universities with regional and national authorities and connecting these to European policy decisions, forming a coherent triangle,” she said.

“There are new actions on which the ERA will focus in the coming years. You will see further developments in areas such as equity in open science, which will require more attention, and in policy and science-for-policy ecosystems. Another important area is the impact of artificial intelligence on the work and careers of researchers,” Van Den Berg noted.

None of this progress would be possible without the often-unsung heroes of research management. “All this work cannot be achieved without the support of research managers. Many of you are working hard on projects to improve the careers of research managers. From my experience in the Commission, about 80% of the people we met in meetings were not researchers themselves; they simply do not have the time. It takes a whole career to understand the policies developed in Brussels, the calls that are issued, and how to write proposals. This is not usually the work of the researcher alone, but of a dedicated team.”

“Behind every great researcher, in my view, there is a great team of people who assist, especially in matters of open science and research assessment. I am pleased that this remains a focus and is renewed on the agenda. We will continue to work together with our partners to improve the careers of research managers.”

Van Den Berg also touched on the fluidity of career paths in research. “Who in the room considers themselves a research manager rather than a researcher? I see that about 25% of you do, and that is telling. Researchers often move between roles during their careers: starting as master’s students, completing PhDs, teaching at university, perhaps taking time for family, managing teams, and sometimes returning to research. There is an overlap between the roles of researcher and research manager within individuals, and we must strive to make flexible careers possible, supported by the right assessment systems.”

On the policy front, she spoke about ongoing efforts to develop the EOSC Federation, which will “enable the exchange and reuse of data, increasing the amount and productivity of FAIR research data in Europe. It is important that results are accessible, archived, and reusable so we do not reinvent the wheel.”

Copyright and data ownership are also on the agenda. “The ERA policy agenda also seeks to develop ideas on copyright, particularly as it relates to open data. Researchers should be able to own their research where necessary. The impact of open science policies and practices on platforms such as the Open Science Intelligence Platform is also being considered.”

Van Den Berg identified the reform of research assessment and open science as the two main topics in the current ERA policy agenda. “The focus for OPUS partners in the coming decade will be on these areas, especially as we approach the final years of Horizon Europe and move towards Framework Programme 10. We are supporting and mapping changes in national frameworks and connecting them with individual research institutions. It is important not only to develop policy but also to ensure real connections between institutional, national, and European developments.”

She concluded by stressing the importance of policy input from the research community: “Policy is developed from all regions and institutions involved in research, innovation, and assessment. We want to hear from projects like yours about your evidence, what you have demonstrated, and what measures you would like us to consider in future work programmes.”

“One of the aims when we started this project was to be practical, ensuring the policy brief would help us draw together all the threads and learnings so we can provide feedback to you and our colleagues at the Commission and UNESCO.”

As the ERA continues to evolve, it’s clear that collaboration, adaptability, and input from across the research community will remain vital to shaping the future of European science.
