
OPUS Final Conference: New Approach to Research Evaluation

The OPUS (Open and Universal Science) Consortium held its Final Conference at the UNESCO International Institute of Educational Planning (IIEP) in Paris on 9-10 July 2025, marking the culmination of a transformative journey to reform research assessment systems across Europe and beyond. This two-day event brought together researchers, policymakers, and institutional leaders to shape the future of Open Science and responsible research practices.

Welcome Remarks and Opening Perspectives

The conference started with welcome remarks from three distinguished leaders who established the importance of inclusive, responsible research ecosystems. 

Corina Moya-Falcón, Senior R&D Project Manager at PLOCAN and overall Coordinator of the OPUS consortium, formally welcomed delegates to the closing event of the Open and Universal Science (OPUS) project. Her speech opened the first thematic block of the day and provided an authoritative recap of the consortium’s journey, achievements, and next steps.

Corina Moya-Falcón, OPUS Coordinator

Over the past three years, the OPUS consortium has worked tirelessly to rethink how research and researchers are assessed, promoting practices that are more open, transparent and responsible.

What began as an ambitious initiative has become a solid, evidence-based contribution to shaping the future of research assessment in Europe. Our general objective has been to coordinate and support measures that reform the evaluation of research outputs and careers within research-performing and research-funding organisations. To achieve this goal, we have brought together a diverse group of partners from across the continent: universities, funding agencies, research institutes and specialist expert bodies.

The project is organised around seven interlocking work packages, each addressing a critical facet of the open-science ecosystem. Work Package 1 mapped the state of the art, while Packages 2, 3 and 4 designed and tested interventions, defined indicators and metrics, and established robust monitoring processes within our pilot institutions.

Gustavo Merino, Director of UNESCO’s Social Policies Division, delivered keynote remarks positioning OPUS’s work within the global movement for equitable knowledge sharing. 

Dr Merino, who has served as Director of Social Policies at UNESCO since January 2024, brought his extensive background in strategic planning, policy formulation, and international development to the conference. Previously serving in senior positions within the Mexican government, including Deputy Minister of Social and Human Development, and as Director of the Investment Center Division at the Food and Agriculture Organization (FAO), Merino emphasised UNESCO’s longstanding commitment to supporting Open Science as a driver of sustainable development and social progress.

Reflecting on 25 Years of the European Research Area: Insights from Rinske Van Den Berg

As the European Research Area (ERA) marks its 25th anniversary, it’s clear that the journey is far from over. Rinske Van Den Berg, Project Officer at the Research Executive Agency, shared her reflections on the evolving landscape of European research and the challenges that still lie ahead.

Rinske Van Den Berg, Project Officer at the Research Executive Agency (REA)

“The ERA agenda is not static. It evolves with the wishes of the Member States, who seek to further develop research, universities, research-performing organisations, and research funding organisations,” she explained. While some elements from previous ERA agendas remain, Van Den Berg emphasised that “these activities still require further implementation or alignment within the Member States. This is essential to facilitate researchers and higher education in Europe.”

Among the priorities, she highlighted two key areas: “Enabling open science and reforming research assessment remain central. We also continue to collaborate on connecting research infrastructures.” There’s a strong focus on gender equality and making research careers more attractive across the continent. “We are striving to connect activities from the European University Alliances with research and innovation, aiming to develop a fully aligned process within Europe. This involves linking local universities with regional and national authorities and connecting these to European policy decisions, forming a coherent triangle,” she said.

Looking ahead, Van Den Berg noted, “There are new actions on which the ERA will focus in the coming years. You will see further developments in areas such as equity in open science, which will require more attention, and in policy and science-for-policy ecosystems, as mentioned by Gustavo Merino of UNESCO. Another important area is the impact of artificial intelligence on the work and careers of researchers.”

She was quick to point out that none of this progress would be possible without the often-unsung heroes of research management. “All this work cannot be achieved without the support of research managers. Many of you are working hard on projects to improve the careers of research managers. From my experience in the Commission, about 80% of the people we met in meetings were not researchers themselves; they simply do not have the time. It takes a whole career to understand the policies developed in Brussels, the calls that are issued, and how to write proposals. This is not usually the work of the researcher alone, but of a dedicated team.”

“Behind every great researcher, in my view, there is a great team of people who assist, especially in matters of open science and research assessment. I am pleased that this remains a focus and is renewed on the agenda. We will continue to work together with our partners to improve the careers of research managers.”

Van Den Berg also touched on the fluidity of career paths in research. “Who in the room considers themselves a research manager rather than a researcher? I see that about 25% of you do, and that is telling. Researchers often move between roles during their careers: starting as master’s students, completing PhDs, teaching at university, perhaps taking time for family, managing teams, and sometimes returning to research. There is an overlap between the roles of researcher and research manager within individuals, and we must strive to make flexible careers possible, supported by the right assessment systems.”

On the policy front, she spoke about ongoing efforts to develop the EOSC Federation, which will “enable the exchange and reuse of data, increasing the amount and productivity of FAIR research data in Europe. It is important that results are accessible, archived, and reusable, so we do not reinvent the wheel.”

Copyright and data ownership are also on the agenda. “The ERA policy agenda also seeks to develop ideas on copyright, particularly as it relates to open data. Researchers should be able to own their research where necessary. The impact of open science policies and practices on platforms such as the Open Science Intelligence Platform is also being considered.”

Van Den Berg identified the reform of research assessment and open science as the two main topics in the current ERA policy agenda. “The focus for OPUS partners in the coming decade will be on these areas, especially as we approach the final years of Horizon Europe and move towards Framework Programme 10. We are supporting and mapping changes in national frameworks and connecting them with individual research institutions. It is important not only to develop policy but to ensure real connections between institutional, national, and European developments.”

She concluded by stressing the importance of policy input from the research community: “It is very important for us, and especially for my colleagues in the Commission, to receive your policy input. I could not have presented these slides without your contributions. Policy is developed from all regions and institutions involved in research, innovation, and assessment. We want to hear from projects like yours about your evidence, what you have demonstrated, and what measures you would like us to consider in future work programmes. I appreciate that your policy briefs are integrated into your communication materials for OPUS.”

“One of the aims when we started this project was to be practical, ensuring the policy brief would help us draw together all the threads and learnings, so we can provide feedback to you and our colleagues at the Commission and UNESCO.”

As the ERA continues to evolve, it’s clear that collaboration, adaptability, and input from across the research community will remain vital to shaping the future of European science.

Main Outcomes and Key Exploitable Results

OPUS Researcher Assessment Framework (RAF)

New Approach to Research Evaluation

The centrepiece of the conference was the presentation of the OPUS Researcher Assessment Framework (RAF) by Gareth O’Neill from Technopolis Group Belgium (TGB) and Emma Day from Vitae. O’Neill, Principal Consultant on Open Science at Technopolis Group, specialises in the implementation of Open Science across Europe, particularly focusing on sustainable careers for researchers and research assessment reform.

Gareth O’Neill from Technopolis Group Belgium (TGB) and Emma Day from Vitae

Gareth O’Neill (TGB) on the OPUS Researcher Assessment Framework (RAF)

One of the main focuses was developing a framework of indicators to incentivise and reward open science. However, as we examined these indicators, it became clear that they are only one piece of the puzzle. Open science activities are not separate from other research activities; researchers are never solely engaged in open science, nor are they rewarded only for that. Therefore, we had to take a step back and consider the full range of activities researchers undertake, what they are currently assessed for, and what we would like them to be assessed for in the future—whether in project applications, job applications, or during ongoing employment. The open science component should play a major role, but it is not the only aspect.

We have not worked in isolation; there have been many steps to reach this point, and there are ongoing efforts. Comments on assessment and open science were present in the original Charter and Code for Researchers, but the revised version is more explicit. We have considered the DORA principles, the Leiden Manifesto for research metrics, and the work of the Open Science Policy Platform. Many colleagues, including Rinske and René, contributed to these efforts, particularly on indicators for research assessment.

There are numerous sets of principles, such as the Hong Kong Principles and the CoARA principles, so the question is which ones to use. The Coalition for Advancing Research Assessment (CoARA) is a major initiative aiming to reform research assessment and align organisations in a common direction. With over 750 members, it is challenging to reach agreement, but shared goals exist. We have monitored these principles and many available indicators as we developed our framework.

We established our own set of principles for the framework. We did not delve deeply into organisational values and principles, as these are usually already defined. Our focus was to provide indicators to support assessment and open science, essentially creating a toolbox. Our aim was to develop a comprehensive set of indicators, enough to capture all researcher activities, but not so many as to become unmanageable. We sought a balance, avoiding an overwhelming list of indicators.

The framework is designed to be applicable across countries, disciplines, and types of organisations, whether research-performing or research-funding. It is agnostic to context and can be applied at various levels. Importantly, we are not dealing with metrics per se, but with indicators that can be used qualitatively or quantitatively. Organisations are free to determine how to deploy them; we believe assessment should combine responsible quantitative metrics with clear qualitative guidance.

Our focus is predominantly on the assessment of individual researchers, not research quality or peer review. However, as research funding organisations often work on a project or principal investigator basis, the framework is also scalable to projects. It is not intended for monitoring large groups, faculties, or entire universities, though it could be adapted for such purposes if desired.

We sought to capture all main activities undertaken by researchers, not just publications or publication metrics such as journal impact factors, citations, or the h-index. While these remain important, we want to provide organisations with more options.

The research assessment framework considers all researcher activities, with a specific lens for open science; the two are two sides of the same coin. Indicators are separated into processes, outputs, and outcomes. A researcher may be at different stages: undertaking an activity (process), producing a tangible result (output), or achieving short- to medium-term outcomes. We do not focus on long-term impact.

Indicators are phrased to be clear yet flexible, allowing organisations to interpret and adapt them. Organisations can select, refine, merge, or subdivide indicators as needed. We do not prioritise indicators; any ranking or weighting is left to the organisation. The result is a flexible framework.

Our final version identifies four key domains of researcher activities, tested with researchers:

  1. Research: Developing proposals, methods, methodologies, protocols, data (planning, collection, management, curation, archiving), software, publications, materials (laboratory specimens, artistic objects), and reviews (publications, data, proposals).
  2. Education: Developing course programmes, resources, teaching, supervision (bachelor’s, master’s, PhD students), and skills development.
  3. Leadership: Demonstrating leadership or management (staff, projects), holding organisational roles (team leader, institute director), participating in committees, advisory boards, panels, and receiving awards.
  4. Valorisation: Science communication and popularisation (written, physical, online, recorded), engagement (industry, public sector), exploitation of research results (intellectual property), and entrepreneurship.

The framework includes two cross-cutting dimensions:

  • Openness: The Open Science Career Assessment Matrix (OSCAM2) provides an open science lens across all researcher activities.
  • Teamwork/Collaboration: Recognising that most research is collaborative, though we have not created a separate framework for teamwork.

Equality, diversity, and inclusion are overarching considerations, though not directly embedded in the indicators.

For example, in the domain of data, the process may involve developing a data management plan. Outputs are the completed plan or dataset. Outcomes include implementation, publication in repositories or journals, and indicators such as access, citation, or reuse. These can be measured quantitatively or qualitatively. Whether these steps are open depends on institutional policy.

These indicators are intended as building blocks; organisations already have frameworks and systems in place. Our indicators and ideas can help them revise and reform their systems to be more comprehensive and inclusive of open science. Deployment is left to each organisation.
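
To make the structure described above concrete, here is a minimal sketch, in Python, of how an organisation might encode RAF-style indicators as building blocks. The class and field names and the three example entries are illustrative assumptions drawn from the data example in the text, not the official OPUS data model.

```python
from dataclasses import dataclass
from enum import Enum

class Domain(Enum):
    """The four key domains of researcher activities."""
    RESEARCH = "research"
    EDUCATION = "education"
    LEADERSHIP = "leadership"
    VALORISATION = "valorisation"

class Stage(Enum):
    """Indicators are separated into processes, outputs, and outcomes."""
    PROCESS = "process"    # an activity being undertaken
    OUTPUT = "output"      # a tangible result produced
    OUTCOME = "outcome"    # a short- to medium-term effect

@dataclass
class Indicator:
    name: str
    domain: Domain
    stage: Stage
    open_science: bool = False  # the cross-cutting OSCAM2 lens

# Hypothetical entries following the data management example in the text.
catalogue = [
    Indicator("Develops a data management plan", Domain.RESEARCH, Stage.PROCESS, True),
    Indicator("Has produced a dataset", Domain.RESEARCH, Stage.OUTPUT, True),
    Indicator("Dataset is accessed, cited, or reused", Domain.RESEARCH, Stage.OUTCOME, True),
]

# Organisations select, refine, merge, or subdivide indicators; the
# framework itself imposes no ranking or weighting.
selected = [i for i in catalogue if i.domain is Domain.RESEARCH and i.open_science]
```

Deployment choices, such as whether a selected indicator is read qualitatively or scored quantitatively, would sit on top of such a catalogue and remain with the organisation.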

Emma Day (Vitae) on the interventions

Our task was to consider interventions for implementing the framework. There are many principles and frameworks in this area, but they mean little without action.

We debated what interventions were needed and, in the end, developed a set of general interventions for any organisation looking to implement the frameworks. These are starting points to consider before implementing either OSCAM2 or the RAF:

  1. Collect existing best practices in researcher assessment reform, highlighting existing initiatives and commitments.
  2. Ensure researchers understand the reasons for reforming researcher assessment.
  3. Ensure researchers, support staff, and policymakers are aware of the RAF and its implications, without overburdening them.
  4. Approve piloting or adoption of the RAF at the highest organisational level.
  5. Develop policies and procedures to implement, monitor, and evaluate the RAF, recognising that initial indicators may need adjustment.
  6. Develop or utilise tools and platforms to support RAF implementation, assessing whether existing infrastructure suffices.
  7. Ensure the RAF does not overburden researchers with additional tasks or reporting requirements.
  8. Revise organisational policies and procedures based on lessons learned, embedding the framework into broader policy.
  9. Engage policymakers to define and revise national policies on research assessment.

We identified four categories of specific interventions: policy, awareness, training, and resources.

Initially, we created a lengthy document linking interventions to every indicator, but pilot feedback indicated this was too burdensome. Instead, we now recommend policies and training that encompass all indicators at a higher level.

At the policy level, the process is to approve piloting or adoption of indicators and interventions at the highest level, define what will be piloted or adopted (as definitions vary between institutions), develop policies and procedures, and then monitor and evaluate.

Resource interventions concern the people and infrastructure required, financial input, platforms, repositories, and staff. Without adequate resources, implementation will not succeed.

Awareness interventions address whether people understand why these changes are happening, how to make them happen, and whom to approach for assistance. It is important that all relevant staff, not just researchers, are aware.

Training interventions ensure people know what to do, how to do it, and why. Training may be linked to frameworks such as ResearchComp or the Researcher Development Framework, and should focus on developing the necessary skills and competencies.

Gareth O’Neill (TGB) on work with pilot organisations

Our framework was tested by pilot organisations that did significant work. We had three research-performing organisations (Nova University of Lisbon, University of Cyprus, and University of Rijeka) and two national funding agencies: UEFISCDI in Romania and the Research Council of Lithuania. Each created cohorts of researchers, selected indicators from the framework, and deployed them with their own interventions.

Within universities, implementation varied by institute. For example, at Nova, the framework was used in the Global Health and Tropical Medicine Research Centre and the Information Management Research Centre, which have different evaluation criteria and focuses. This demonstrates the need for flexibility. In Cyprus, it was piloted at the KIOS Research and Innovation Centre of Excellence; at Rijeka, at the Faculty of Law. The funding agencies applied it to specific programmes, which are complex and evaluated over many years. For example, one agency mandated data management plans and assessed how to implement these fairly and openly.

Work with Pilot Organisations through Pilot Action and Mutual Learning

Raquel Vega from the Young European Research Universities Network (YERUN) introduced the pilot action segment, framing it as an opportunity for mutual learning and practical experimentation. Vega, who serves as Policy and Project Officer at YERUN, emphasised the collaborative effort that characterised the OPUS pilots and the importance of context-specific solutions.

Raquel Vega, Policy and Project Officer at YERUN

The pilot programme involved five organisations, three Research Performing Organisations (RPOs) and two Research Funding Organisations (RFOs), which tested the RAF through tailored action plans and co-created strategies for meaningful reform. Each pilot organisation brought unique perspectives and challenges to the implementation process.

Raquel Vega: Pilots have done all the work within this work package, working extremely hard

The main goal of WP4 was to design and implement action plans to pilot selected elements of the Research Assessment Framework (RAF), as well as the Open Science Career Assessment Matrix 2 (OSCAM2), through their accompanying interventions in five organisations. In addition, we organised mutual learning sessions to draw out results and lessons learned.

The main protagonists of WP4 are the pilot organisations themselves. They have done all the work within this work package, working extremely hard. We had three universities and two funding organisations participating as pilots.

Our work followed three main lines. First, we designed the action plans. Second, we implemented and coordinated these action plans over an 18-month period. Third, we collected lessons learned from mutual learning sessions held throughout these 18 months.

To begin with the design and implementation of the action plans, the pilots had to define, very early on, their cohorts and the units or funding programmes involved. Together with the leaders of Work Packages 2 and 3, we selected the indicators and interventions that best fit each pilot, both from a strategic and feasibility perspective. It is important to mention that, in Work Packages 2 and 3, we also carried out a baseline audit to determine the starting point for each pilot.

The full implementation period lasted 18 months. We began with the first version of the action plans, which we started implementing in January 2024. The pilots ran until June 2025. During the first nine months, we observed what was working and what was not, and we collected those learnings into a revised version of the action plans, which was ready by September 2024. This marked the end of the first stage of implementation.

In addition to the individual interventions chosen by each pilot, we also introduced some common interventions for all pilots. These focused on the links between open science and trust, open science and industry, and open science and gender equality.

Regarding mutual learning, the key process was the bi-monthly mutual learning sessions, which took place both online and in person. These sessions were essential for collecting challenges and recommendations from each pilot organisation regarding their implementation experiences.

From January 2024 until approximately September 2024, we collected all these learnings and, by the end of November 2024, produced the first version of the Mutual Learning collection of recommendations and challenges. In the following nine months, the second stage of implementation, we added further learnings from this later period. We are currently working on the final mutual learning collection of lessons learned.

We monitored progress through a process we called co-monitoring, which we conducted together with the pilots. During the first part of the implementation period, the first nine months, pilots submitted monthly reports. During the second part, reports were submitted bi-monthly. These reports were reviewed by our partner Jisc, led by our colleague Helen Clare. The reports were also instrumental for the mutual learning sessions, as they informed the guiding questions for the various sessions we held.

Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI) – Pilot Action (Romania)

Alina Irimia, Open Science Projects Coordinator at the Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI), presented Romania’s comprehensive approach to Open Science implementation. Irimia showcased UEFISCDI’s strategic alignment with Romania’s National Open Science Strategy, including the implementation of FAIR Data Management Plans.

Alina Irimia, Open Science Projects Coordinator at the Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI), Romania

We tested the OPUS Research Assessment Framework and selected one indicator to implement and test. Ultimately, this turned into an actual implementation, specifically around introducing Data Management Plans (DMPs) into a major national funding instrument in Romania.

The timing of the project was very fortunate, as it coincided with the need to implement a national strategy on open science, which states that DMPs should be mandatory for all publicly funded research. These are the first steps in putting that strategy into practice.

We tested this requirement within a major funding programme, which has a substantial budget of 320 million euros. This programme is being launched for the first time in Romania. The evaluation phase has just concluded, and we expect to fund approximately 2,024 projects. These projects will have a duration of four to five years, with the possibility of extension by up to two years.

The main interventions focused on implementing these new requirements and evaluation criteria within this funding instrument. We had to introduce these requirements into the information packages for the funding instrument, which raised several challenges. At the institutional level, we were not accustomed to such requirements, and our staff were not fully prepared to implement open science-related obligations.

We needed to raise awareness among our staff, management, and also the Ministry. Although the Ministry had approved the national strategy, it was necessary to inform them about how these requirements would affect funding streams. As a national funding agency, we are not fully autonomous and depend on ministerial decisions regarding funding, so it was essential to have their support for implementation.

There were both policy and resource-related interventions. For example, we needed to develop or adapt a DMP template. In the end, we used the Horizon Europe template and the Science Europe guidelines, adapting them to our context. We provided resources to support both researchers and staff, including guidelines, a blueprint for researchers, and a reviewer guideline for staff, translated from the Science Europe guideline, to use when reviewing DMPs. We also provided examples of open DMPs to support researchers, who, to be honest, are already overburdened by this new requirement. Considerable awareness-raising was needed.

We organised a national conference on research assessment and open science, held institutional awareness sessions, and arranged hands-on meetings with staff to discuss specific challenges related to these new requirements. In June, we also held a nationwide webinar for researchers on DMPs. This training session attracted many engaged participants, demonstrating both interest and a need for support.

The indicator we achieved in OPUS was the development of FAIR DMPs. However, regarding the DMPs actually produced in the successful projects, we will only have those results after the projects commence, as the evaluation has only just concluded.

The key outcome of this pilot is that we have moved from testing to full implementation within a major funding instrument. The awareness-raising efforts have been a significant success, leading to the extension of these requirements, and more broadly, open science requirements, to other funding instruments. We are now discussing making these practices and requirements standard for future funding programmes.

There were several challenges. Integrating these requirements for the first time led to changes in our internal processes. We needed to raise awareness and work hands-on to update processes, including IT updates and discussions about future IT needs. There was also a challenge in understanding the open science indicator itself. My institution is fortunate to have an open science team, but many funders do not have such expertise. We supported staff in understanding open science and coordinated with other departments. We also had to understand the funding streams and the impact of open science requirements on these streams.

Policy and process integration presented further challenges, particularly in securing decisions to implement the changes. There was cultural resistance from policymakers, stakeholders, and researchers. There is a lack of capacity and skills related to open science and FAIR DMPs, and we still need to work on building institutional capacity and providing support and training sessions for researchers.

Staff upskilling is essential, as is providing tools and training for researchers regarding DMP adoption. The availability of European-level guidelines and templates, such as those from Science Europe, was very helpful, and we simply adapted them to our context.

A major challenge was coordinating timelines between project requirements and the timelines of the funding streams in our national programme. Delays occurred that were beyond our control, and we had only 18 months to complete all this work. However, these efforts will continue beyond the project’s duration.

Research Council of Lithuania (RCL) Pilot Action

Julija Baniukevic from the Research Council of Lithuania (RCL) presented an exemplary community-driven approach to Open Science implementation. Dr Baniukevic, who serves as OPUS Project coordinator in Lithuania, has extensive experience in bionanotechnology research, policy advocacy, and European science cooperation.

The Lithuanian pilot implemented a comprehensive training programme over nine months, reaching 21 principal investigators from various research fields who subsequently trained 211 researchers in their teams. The programme achieved remarkable engagement levels, with 91% attending on-site training and 20 out of 21 trainers conducting sessions for their teams. Participants showed strong commitment, with 68.2% of research outputs made openly accessible and 82.6% of participants initiating open science dialogues in their institutions.

Julija Baniukevic (RCL): Why did we achieve such good results in just nine months?

When we were reorganising our approach, we had only nine months to achieve our goals, so we had to consider carefully at which level to act. We decided to focus on principal investigators (PIs) and selected one call with 136 winners. All were invited to participate; 46 expressed interest and attended the introductory session in September. Out of these, 26 began the programme, 23 continued throughout the year, and 19 completed the full nine-month programme and received certificates.

I was pleased that six Lithuanian institutions participated: four universities and two research institutes. We covered almost all research fields, with psychology representing about 18% of the participants, which was particularly interesting, though other fields were also well represented.

Julija Baniukevic, Research Council of Lithuania (RCL)

Recognising that you cannot achieve anything alone, I formed a team with support from our vice-chair, Vaiva Prudokinia, and established an Advisory Board within the Research Council. I was joined by three colleagues, Justina, Agla, and Girinta, from different departments: the Research and Business Cooperation Unit, the Researcher Support and EU Investment Unit, and the Research and Higher Education Policy Analysis Unit. Having diverse perspectives was crucial, and without these colleagues, our achievements would not have been possible.

Let me turn to our results. Our researchers, along with others trained during the OPUS pilot, participated in a nine-month programme. We selected five indicators: two in education, two in research, and one in valorisation.

Starting with education, we identified a common misunderstanding among researchers, who often equate open science solely with open access. We wanted to broaden their understanding, so we began with courses on open science. Initially, I was unsure whether these would be well received, but 91% of participants attended the on-site training, which was a pleasant surprise and made me proud of our researchers.

Over two days, we covered many topics, which was a great success and helped us build stronger connections with participants. After the initial training, participants received certificates as trainers. For the next four months, these trainers delivered training sessions to their own teams and laboratories. In total, 21 PIs trained 211 researchers, sharing knowledge and experiences. Only one person did not deliver the training, as she had not completed the full programme.

This group of researchers was enthusiastic about sharing their knowledge. Three trainers even organised a full-day conference on open science, which attracted 70 participants. We also involved one trainer in the Open Data Forum, organised by my colleague Girinta, where she represented both the OPUS project and open science in panel discussions. At our final event in June, eight cohort members shared their experiences, initiatives, and perspectives on open science, including the challenges they encountered.

These activities were a significant success, but we also identified some challenges. For example, some researchers mentioned a lack of compensation, so we are considering mechanisms for micro-rewards. There is also a lack of institutional recognition for public engagement activities, which we hope will improve. Limited understanding of the open science concept was evident, but over the nine months of our ambassador programme, we saw that researchers are eager to develop new skills and adapt open science practices to their own systems.

We also observed that personnel changes can impact pilot implementation, which is something to consider in future projects.

Turning to research indicators, we monitored the number of openly available publications. We had many discussions about what constitutes open access, including whether embargoes should be counted. In total, the cohort produced 85 publications, of which 58 (68%) were openly available. Interestingly, about 42% of researchers published exclusively in open access, while others published more than half of their work openly, though some published less. This is an interesting result, even if our sample size is small.
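
As a quick arithmetic check of the cohort figures quoted above, the snippet below recomputes the open-access share; note that whether embargoed items count as open is a policy decision made before counting, as the discussion in the text indicates.

```python
# Cohort figures as reported above: 85 publications, 58 openly available.
publications_total = 85
publications_open = 58   # treatment of embargoed items is a policy choice

share_open = publications_open / publications_total
print(f"{share_open:.1%}")  # prints 68.2%, matching the reported ~68%
```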

The fifth indicator was openly available data management plans (DMPs). We prepared recommendations for researchers, which will be added to our website. In collaboration with the Ombudsperson and her team, especially Reda, we analysed what needs improvement and how to enhance our management plans. All ten researchers agreed to make their DMPs openly available, and by the end of July, these will be published on the Research Council’s website in a dedicated section.

However, we still face issues. For example, when researchers submit proposals to the Research Council, they are required to include a DMP, but there is currently no follow-up on quality and implementation. We are working on how to address this, especially now that researchers understand the value of DMPs.

Dedicated funding for open science tasks would be very beneficial. Some researchers still view open science as additional bureaucracy, and there can be a disconnect between open science and researchers’ daily work. We need to demonstrate the advantages and relevance of open science more clearly.

Recognition and rewards are important. At the start, researchers wanted to know what they would receive for their efforts. We awarded certificates for trainers and for completing the nine-month ambassador programme. We also wanted to give them a sign of their ambassador status, but internal bureaucracy has delayed this. Nevertheless, participants gained visibility, and they appreciated small gestures such as homemade cakes and personalised awards.

Reflecting on why we achieved such good results in just nine months, I believe it was due to a clear action plan, a strong team, relevant topics, and an engaged community, all supported by the RCL administration and leadership. We see that an open science culture is beginning to take shape in Lithuania. Researchers themselves are now engaging in constructive dialogue and helping to shape open science policy at the national level. Having RCL experts from different departments was a strategic and crucial step, and the OPUS community has become a key driver of change in open science within the Research Council and across Lithuania.

University of Rijeka Centre for Research and Innovation (UNIRI) Pilot Action

Nataša Jakominić Marot, Head of the University of Rijeka Centre for Research and Innovation (UNIRI), presented the Croatian pilot’s comprehensive approach to Open Science integration. With over two decades of experience in driving innovation and research within academia, Jakominić Marot coordinates teams to achieve impactful results whilst contributing to meaningful societal change.

The University of Rijeka pilot focused on implementing Open Science initiatives across research centres, emphasising practical applications in data management, software development, and publication practices. The university’s approach included developing an Open Science Guide, designating library staff as reference points, and selecting specific repositories for different research outputs. While achieving progress in areas like open access publications and software sharing, the pilot encountered challenges including delays in national policies and difficulty retaining early career researchers.

Nataša Jakominić Marot (UNIRI): We built on existing infrastructure, such as the newly established centres

At the University of Rijeka, we have open science experts at the university library, both librarians and research managers, whom we consult when designing or implementing open science activities. These experts also contributed to the development of our open science policy. The university library has a dedicated unit, the Centre for Open Science and Information Data Management, which was crucial for OPUS and other initiatives. Shortly before the project, we established the Centre for Science Outreach, which played an important role in valorisation activities. We wanted to boost these activities and make them more central within the university.

Our pilot cohort comprised 17 early career researchers based at the Faculty of Law, focusing on social sciences. Although the pilot was designed for this specific group, all activities, except those provided specifically by the librarian at the Faculty of Law, were open to any early career or other stage researchers at the University of Rijeka. We saw no need to restrict participation, but when measuring success and tailoring activities, we focused on our project cohort.

The key actions we undertook were aligned with the four domains presented by Gareth and Emma: research, education, valorisation, and leadership. Our activities did not cover leadership. Some activities were relevant to more than one domain, and I will highlight what the three units, the Faculty of Law, the Science Outreach Centre, and the University Library Centre for Open Science, implemented. The circular symbol on our materials indicates activities that will be sustained in the future, as they have generated significant interest and will become regular university activities.

Nataša Jakominić Marot, Head of the University of Rijeka Centre for Research and Innovation (UNIRI)

The Centre for Open Science at the University Library began organising Open Science Cafés. We have held six so far, with a seventh planned soon. These cafés address topics requested by early career researchers, and are delivered by our librarians or external experts. We decided to hold these events online, based on feedback from the cohort, to save travel time and allow greater participation. This was a change introduced during the project.

We also provide twice-yearly training on open science skills, which will continue in the future. These comprehensive sessions cover six modules on various aspects of open science and have received positive feedback. The University Library also created a targeted web page specifically for early career researchers, featuring information on training, calls for proposals, and opportunities relevant to open science. This resource has been highly valued by our cohort.

At the Faculty of Law, we have a dedicated librarian who is both a researcher and research manager. She complemented the university library’s training by providing individual sessions or consultations as needed. She delivered training on open science publication skills and, importantly, on repositories. Although we have national and institutional repositories, they can be difficult to navigate. She worked individually with early career researchers to train them in their use, saving them time and facilitating their work.

We originally planned for 65–70% of our cohort to be trained in repositories, but as the university issued a call for funding proposals that required repository use, all cohort members required this training. This demonstrates that when a skill becomes necessary, researchers prioritise it, and we achieved more than anticipated.

For valorisation, the Science Outreach Centre led two types of activities: awareness-raising for researchers and public speaking. We trained researchers on the importance of outreach and on using the project repository. We also provided training in outreach activities, recognising that researchers need to develop these skills. An expert delivered two outreach training sessions during the project.

We built on existing infrastructure, such as the newly established centres, to maximise their value for the university and to embed activities into long-term institutional goals. A key advantage was having a core team member, Sasha, who was formerly vice director. This ensured full support from university management, which is essential for success.

We also created a repository for outreach activities, though it was developed by non-experts due to limited funds. In future, we hope to involve more experts and allocate funds for such work. We used common tools such as YouTube and WordPress, as we could not afford dedicated IT support.

There were many challenges and lessons learned. For any organisation wishing to implement similar activities, I recommend careful consideration of these lessons. Collaboration with the three university centres was vital; without them, we would not have achieved our goals. We assigned partial working hours to library staff for OPUS and open science, but their main duties remained library work, making it difficult to dedicate sufficient time to the pilot. In future, it would be beneficial to have dedicated units and explicit time allocations in contracts, as well as support and incentives for staff. Enthusiasm is important, but it is not enough for sustained, long-term activities.

Open science is known in Croatia, but practices vary greatly between disciplines. Training is available, but not mandatory, so implementation often depends on the team leader or principal investigator. We aimed to change this by focusing on early career researchers, believing that bottom-up influence can be more effective than targeting senior staff.

From the outset, we involved all relevant stakeholders, including the cohort and centre staff, and designated a contact person in each unit. We held an initial meeting to explain roles and reporting, and maintained regular monthly contact, sometimes more frequently, to gather feedback and adjust as needed. This continuous risk mitigation strategy contributed to our success.

For early career researchers, we addressed the lack of tailored support by creating the Edu Doc web page, a one-stop shop for information, contacts, and resources. In terms of rewards, we could only offer certificates, but I strongly recommend more generous recognition in future projects. We hope to introduce more incentives, and our colleague Sasha has worked to influence national research assessment frameworks, advocating for the inclusion of both qualitative and quantitative measures and for open science. We are not yet certain of our impact, but we believe our voice is being heard.

Sustainability was a priority. The timing was ideal, as we had just joined CoARA and were developing our CoARA Action Plan and renewing our HR Strategy for Researchers, which required new analyses and action plans. We have also continued to apply for projects, including the CoARA Boost project with the University of Cyprus, and are now in the implementation phase.

In summary, even if activities are not mandated at the national level, institutions can take action. One should aim for evolution, not revolution: small steps can lead to meaningful change. Academic cultures are difficult to shift, but dialogue and communication are essential. Our activities have increased the university’s visibility nationally and internationally.

NOVA University Lisbon (UNL) – Pilot Action

Isabel L. Nunes, Vice-Rector of NOVA University Lisbon responsible for Strategic Planning and Quality Management, demonstrated how the RAF guides institutional policy design and provides concrete support to researchers. 

Isabel L. Nunes, Vice-Rector of NOVA University Lisbon

NOVA University Lisbon’s implementation focused on translating the RAF into practical institutional policies and procedures. The university’s approach emphasised providing concrete support to researchers whilst ensuring alignment with European initiatives and national strategies. 

Creating a comprehensive open science strategy for the University will be essential to ensure the sustainability of these practices, which was one of our main goals.

Isabel L. Nunes, NOVA University Lisbon: We tested and adapted open science practices in different research contexts

To meet our objectives, we implemented our pilot across two distinct research centres, as already mentioned by Gareth: the Global Health and Tropical Medicine Research Centre and the Information Management Research Centre. These centres have different disciplinary profiles, ensuring diversity in needs and perspectives.

Our cohort consisted of fifteen researchers, mostly early career, from these two research centres. The aim was to test and adapt open science practices in different research contexts, promoting broad adoption across the institution after the project’s conclusion. The pilot focused on four main categories: data, software, publications, and engagement.

The interventions covered policy development, human resource allocation, repository selection, awareness raising, and training workshops.

Some of the key actions we implemented over the eighteen months of the pilot included the following. In the absence of a national open science policy on data, software, and citizen science, NOVA developed and launched the NOVA Open Science Guide to support researchers and ensure alignment with EU standards. This was coordinated by my colleague, Isabel Hoch, Vice-Rector for Research and Innovation.

Regarding human resources, library staff were designated as open science reference points, guiding researchers and delivering training and awareness sessions. This demonstrated that internal staff can effectively support open science practices.

NOVA also defined specific repositories for different research outputs: Zenodo for depositing research data, GitHub for managing and sharing software, and RUN, NOVA’s institutional repository, for open access publications, which is integrated with the national repository portal. PURE, our CRIS system, is used for internal monitoring so we can track open science research outputs.
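To make this repository workflow concrete, the sketch below shows one way a dataset deposit could be scripted. It is an illustration only, not NOVA’s actual procedure: it assumes Zenodo’s public REST deposit API and a personal access token, and the file name and metadata values are placeholders.

```python
# Minimal sketch: depositing a dataset via Zenodo's public REST API.
# The token, file name, and metadata values below are placeholders.
import requests

ZENODO_API = "https://zenodo.org/api/deposit/depositions"
TOKEN = "YOUR-ZENODO-TOKEN"  # placeholder personal access token

# 1. Create an empty deposition.
r = requests.post(ZENODO_API, params={"access_token": TOKEN}, json={})
r.raise_for_status()
deposition = r.json()

# 2. Upload a data file to the deposition's file bucket.
bucket_url = deposition["links"]["bucket"]
with open("dataset.csv", "rb") as fp:
    requests.put(f"{bucket_url}/dataset.csv",
                 data=fp, params={"access_token": TOKEN}).raise_for_status()

# 3. Attach minimal descriptive metadata.
metadata = {
    "metadata": {
        "title": "Example research dataset",
        "upload_type": "dataset",
        "description": "Dataset produced during the pilot.",
        "creators": [{"name": "Doe, Jane",
                      "affiliation": "NOVA University Lisbon"}],
    }
}
requests.put(f"{ZENODO_API}/{deposition['id']}",
             params={"access_token": TOKEN}, json=metadata).raise_for_status()
```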

To promote awareness and training, we organised awareness sessions focused on institutional expectations and open science values, as well as seven practical workshops on topics such as open science publication, data management, software sharing, citizen science, and the use of PURE.

We monitored several open science indicators through the OPUS pilot, comparing the starting point in January 2024 with the results achieved by June 2025. Regarding openly available data sets, the Global Health and Tropical Medicine Research Centre began with none and reached the target of three data sets. The Information Management Research Centre also started with none and managed to publish one data set, which, although short of the target of three, reflects their more software-driven focus.

For software outputs, the Information Management Research Centre achieved their full target of three software outputs, demonstrating strong engagement with FAIR and open source practices. In terms of open access publications, both research centres started with 68% of their publications openly available, and the cohort successfully achieved one hundred open access publications.

Regarding citizen science, no new citizen science projects were launched during the project, but we did produce three materials and conduct eight citizen engagement activities. This shows partial progress and highlights the challenge of launching fully fledged citizen science projects within a short timeframe, particularly with limited resources and early career researchers. Overall, these results demonstrate NOVA’s progress in key open science areas, especially open access and software, and provide insight into where further institutional support and alignment are needed.

During this journey, we faced several challenges, which provided valuable insights. The first challenge was the delay in national policies for open science. To address this, NOVA produced an internal open science guide, which was essential in supporting researchers in the absence of a national framework.

Another challenge was the lack of dedicated funding and human resources for open science implementation. Library staff served as our open science reference points and did an excellent job, but more support is needed. Retaining early career researchers was also difficult due to contract instability; we learned that sustainable participation requires more stable career pathways and greater institutional support for early career researchers.

Limited awareness of institutional practices among researchers was another challenge. We believe that introducing awareness campaigns is key to embedding open science as a shared institutional value.

In the second phase of implementation, around the ninth month, we experienced low engagement in training due to academic overload, as it coincided with student evaluations and project grant applications. We learned that training should be aligned with the academic calendar and should also offer asynchronous options, such as short videos available online at any time.

Across our cohort, we identified inconsistent use of repositories. Standardising repositories, such as Zenodo and GitHub, would improve monitoring and visibility of open science practices. The absence of a clear repository for citizen science activities was another challenge, indicating that repository solutions must be planned early and supported by policy and infrastructure guidance.

At NOVA, we chose to reward researchers participating in the pilot by considering open science practices in their career progression. However, open science contributions are not yet fully integrated into researchers’ career evaluations. To overcome this, we believe that including open science indicators in the evaluation framework could promote the adoption of open science practices.

University of Cyprus (UCY) – Pilot Action

Panagiotis Moiras, MSc, BA (Hons), MCIM, Senior Research and Project Management Specialist at the University of Cyprus (UCY), presented the Cyprus pilot’s focus on integrating the RAF into technical practices. Moiras, who serves as Director of the Research and Innovation Support Service (RISS), brings more than 15 years of professional experience in project management and communications consultation.

Panagiotis Moiras, MSc, BA (Hons), MCIM, Senior Research and Project Management Specialist at the University of Cyprus (UCY)

The Cyprus pilot, centred on the KIOS Centre of Excellence, focused on integrating the RAF into practices around publication drafting, data management, and software development. The university has played a central role in promoting Open Science principles both institutionally and nationally: it became the first institution in Cyprus to sign the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities in 2008, and it played an instrumental role in shaping the national policies that led to the approval of Cyprus’s open science frameworks in 2016 and 2022.

Panagiotis Moiras, UCY: Theoretical reforms were put into practice, tested and refined.

The pilots were the operational backbone of the OPUS project, where theoretical reforms were put into practice, tested and refined. Their success was critical to ensuring that the broader goals of aligning research assessment were met.

This succinctly reflects our journey at the University of Cyprus over the past two years as one of the designated pilot institutions. As a mid-sized, research-intensive university, the University of Cyprus faced both the challenge and the opportunity of translating aspirational reform into functional change.

Our efforts were implemented through the KIOS Research and Innovation Centre of Excellence, the largest information and communication technology research centre in Cyprus.

Turning to the core of our pilot actions, we undertook a multidimensional pilot as part of OPUS. The pilot was not only an opportunity to contribute to European research assessment reform but also to test internally how open science principles could be gradually institutionalised in a research-intensive environment. Our focus was on developing a contextualised, evidence-based, and action-oriented plan aligned with the University’s broader research strategy and the objectives of OPUS.

At the heart of the University of Cyprus pilot was a defined cohort of twelve early-career KIOS researchers. The steps we followed illustrate a structured and collaborative approach to preparing for implementation, ensuring that key roles, criteria, and internal consensus were established early.

The ambition was to create a microcosm of institutional change. The pilot was based on the OPUS Researcher Assessment Framework. Of the RAF’s four categories, our pilot focused on the research category and included actions across all five intervention areas: policy, resources, repository infrastructure, awareness raising, and training. Specifically, we focused on the subcategories of data, software, and publications, and their corresponding indicator groups.

Policy interventions began with a kickoff meeting of the KIOS Open Science Committee to establish a shared vision and commitment. A significant milestone was the identification and commitment of the twelve early-career researchers forming our core pilot group. We reviewed and identified existing institutional procedures that required amendment to better align with open science principles and to facilitate the pilot’s implementation. To reinforce institutional backing, we held a key meeting with the University’s leadership team, securing their support for the pilot.

An important initiative was the adaptation of the Personal Career Development Plan (PCDP) forms to include explicit references to open science practices. In parallel, the annual Open Science Award at KIOS was restructured to better reflect meaningful open science contributions by incorporating open science indicators.

Resource interventions included the introduction of a new, voluntary role: the OPUS Champions, now called Open Science Champions. We trained two early-career researchers to act as focal points and ambassadors for open science within and beyond the organisation, providing guidance and support. We also assessed and allocated the necessary resources—time, tools, and institutional support. The Open Science Champions underwent a specifically tailored training programme.

Recognising the limitations of available resources, our action plan emphasised strategic planning and ad hoc resource allocation. This included identifying additional funding sources, submitting synergistic proposals, and developing low-cost, high-impact initiatives as proof-of-concept interventions.

Repository interventions focused on building bridges between existing structures. We established a communication channel between our institutional repository, GNOSIS, and those already used by KIOS. Specific actions aimed to improve the visibility and usability of current repositories and to align metadata practices with FAIR principles.
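To illustrate what aligning metadata with FAIR principles can involve, here is a minimal sketch of a machine-readable record for a software output. The field names loosely follow the DataCite metadata schema, and every value is hypothetical rather than taken from GNOSIS or the KIOS repositories.

```python
# Minimal sketch of a FAIR-oriented metadata record for a software output.
# Field names loosely follow the DataCite schema; all values are hypothetical.
record = {
    # Findable: a persistent identifier (placeholder DOI).
    "identifier": {"identifier": "10.5281/zenodo.0000000",
                   "identifierType": "DOI"},
    "creators": [{"name": "Doe, Jane", "affiliation": "KIOS CoE"}],
    "titles": [{"title": "Example sensor-fault detection toolkit"}],
    "publicationYear": "2025",
    "resourceType": {"resourceTypeGeneral": "Software"},
    # Reusable: an explicit licence statement.
    "rightsList": [{"rights": "MIT License"}],
    # Interoperable: typed links to related resources.
    "relatedIdentifiers": [
        {"relatedIdentifier": "https://github.com/example/toolkit",
         "relationType": "IsSupplementTo",
         "relatedIdentifierType": "URL"},
    ],
}
```

The persistent identifier supports findability, the explicit licence supports reuse, and the typed links between the software and its source repository support interoperability.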

Awareness-raising interventions included several activities to foster broader institutional engagement. We launched an introductory seminar on open science to kick off the pilot, followed by targeted communications such as newsletters and social media campaigns to make open science principles more visible and accessible. We participated in thematic events, such as Open Science Week and Open Science Data Day, which also helped demystify open science for non-research staff and administrative stakeholders.

Training and capacity building involved a comprehensive set of activities: an introductory seminar, targeted training on tools and repositories, and a workshop for the Open Science Champions. This “train the trainer” intervention focused on strengthening communication and soft skills through interactive role-play scenarios. The training prepared participants to navigate diverse perspectives within the academic community. Our final training focused on developing an open science narrative CV. We also encouraged participation in other relevant training organised by stakeholders in the ecosystem, such as the UFA Open Science programme and online seminars and workshops held from November 2024 to January 2025.

Metrics and Outcomes

Marius Kyriakou, Early Career Researcher, KIOS Centre of Excellence

Thank you for the opportunity to join this project and present our results. For the publication drafting indicator group, we set initial targets for months 6, 12, and 18, and achieved our goals. For the data management indicator, we met the target for the number of downloads and came close to targets for other metrics. For the software development indicator, we greatly exceeded two of the four metrics. Researchers became more familiar with open science tools, and we saw more active engagement in open practices. Overall, this was a valuable experience for us at the KIOS Centre of Excellence.

Challenges and Lessons Learned

As with any reform effort embedded in a working institution, we encountered a range of technical, cultural, and structural barriers. Resistance to policy change was one of the most notable challenges. While many approached the interventions with interest, others raised concerns. It became clear that, beyond formal policies, real progress requires open dialogue, relatable examples, and a foundation of trust. Reform is most successful when it involves the academic community from the outset; bottom-up participation and recognition of researchers’ concerns can greatly smooth transitions.

Our pilot highlighted the need for more structured mechanisms to collect and monitor relevant indicators. While there is increasing emphasis on metrics, existing infrastructure and internal policies do not yet fully support systematic tracking. The lesson learned is that metrics are powerful tools, but only if supported by clear policies, trained personnel, and digital systems that make their collection and interpretation routine and meaningful.
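As one hedged example of what routine, systematic tracking could look like, the sketch below computes an open access share from a CSV export. The file layout and the “access” column are assumptions made for illustration, not a description of any existing institutional system.

```python
# Minimal sketch: computing an open-access share from a hypothetical
# CRIS export (one row per research output, with an "access" column).
import csv
from collections import Counter

def open_access_share(path: str) -> float:
    """Return the fraction of outputs whose access status is 'open'."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            counts[row["access"].strip().lower()] += 1
    total = sum(counts.values())
    return counts["open"] / total if total else 0.0

# Example use: print(f"{open_access_share('outputs_2024.csv'):.1%}")
```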

As with many pioneering efforts, our pilot unfolded in a landscape of evolving capacity and support. For open science to move beyond rhetoric, institutions must strategically plan for human and financial resources, including lobbying for national-level support and seeking synergistic projects.

Main Achievements

Despite these challenges, our pilot at the University of Cyprus produced a substantive set of achievements:

  • Establishing a governance model: The KIOS open science function proved highly effective and can be replicated across departments and institutions.
  • Institutional learning: We observed increased institutional literacy around open science within our academic community.
  • Changes in research evaluation: By embedding open science criteria into PCDP forms and reinforcing open science with specific tangible and intangible awards, we fostered a process that rewards transparency and collaboration.
  • Progress towards CoARA alignment: Our pilot positioned the University to align with CoARA principles, including a shift from proxy metrics to value-based assessments.
  • Building momentum for cultural change: With the Open Science Champions leading the initiative, we created a new cohort of early-career researchers who are now both participants and advocates of open science. Their increased visibility, leadership, and reputational standing serve as powerful motivators and drivers of cultural change.

Reflecting on the OPUS pilot at the University of Cyprus, it was not just about testing tools or methodologies, but about surfacing assumptions and revealing where institutional inertia lies and where innovation can thrive.

Looking ahead, we have identified five milestones for our ongoing journey:

  1. Institutionalisation: Embed successful interventions into permanent structures.
  2. National engagement: Engage with national agencies and funders to align reward systems.
  3. Broaden participation: Scale these practices to other departments within the university.
  4. Invest in capacity: Improve infrastructure and develop tools, and assess human capital requirements.
  5. Learn from others: Continue engaging with other OPUS pilots, building on synergies and mutual learning.

At the University of Cyprus, we are proud to have contributed to this transformation and to the OPUS project, and we look forward to the future with both pride and anticipation. 

Beyond the Project Pilots: PLOCAN Action

Corina Moya-Falcón also presented PLOCAN’s approach, which extended beyond the standard pilot framework, embedding Open Science principles in marine data management and software development for cutting-edge marine research.

Corina Moya-Falcón, PLOCAN

As project coordinator, Moya-Falcón demonstrated how PLOCAN’s action transcended typical pilot limitations by integrating Open Science into core research infrastructure and data management systems.

Corina Moya-Falcón, PLOCAN: We have included open science practices in our strategic plan

During the development of this project, PLOCAN, as both a partner and coordinator, has learned a great deal and has also applied several interventions. I will share some of these interventions and how we have implemented them.

For those unfamiliar with PLOCAN, it is a research infrastructure located in the Canary Islands, devoted to marine observation. We offer various services to universities and other research organisations and are involved in numerous R&D projects. Our team consists of around seventy people, mainly research managers and technicians, many of whom wear two hats as both researchers and research managers, as Rinske mentioned earlier.

PLOCAN actively incorporates open science principles into its organisation, with a particular focus on marine data management and software development for innovative marine research solutions. We have applied interventions at different levels: policy alignment, awareness and training, and repository interventions.

To highlight some actions in terms of policy intervention, we have included open science practices in our strategic plan, which is crucial and aligned with the senior management of PLOCAN. This alignment is essential for applying these principles, particularly regarding data management and quality. We have developed protocols for data management, as we work with a significant amount of sensor data that is valuable for researchers worldwide. It is important to provide this data in accordance with FAIR principles so it can be widely used.

In terms of technological infrastructure, we have trained our staff to use cloud platforms for uploading marine data, which is vital for our research. Regarding data usability and accessibility, we have developed interactive visualisation tools to facilitate data access and analysis.

On the innovation and community engagement front, we have developed open source software and shared it with the scientific community. We have also organised open science and training workshops for our staff during the execution of this project.

We have faced several challenges and learned important lessons. First, institutional commitment is a key driver of change. Continuous training for our team members is also essential. The development of interactive tools has improved accessibility to our marine data. We have learned a great deal by observing how other pilots have implemented interventions in their organisations, and we have realised that aligning with European initiatives is also very important for our institution.

Q&A Session

Maya Sundukova (Marie Curie Alumni Association):
How did you, the pilot organisations, choose the indicators and interventions? Did you base your choices on your own audit or knowledge of your community? Each organisation has its own priorities and values, so how did you decide, and what would you do differently now?

Alina Irimia (Executive Agency for Higher Education, Research, Development and Innovation Funding, Romania):
As I mentioned earlier, the timing was fortunate because we had to implement a national strategy that included objectives such as FAIR data, open access, and promoting citizen science. We chose our indicators based on these priorities, following the principle of “as open as possible, as closed as necessary” for data. We wanted to align research data management with FAIR principles and provide support on what to open and what not to open. In Romania, we do not have a national open access mandate, so we are taking a step-by-step approach, starting with data management plans (DMPs) and gradually implementing the national strategy.

Julija Baniukevic (Research Council of Lithuania):
We based our choices on the renewed national policy on open science, which was updated last September. Our actions were aligned with the national open science agenda.

Gareth O’Neill (TGB):
It is important to remember the limitations of the project. OPUS lasted three years, with the first year devoted to developing the framework. The pilots then had to select indicators, decide on relevant interventions, and embed these in their institutions. They also had to recruit cohorts of researchers to participate voluntarily, as participation could not be mandated. All of this had to be implemented and monitored within about 18 months, so choices were inevitably shaped by what was feasible within the project’s structure. The more indicators and topics you select, the more infrastructure and support is required. Achieving this within the timeframe, even with existing infrastructure, was a significant challenge.

Reda Cimmperman (Research Council of Lithuania):
As Julija mentioned, changes in regulation and personnel meant that we sometimes had to change our indicators during the project. Despite the short timeframe, a great deal was accomplished. Thank you to everyone who contributed and shared expertise.

James Morris (Science Europe):
What strikes me across all the pilot presentations is the underlying emphasis on training, guidance, and community building. Can these actions be repurposed or adapted for future interventions? Will it be easier to undertake further actions towards open science based on your experience?

Nataša Jakominić Marot (University of Rijeka):
Theoretically, yes, but we need resources and institutional alignment. We would need to involve more units and dedicate more resources to scale up these actions.

Isabel Hoch (NOVA University Lisbon):
We share the same perspective. As we await national guidelines and implement our open science policy, we believe these practices can be institutionalised across all research units.

Panagiotis Moiras (University of Cyprus):
Previously, open science was perhaps seen as a buzzword by many in our academic community. Through specific actions in the project, it became more visible and tangible. These interventions must be repurposed and replicated, as change is gradual and requires repeated efforts.

Alina Irimia (UEFISCDI, Romania):
Our OPUS experience now extends to other funding programmes. Projects like OPUS provide evidence that supports implementation. For open science, we need this kind of evidence to show what works and to engage with other funders, exchanging expertise to impact the national level.

Lessons learnt on how to better support OS policies at institutional level: OPUS Policy Brief

Supporting Open Science at Institutional Level

Juliana Chaves Chaparro, Senior Consultant at UNESCO’s Management of Social Transformations Programme (MOST), presented the OPUS Policy Brief on lessons learned for supporting Open Science policies at institutional level. Dr Chaves Chaparro brings nearly 25 years of experience in science-policy-society interface promotion, particularly in the Global South, with an advanced degree in Environmental Sciences and an ongoing PhD in Sociology.

Juliana Chaves Chaparro, Senior Consultant at UNESCO’s Management of Social Transformations Programme (MOST)

Her presentation emphasised the importance of aligning policies across local, institutional, national, and European levels whilst integrating gender equality and diversity measures into Open Science initiatives. The policy brief concluded with recommendations for transforming barriers into policy actions, including developing coherent policies, securing resources, and supporting early career researchers and diversity.

Juliana Chaves Chaparro (UNESCO): Recommendations for transforming barriers into policy actions

The 2021 UNESCO Recommendation on Open Science extends the vision of equitable and accessible science. In this context, the OPUS Policy Brief focuses on how to transition to open science at the institutional level, drawing on lessons from five OPUS case studies. The brief addresses cultural, structural, and policy-related challenges, identifying key barriers and offering recommendations for local, institutional, national, and global levels.

Key Barriers and Challenges

  • Fragmented Policies and Teams: Institutional change is often hindered by fragmented policies and lack of coordination between teams. Some pilots faced limitations due to underdeveloped national policies or restricted mandates from ministries.
  • Resource Constraints: Limited resources, both financial and human, impede the effective implementation of open science.
  • Resistance to Change: Both conscious and unconscious resistance persists, rooted in traditional assessment criteria and status quo biases.
  • Precarity and Retention: High mobility, job insecurity, and low retention—especially among early career researchers and women—undermine long-term commitment to open science.
  • Recognition and Incentives: Open science contributions are often undervalued in career progression, leading to increased unpaid workload and limited motivation.
  • Technical and Capacity Gaps: Inconsistent technical expertise and limited capacity in social sciences and humanities restrict engagement with broader society.
  • Cultural and Structural Inequities: Disparities in access to resources and support, along with gender and diversity gaps, persist across institutions.

Good Practices and Lessons from OPUS

  • Participatory and Inclusive Processes: Early and active involvement of researchers, managers, and librarians in policy design and capacity building is crucial.
  • Diversity and Representation: Efforts to include a mix of career stages, backgrounds, genders, and underrepresented groups enhance policy relevance and effectiveness.
  • Leadership and Institutional Commitment: Visible support from top management, allocation of resources, and recognition of participants’ efforts are essential.
  • Ambassadors and Champions: Initiatives such as open science ambassadors or champions, as seen at the University of Cyprus, foster community engagement and advocacy.
  • Interim Policies: Where national frameworks are lacking, interim institutional policies help sustain open science practices and commitment.
  • International Exchange: Cross-country collaboration and knowledge exchange motivate and reward participants, enriching the implementation process.
  • Integration into Assessment: Embedding open science in research assessment frameworks, alongside capacity building and infrastructure, supports institutional change.

Remaining Challenges

  • Partial Understanding of Open Science: Open science is often limited to open access and a few additional activities, rather than a holistic approach as recommended by UNESCO.
  • Short Project Duration: The three-year timeframe of OPUS limited the scope for deeper transformation.
  • Policy Coherence: Lack of alignment between national and institutional policies makes it difficult to sustain progress.
  • Risk of Losing Momentum: Without coherent mandates and support, researchers may lose interest, jeopardising long-term change.
  • Gender and Diversity Issues: Persistent gender bias, lack of disaggregated data, and insufficient support for caregivers and underrepresented groups remain significant barriers.

Policy Recommendations

  • Develop and Align Policies: Ensure coherence across local, institutional, national, and European/global levels.
  • Secure Resources: Allocate dedicated funding and staff to support open science implementation.
  • Embed Open Science in Assessment: Integrate open science indicators into researcher evaluation and career progression frameworks.
  • Build Capacity and Awareness: Invest in sustained training, infrastructure, and community engagement.
  • Support Early Career Researchers and Diversity: Provide targeted support for early career researchers, caregivers, and underrepresented groups.
  • Promote Inclusivity: Mainstream gender equality and diversity in open science strategies and practices.
  • Centralise Infrastructure: Develop centralised, accessible infrastructure for data management, repositories, and monitoring.
  • Foster International Collaboration: Encourage participation in open science working groups and international knowledge exchange.

Sustainable transformation towards open science requires comprehensive, coordinated action across all levels. Piecemeal or selective adoption of open science practices is insufficient for real change. Advancing together—integrating gender equality, diversity, and inclusivity—will foster greater trust in science and broader societal engagement. Institutions are encouraged to participate in open calls for action, contribute to national monitoring systems, and engage with ongoing working groups on financing, incentives, policy, and strategy.

Research Assessment and Open Science: Reflections from the OPUS Panel

The Tension at the Heart of Research Assessment

The OPUS Final Conference’s panel on “Research Assessment and Open Science” brought together a formidable line-up of voices: policy architects, funders, academics, and early career advocates. Together they tackled the persistent disconnect between how research is assessed and how it ought to be done in the 21st century. With Open Science principles gaining traction in policy, the reality remains fraught with resistance, inertia, and a striking misalignment between ideals and incentives.

Misalignment at the Core: Ana Persic’s Perspective

Ana Persic (UNESCO) opened with a candid diagnosis: the current incentive system is fundamentally out of sync with Open Science. Drawing from her experience with the 2021 UNESCO Recommendation on Open Science, Persic highlighted that while the values of transparency, equity, inclusiveness, and collective benefit are now enshrined in global policy, the rewards and awards systems lag behind. For many institutions, operationalising these values means overturning deeply entrenched practices—a daunting prospect.

She pointed out that while Europe, under the CoARA initiative, is making strides, many regions have yet to start this conversation. The Recommendation itself is the product of years of dialogue with thousands of stakeholders, underscoring the universality of these values, but also the scale of the challenge in putting them into practice.

Integration, Not Isolation: James Morris on the Funders’ View

James Morris (Science Europe) offered a pragmatic take from the perspective of research funders. For Morris, the key challenge is integration: Open Science cannot be bolted on as an afterthought. It must be woven into career progression, project assessment, and the broader fabric of research culture.

He shared findings from a recent survey of Science Europe’s members: while open access and data sharing are well-supported, newer elements like open peer review and citizen science remain largely aspirational, appearing in strategy documents but lacking operational support. Morris illustrated the difference between compliance and genuine culture change with a simple question: if a researcher continues to write data management plans after moving to an institution where they are not required, has the culture changed? If not, the system is still stuck in a compliance mindset.

Equity and Access: Louise Bezuidenhout’s Challenge

Louise Bezuidenhout, representing Leiden University and the UNESCO Chair on Diversity and Inclusion in Global Science, shifted the focus to equity. She argued that openness is not synonymous with accessibility or fairness. Years of research have shown that current research assessment infrastructures often exclude scholars from low and middle-income countries due to language barriers, digital divides, and irrelevant metrics.

Bezuidenhout dissected the UNESCO Recommendation’s four core values, noting that only transparency aligns with traditional academic norms. The others (equity, collective benefit, inclusiveness) demand a shift from individual achievement to collective responsibility. This cultural transformation, she argued, is at the heart of the difficulty in reforming research assessment.

Quality and Societal Relevance: René von Schomberg’s Systemic Critique

René von Schomberg (RWTH Aachen University) delivered a robust critique of current assessment systems. He argued that the prevailing reward structures undermine both the quality of science and its relevance to society. Citing a former Nature editor’s claim that 60% of articles in the journal have irreproducible data, he illustrated the consequences of prioritising competition and output over reproducibility and societal value.

Von Schomberg was particularly critical of the gold open access model, which he said incentivises quantity over quality and has fuelled the rise of predatory publishing. His solution? Rethink funding structures entirely: shift from lump-sum allocations to mission-based funding, and assess researchers by their contribution to societal missions, not just publications or citations.

The Voice of Early Career Researchers: Pil Maria Saugmann

Pil Maria Saugmann (Eurodoc) spoke passionately about the realities faced by early and mid-career researchers. She reminded the audience that Europe’s research and higher education system has a democratic mission, yet the changes OPUS has achieved in pilot institutions remain fragile, vulnerable to shifts in leadership and policy.

Saugmann called for deep structural change, not just policy tweaks. She emphasised the need for reform to be visible in job calls, hiring, and funding processes, the very mechanisms that shape academic careers. Precarity remains a central issue, as most assessment happens at the early and mid-career stage, where the stakes are highest. She also highlighted the persistent exclusion of early career researchers from academic governance, despite significant changes in the researcher landscape over the past quarter-century.

Trust, Economics, and the Cost of Inertia

Other voices in the discussion, such as Pierre Winicki from Trust Inside and Sasa Zelenika from the University of Rijeka, broadened the debate. Winicki introduced the concept of a “tree of trust,” arguing that much resistance to Open Science is rooted in a culture of distrust, manifested in competition over collaboration and a lack of shared rules. He challenged the community to quantify the cost of distrust in science, suggesting that only then might leaders be compelled to act.

Zelenika underscored the economic inertia in the system, pointing out that major publishers generate billions in revenue while researchers do the work. He was sceptical that short-term, project-based reforms could deliver lasting change, arguing that meaningful transformation requires sustained investment and commitment.

A Call for Structural Courage

The panel concluded with a clear message: The current research assessment system is not just outdated, but an active barrier to Open Science. The challenge is not a lack of ideas or values, but a lack of structural alignment, inclusive governance, and institutional courage.

Systemic problems (exclusionary infrastructures, entrenched metrics, misaligned incentives) cannot be solved by simply adding Open Science to policy checklists. True progress will require coordinated, structural transformation: new ways of funding, assessing, and recognising research.

The OPUS project has proven that change is possible, but as the panellists reminded us, pilots and principles are not enough. What’s needed now is leadership, trust, and the collective will of the research community to stop waiting for permission and start shaping the future.

See the full report from the panel session here.

The OPUS Project: A Journey to Transform Research Assessment

After a day filled with presentations, panel discussions, and collaborative dialogue, three distinguished speakers took to the stage to reflect on what had been accomplished and what lay ahead for the future of research assessment.

A Foundation Built on Collaboration

Lamine Diagne, Senior Programme Officer at UNESCO’s Management of Social Transformations (MOST) Section, opened the closing remarks with a sense of both celebration and purpose. Standing before an audience of researchers, policymakers, and innovators from across the globe, he painted a picture of transformation that extended far beyond the conference room.

“Today we are not only celebrating the impressive outcomes of the OPUS project, but also reaffirming our shared commitment to advancing more open and responsible research and innovation,” Diagne declared, his words carrying the weight of years of collaborative effort.

The journey, he explained, had begun in 2017 with the RRING group of projects, creating a foundation upon which OPUS had built something remarkable. This wasn’t simply about European achievement—it was about creating a global resource that would ripple outward, touching research institutions and policymakers worldwide.

What made Diagne’s message particularly compelling was his vision of practical impact. “The insights and deliverables from OPUS are not ends in themselves, rather they are stepping stones towards lasting change,” he emphasized, underlining how the project’s frameworks and tools would serve as invaluable resources for policymakers, research organisations, and funders well beyond Europe’s borders.

The Promise of Global Reach

The true power of OPUS, according to Diagne, lay not in its technical achievements alone, but in its potential to transform how science serves society. UNESCO’s commitment to disseminating the project’s results among delegations, partners, and policymakers, including at the Executive Board level, represented a commitment to scaling impact across continents.

“We aim to foster more pluralistic approaches, create new incentives for researchers worldwide and invent more societally inclusive scientific practices,” Diagne explained, connecting the project’s work to UNESCO’s broader mission of ensuring science serves all people, especially the most vulnerable groups in society.

His words carried particular resonance as he linked OPUS to UNESCO’s foundational recommendations on science and artificial intelligence ethics, demonstrating how the project aligned with broader efforts to build public trust in science and promote gender equality in research.

Bridging Policy and Practice

Rinske Van Den Berg, REA Project Policy Officer, brought a practitioner’s perspective to the closing session. Her brief but pointed remarks highlighted the practical challenges of translating conference insights into policy action.

“I’ve learned that dialogue and the connection between Member States and non-member States, or, let’s say, the whole world from UNESCO’s perspective, remains important,” Van Den Berg observed, acknowledging the complex web of relationships that must be navigated to achieve meaningful change.

Her emphasis on taking feedback “on paper to Brussels” for upcoming reviews with partners reflected the meticulous work of policy implementation—the unglamorous but essential task of ensuring that innovative ideas find their way into institutional frameworks and funding mechanisms.

Celebrating Achievement, Embracing Responsibility

Corina Moya-Falcón, OPUS Coordinator at PLOCAN, brought the day full circle with her comprehensive reflection on what had been accomplished. Her words painted a picture of rich dialogue and substantive progress—presentations of research assessment frameworks, experiences from pilot organisations, and panel discussions that had addressed both challenges and opportunities in research assessment within the context of open science.

“Your commitment and dedication have been fundamental to the success of this event,” Moya-Falcón told the assembled participants, her gratitude extending beyond mere politeness to recognition of the collaborative spirit that had made the project possible.

But perhaps her most important contribution was her acknowledgment of what the day represented: not an ending, but a beginning. “Thank you all for your participation and your commitment to transforming research assessment and open science,” she concluded, framing the work ahead as a shared responsibility.

The Road Ahead: Transforming European Research Assessment

The OPUS Final Conference represented more than a project conclusion; it marked the beginning of a new era in research assessment that prioritises Open Science practices, equity, and societal impact. The comprehensive framework developed through the project, tested across diverse institutional contexts, and refined through extensive stakeholder engagement provides a robust foundation for continued reform.

The conference highlighted how the OPUS Researcher Assessment Framework offers a practical, scalable model that helps reshape how we assess, support, and reward researchers in the 21st century. By addressing both research and non-research activities, combining quantitative and qualitative approaches, and providing flexibility for institutional adaptation, the RAF represents a significant advancement in research evaluation methodology.

As participants reflected on the journey from traditional metrics to comprehensive Open Science assessment, the conference demonstrated that meaningful reform requires sustained collaboration between researchers, policymakers, institutions, and funding organisations. The OPUS project’s legacy lies not only in the tools and frameworks developed but in the community of practice established to continue advancing Open Science principles across European research ecosystems.

The Paris conference thus served as both celebration of achievements and launching point for the next phase of research assessment reform, ensuring that Open Science becomes not merely an aspiration but the foundation of how research excellence is recognised and rewarded across Europe and beyond.

Rethinking Research Assessment: Aligning Open Science with Real-World Reform

Panel: Research Assessment and Open Science
Moderator: Gareth O’Neill (OPUS Scientific Coordinator, TGB)
Panelists:
– Ana Persic (UNESCO)
– James Morris (Science Europe)
– Louise Bezuidenhout (Leiden University)
– René von Schomberg (RWTH Aachen University)
– Pil Maria Saugmann (Eurodoc)

The tension between how research is assessed and how science ought to be done in the 21st century is sharper than ever. With Open Science principles increasingly accepted in policy, the reality on the ground still tells a story of resistance, inertia, and misalignment. This panel brought together diverse perspectives, from global policy architects to early career researchers, to explore why research assessment reform is proving so difficult, and what it will take to make change happen.

“We Have a Problem of Misalignment”

Ana Persic from UNESCO opened the discussion with a clear message: the current incentive system is out of step with what Open Science asks of researchers. From her position coordinating the implementation of the 2021 UNESCO Recommendation on Open Science, she sees this misalignment every day. “The incentives, the rewards, the award system is for the moment not aligned with the principles of open science,” she said.

This isn’t just an abstract misfit. It’s a roadblock. The Recommendation frames assessment reform as a cornerstone of open science, yet progress is patchy. Europe is moving, particularly under the CoARA initiative, but in many other regions the conversation hasn’t even started. “We’ve seen a lot of movement towards research assessment reform in Europe, some in Latin America. Other regions are currently not yet engaging in this conversation.”

The Recommendation, she reminded the audience, wasn’t built from scratch. It emerged from years of dialogue involving thousands of stakeholders. Its values (transparency, equity, inclusiveness, collective benefit) are universal. But for many institutions, operationalising these values means upending entrenched practices.

“You Can’t Talk About Open Science in Isolation”

James Morris from Science Europe came in with a practical view from the funders’ side. For him, the key challenge is integration. “You can’t think about open science in isolation. You have to think of it from the perspective of career progression, of how you assess projects, of how you assess researchers, and how that all fits into this broader concept of research cultures.”

Science Europe recently surveyed its member organisations to see where implementation stands. Some parts of Open Science, like open access publications and data sharing, are already well-supported. Infrastructure exists. Training exists. But newer elements, like open peer review or open research methods, remain aspirational. “They’re in strategy documents, but they often don’t have the operational or policy support yet.”

He shared a telling example: data management plans. “If a researcher is funded by an organisation that mandates a data management plan and then moves to one that doesn’t, do they continue to do it? Because they believe it’s good research practice?” If the answer is yes, then culture change has occurred. If not, the system is still stuck in compliance mode, and further thought needs to be given to how interventions can foster more sustainable change.

The Equity Reality: “Opening Up Isn’t the Same as Access”

Louise Bezuidenhout, speaking from Leiden University and the UNESCO Chair on Diversity and Inclusion in Global Science, pushed the discussion deeper. Her central point: openness is not the same as accessibility or equity.

“For many years, we’ve been gathering data on how the current infrastructures supporting research assessment are exclusionary for scholars from low and middle-income countries.” She pointed out that the global South faces multiple invisible barriers (language, digital infrastructure, relevance of metrics) and that simply making data open doesn’t make it usable or meaningful for local communities.

Her dissection of the Recommendation’s values struck a chord: “Of the four core values—transparency, equity and fairness, collective benefit, and diversity and inclusiveness—only transparency is aligned with traditional academic values. The others are about relationships, about dialogue.” And that shift, from individual performance to collective responsibility, is at the heart of why reform is so hard.

“It’s About the Quality of Science, and Its Relevance to Society”

René von Schomberg from RWTH Aachen came with the strongest systemic critique. For him, the way research is currently assessed undermines both scientific quality and societal relevance. “There are actually two bigger reasons for advocating open science,” he said. “One is about the quality of science, and the other one is making science more responsive to societal challenges.”

His examples were blunt. “A former Nature editor told me that 60% of articles in Nature have underlying data that is not reproducible.” That, he joked, would have failed his high school physics class. But the deeper issue is that the current reward structure favours competition and output, not reproducibility or relevance.

He slammed the gold open access model, which, he said, incentivises publishing “as much as possible, whether it’s relevant or not.” The rise of predatory publishing, particularly in biomedical science, is a symptom of this. “In 2015, 6% of journals were predatory or mega journals. Today, it’s 55%. Some publish 10,000 articles per year. That’s 25% of the research output.”

His solution? A total rethink of funding structures. “Instead of giving universities lump sums, fund missions. Assess researchers on how they contribute to societal missions.” It’s not about publications or citations. “It’s about your contribution to the goal to which you’re attached.”

“The System Doesn’t See Us, But We Are the System”

Pil Maria Saugmann from Eurodoc gave voice to those most affected by assessment systems: early and mid-career researchers. Her message was both passionate and precise. “When we talk about research and higher education, I think we should remind ourselves that in Europe, there is a democratic mission.”

OPUS, she said, has demonstrated that change is possible. “We’ve seen real change in the pilot institutions. But it’s vulnerable change. If the next academic leadership doesn’t believe in it, it disappears.”

Her call was for structural change, not just policies, but the very processes that define academic life: habilitation, hiring, funding applications. “If we want reform to matter, I have to see it in the job calls that I and my friends apply to. Not just in strategy documents.”

She also highlighted the issue of precarity. “Most assessment happens at the early and mid-career level. That’s where your next step is on the line. So the reward can’t be a nice diploma on my wall. I have enough of those.”

Most powerfully, she tackled governance. “We’ve changed the researcher landscape in the last 25 years, but not the entry points into academic governance. We are structurally excluded.”

The Trust Gap: “We Are Designing Based on Distrust”

Pierre Winicki from Trust Inside brought a different lens: trust. For him, much of the resistance to open science isn’t technical; it’s cultural. “We have a leadership based on distrust. Total distrust, probably at every level.”

He introduced the “tree of trust” model, which maps out patterns of dysfunction: lack of co-responsibility, lack of shared rules, competition over collaboration. “Universities and labs are competing with each other instead of sharing. That’s a sign of distrust.”

He posed a provocative question: “Has anyone ever measured the cost of distrust in science, in euros or in political votes? Because once we have that number, maybe leaders will start to act.”

The Economics of Inertia

Sasa Zelenika from the University of Rijeka brought it all back to hard numbers. “The big publishers make €28 billion per year. Multiply that by 7 for a multi-annual framework, roughly €200 billion. That’s twice the Horizon Europe budget.”

And yet, it’s researchers who do the work: “We write the articles. We review them. We edit them. And publishers make the money. That’s a structural problem with values.”

He was equally critical of project-based reform. “You can’t change a system sustainably with a two-year pilot and €500,000. That’s not enough to shift institutions.”

“If We Stay Quiet, Nothing Will Change”

The discussion closed with a sense of urgency. Saugmann summed it up: “We all have to engage in the conversation about FP10. Not just early career researchers, everyone. Because if we’re quiet, nothing happens.”

Persic echoed the call. “The need for science is huge. And people are starting to realise that. That’s our opportunity, to agree on better ways of doing science.”

Von Schomberg offered cautious optimism: “We now have the tools to work openly, to share data and knowledge. The technology is there. The question is: Will we use it to transform the system?”


It’s Time for Structural Courage

The panel made one thing clear: the current system of research assessment is not just outdated—it is actively blocking the implementation of open science. The issue is not a lack of ideas or values. It is a lack of structural alignment, governance inclusion, and institutional courage.

From exclusionary infrastructures and metrics that entrench inequality, to wasteful funding systems and perverse incentives, the problems are systemic. They won’t be solved by adding open science to policy checklists. They will only be solved through coordinated structural transformation, through new ways of funding, assessing, and recognising research.

The OPUS project has shown that change is possible. But as the panelists reminded us, we cannot rely on pilots and principles alone. This will take leadership, trust, courage, and the willingness of the research community to stop waiting for permission and start shaping the future.

The time for marginal reform has passed. What comes next must be nothing short of transformation.

LIVE from Paris: The OPUS Final Conference on Redefining Researcher Assessment

The OPUS (Open and Universal Science) Consortium is currently holding its Final Conference at the UNESCO International Institute of Educational Planning (IIEP) in Paris, a key OPUS event bringing together researchers, policy makers, and institutional leaders to shape the future of Open Science and responsible research practices across Europe and beyond.

This important gathering reflects on OPUS’s mission to transform research assessment systems—moving away from traditional metrics towards frameworks that incentivise and reward Open Science practices.

Key highlights from the event so far:

Welcome Remarks and Opening Perspectives

The conference started with welcome remarks from three distinguished leaders who established the importance of inclusive, responsible research ecosystems. 

Corina Moya-Falcón, OPUS Coordinator from the Oceanic Platform of the Canary Islands (PLOCAN), opened the proceedings. 

Gustavo Merino, Director of UNESCO’s Social Policies Division, delivered keynote remarks positioning OPUS’s work within the global movement for equitable knowledge sharing. Dr Merino, who has served as Director of Social Policies at UNESCO since January 2024, brought his extensive background in strategic planning, policy formulation, and international development to the conference. Previously serving in senior positions within the Mexican government, including Deputy Minister of Social and Human Development, and as Director of the Investment Center Division at the Food and Agriculture Organization (FAO), Merino emphasised UNESCO’s longstanding commitment to supporting Open Science as a driver of sustainable development and social progress.

Representing the European Commission, Rinske Van Den Berg, REA Project Officer, provided crucial policy perspective on the importance of research assessment reform. Van Den Berg, who serves as Project Adviser at the European Research Executive Agency (REA), brings over two decades of experience in project management and partnership development with European universities and organisations. Her background includes significant expertise in ERASMUS+ and HORIZON2020 programmes, making her well-positioned to address the challenges faced by Research Funding Organisations (RFOs) and Research Performing Organisations (RPOs) in adopting new assessment criteria.

OPUS Researcher Assessment Framework (RAF)

New Approach to Research Evaluation

The centrepiece of the conference was the presentation of the OPUS Researcher Assessment Framework (RAF) by Gareth O’Neill from Technopolis Group Belgium (TGB) and Emma Day from Vitae. O’Neill, Principal Consultant on Open Science at Technopolis Group, specialises in the implementation of Open Science across Europe, particularly focusing on sustainable careers for researchers and research assessment reform. His extensive background includes serving as former president of the European Council of Doctoral Candidates and Junior Researchers (Eurodoc) and working as an ambassador for Open Access and Plan S under cOAlition S.

Emma Day, Head of Policy and Engagement at Vitae, brings strategic leadership in policy and advocacy to the RAF development. Together, they presented this modular and flexible framework specifically designed for Research Performing and Funding Organisations (RPOs/RFOs) to embed Open Science into their policies and practices, supporting systemic reform across Horizon Europe and beyond.

The RAF represents a paradigm shift from traditional assessment metrics, offering a comprehensive suite of indicators and metrics that combine both quantitative and qualitative assessment methods. Unlike conventional approaches that focus primarily on publication metrics, the RAF evaluates researchers across four key categories: research, education, leadership, and valorisation activities. Each category includes both generic indicators for activities not specified in terms of openness and Open Science-specific indicators that explicitly recognise and reward Open Science practices.
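As a reading aid, the RAF’s shape as described above can be pictured as a simple data structure: four categories, each carrying generic indicators alongside Open Science-specific ones. The sketch below is illustrative only; the indicator names are invented placeholders, not drawn from the framework itself.

```python
# Illustrative sketch of the RAF's structure: four categories, each with
# generic and Open Science-specific indicators. Indicator names are
# invented placeholders, not taken from the actual framework.
from dataclasses import dataclass, field

@dataclass
class Category:
    name: str
    generic: list[str] = field(default_factory=list)
    open_science: list[str] = field(default_factory=list)

raf = [
    Category("research",
             generic=["publications", "projects"],
             open_science=["open access publications", "FAIR datasets"]),
    Category("education",
             generic=["supervision"],
             open_science=["open educational resources"]),
    Category("leadership",
             generic=["committee roles"],
             open_science=["open science advocacy"]),
    Category("valorisation",
             generic=["patents"],
             open_science=["open source software"]),
]
```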

The framework’s ten guiding principles ensure transformative impact on research evaluation practices across geographical boundaries and disciplinary constraints. These principles emphasise flexibility, allowing organisations to select, refine, and prioritise indicators according to their specific organisational interests and needs whilst maintaining universal applicability across countries, disciplines, and organisation types.

Pilot Actions and Mutual Learning

Raquel Vega from the Young European Research Universities Network (YERUN) introduced the pilot action segment, framing it as an opportunity for mutual learning and practical experimentation. Vega, who serves as Policy and Project Officer at YERUN, emphasised the collaborative effort that characterised the OPUS pilots and the importance of context-specific solutions.

The pilot programme involved five organisations: three Research Performing Organisations (RPOs) and two Research Funding Organisations (RFOs). These organisations tested the RAF through tailored action plans and co-created strategies for meaningful reform, and each brought unique perspectives and challenges to the implementation process.

Romania: National Strategy Alignment

Alina Irimia, Open Science Projects Coordinator at the Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI), presented Romania’s comprehensive approach to Open Science implementation. Irimia showcased UEFISCDI’s strategic alignment with Romania’s National Open Science Strategy, including the implementation of FAIR Data Management Plans.

UEFISCDI, as Romania’s main funder of competitive research and an important policy advisor to both the Ministry of Education and the Ministry of Research, Innovation and Digitalisation, has taken a leadership role in developing the National Open Science Strategic Framework. The organisation’s pilot activities focused on analysing, developing, and testing the implementation of the Openness Researcher profile within national platforms, particularly BrainMap, which hosts over 60,000 registered researchers, technicians, and entrepreneurs from around the world.

The Romanian pilot emphasised expanding the definition of valuable research outputs beyond traditional publications and citations to include databases, software, policy briefs, and contributions to public understanding of science. This approach directly aligns with CoARA’s commitment to recognising diverse research outputs and activities whilst supporting the implementation of strategic objectives defined at national level for the transition to Open Science.

Lithuania: Community-Driven Training Excellence

Julija Baniukevic from the Research Council of Lithuania (RCL) presented an exemplary community-driven approach to Open Science implementation. Dr Baniukevic, who serves as OPUS Project coordinator in Lithuania, has extensive experience in bionanotechnology research, policy advocacy, and European science cooperation.

The Lithuanian pilot implemented a comprehensive training programme over nine months, reaching 21 principal investigators from various research fields who subsequently trained 211 researchers in their teams. The programme achieved remarkable engagement levels, with 91% attending on-site training and 20 out of 21 trainers conducting sessions for their teams. Participants showed exceptional commitment, with 68.2% of research outputs made openly accessible and 82.6% of participants initiating open science dialogues in their institutions.

The success of Lithuania’s approach was attributed to strong administrative support, establishment of an advisory board with diverse perspectives, and emphasis on broadening researchers’ understanding of Open Science beyond just open access. The initiative led to wider dissemination of Open Science knowledge, with participants organising full-day conferences and participating in panel discussions at other events.

Portugal: Institutional Policy Design

Prof. Isabel L. Nunes, Vice-Rector of NOVA University Lisbon responsible for Strategic Planning and Quality Management, demonstrated how the RAF guides institutional policy design and provides concrete support to researchers. Prof. Nunes, an Associate Professor with tenure at NOVA FCT, brings extensive expertise in Industrial Engineering, particularly in Ergonomics and Human Factors, with over 100 scientific publications.

NOVA University Lisbon’s implementation focused on translating the RAF into practical institutional policies and procedures. The university’s approach emphasised providing concrete support to researchers whilst ensuring alignment with European initiatives and national strategies. Prof. Nunes’s presentation highlighted successful practices implemented during the university’s participation as a pilot institution, demonstrating the RAF’s practical applicability in higher education settings.

Croatia: Research Centre Excellence

Nataša Jakominić Marot, Head of the Centre for Research and Innovation at the University of Rijeka (UNIRI), presented the Croatian pilot’s comprehensive approach to Open Science integration. With over two decades of experience in driving innovation and research within academia, Jakominić Marot coordinates teams to achieve impactful results whilst contributing to meaningful societal change.

The University of Rijeka pilot focused on implementing Open Science initiatives across research centres, emphasising practical applications in data management, software development, and publication practices. The University’s approach included developing an Open Science Guide, designating library staff as reference points, and selecting specific repositories for different research outputs. While achieving progress in areas like open access publications and software sharing, the pilot encountered challenges including delays in national policies and difficulty retaining early career researchers.

Cyprus: Technical Innovation Focus

Panagiotis Moiras, MSc, BA (Hons), MCIM, Senior Research and Project Management Specialist at the University of Cyprus (UCY), presented the Cyprus pilot’s focus on integrating the RAF into technical practices. Moiras, who serves as Director of the Research and Innovation Support Service (RISS), brings more than 15 years of professional experience in project management and communications consultation.

The Cyprus pilot, centred on the KIOS Centre of Excellence, focused on integrating the RAF into practices around publication drafting, data management, and software development. The University played a central role in promoting Open Science principles both institutionally and nationally, having become the first institution in Cyprus to sign the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities in 2008. The pilot’s achievements included instrumental roles in shaping national policy, leading to the approval of Cyprus’s open science frameworks in 2016 and 2022.

Beyond Project Pilots: PLOCAN’s Innovation

Corina Moya-Falcón also presented PLOCAN’s approach that extended beyond the standard pilot framework, embedding Open Science principles in marine data management and software development for cutting-edge marine research. As project coordinator, Moya-Falcón demonstrated how PLOCAN’s action transcended typical pilot limitations by integrating Open Science into core research infrastructure and data management systems.

Policy Implementation and Strategic Insights

Supporting Open Science at Institutional Level

Juliana Chaves Chaparro, Senior Consultant at UNESCO’s Management of Social Transformations Programme (MOST), presented the OPUS Policy Brief on lessons learned for supporting Open Science policies at institutional level. Dr Chaves Chaparro brings nearly 25 years of experience in science-policy-society interface promotion, particularly in the Global South, with an advanced degree in Environmental Sciences and an ongoing PhD in Sociology.

Her presentation emphasised the importance of aligning policies across local, institutional, national, and European levels whilst integrating gender equality and diversity measures into Open Science initiatives. The policy brief concluded with recommendations for transforming barriers into policy actions, including developing coherent policies, securing resources, and supporting early career researchers and diversity.

Chaves Chaparro leads UNESCO’s involvement in the OPUS project and has coordinated EU-funded initiatives like GRRIP and RRING. She founded the NGO MiXtura to promote interdisciplinary and responsible science and actively contributes to the European Open Science Cloud and other international science initiatives.

Panel Discussion: Research Assessment and Open Science

Diverse Perspectives on Systemic Reform

The conference featured a distinguished panel discussion on “Research Assessment and Open Science,” led by Gareth O’Neill, OPUS Scientific Coordinator from TGB. The panel brought together leading voices from various sectors to address challenges and opportunities in research assessment reform.

Ana Persic from UNESCO contributed her expertise as Programme Specialist for Science, Technology and Innovation Policies and Open Science. Dr Persic, who coordinated the development of the UNESCO Recommendation on Open Science, emphasised that research assessment reform is crucial for enabling Open Science practices, as current incentives and rewards are misaligned with Open Science principles.

James Morris, Senior Policy Officer at Science Europe, shared insights from Science Europe’s membership study, noting that whilst Open Science is included in strategies, there exists a significant gap in practical implementation, especially for emerging Open Science practices. Morris, who works broadly on Research Culture, Research Assessment, and Research Infrastructures, brought valuable perspective on European-level policy coordination.

Louise Bezuidenhout from Leiden University emphasised the importance of considering research culture and impact beyond traditional metrics. Dr Bezuidenhout, a social science researcher specialising in Open Science, data sharing, and access, advocated for reforms that reflect values of equity, fairness, and inclusion. Her research focuses on improving the inclusion of low/middle-income country researchers into the Open Science landscape.

René von Schomberg from RWTH Aachen University contributed his extensive expertise in Responsible Research and Innovation. Dr von Schomberg, who holds PhDs from the University of Twente in Science and Technology Studies and Goethe University in Philosophy, was affiliated with the European Commission from 1998 to 2022 and is recognised as a leading expert in decision making under scientific uncertainty and the role of scientific advice in policy.

Pil Maria Saugmann, President of Eurodoc, represented the perspective of early and mid-career researchers. Dr Saugmann, who holds a PhD in theoretical physics from Stockholm University, emphasised the importance of addressing systemic barriers, including precarity in academic careers and the exclusion of early career researchers from academic governance.

Wrap-up and Future Directions

Commitment to Continued Reform

The conference concluded with comprehensive wrap-up remarks from three key figures who reinforced the importance of sustained commitment to Open Science reform. Lamine Diagne, Senior Programme Officer from UNESCO’s MOST Section, highlighted the global relevance of the project’s outcomes and committed to disseminating them widely. Diagne, who serves as Programme Specialist and GHA Coordinator in the Social and Human Sciences sector, brings extensive experience in coordinating international programmes and research initiatives.

Rinske Van Den Berg, REA Project Policy Officer, reiterated the European Commission’s ongoing support for initiatives that promote openness, transparency, and societal impact in research. Her concluding remarks emphasised the need for continued collaboration among stakeholders and sustained policy support for research assessment reform.

Corina Moya-Falcón, OPUS Coordinator from PLOCAN, provided final reflections on the project’s achievements and the path forward for implementing Open Science reforms across European research institutions. Her remarks emphasised the importance of maintaining momentum generated by the OPUS project and translating insights into sustainable institutional changes.

Transforming European Research Assessment

The OPUS Final Conference represented more than a project conclusion; it marked the beginning of a new era in research assessment that prioritises Open Science practices, equity, and societal impact. The comprehensive framework developed through the project, tested across diverse institutional contexts, and refined through extensive stakeholder engagement provides a robust foundation for continued reform.

The conference highlighted how the OPUS Researcher Assessment Framework offers a practical, scalable model for assessing, supporting, and rewarding researchers. By addressing both research and non-research activities, combining quantitative and qualitative approaches, and providing flexibility for institutional adaptation, the RAF represents a significant advancement in research evaluation methodology.

As participants reflected on the journey from traditional metrics to comprehensive Open Science assessment, the conference demonstrated that meaningful reform requires sustained collaboration between researchers, policymakers, institutions, and funding organisations. The OPUS project’s legacy lies not only in the tools and frameworks developed but in the community of practice established to continue advancing Open Science principles across European research ecosystems.

The Paris conference thus served as both a celebration of achievements and a launching point for the next phase of research assessment reform, ensuring that Open Science becomes not merely an aspiration but the foundation of how research excellence is recognised and rewarded across Europe and beyond.

As the conversation around Open Science continues to evolve, OPUS is proud to contribute a practical, scalable model that helps reshape how we assess, support, and reward researchers in the 21st century.

Explore our Key Exploitable Results here.

Stay tuned for more updates from the conference and visit opusproject.eu to explore all pilot achievements.

OPUS Final Conference to Take Place in Paris, 9–10 July 2025

Shaping the Future of Research

The OPUS (Open and Universal Science) Project Consortium will hold its Final Conference on 9–10 July 2025 at the UNESCO International Institute of Educational Planning (IIEP) in Paris. The two-day event will bring together leading voices in research, policy, and academia to discuss the future of Open Science and responsible research assessment across Europe and beyond.

The conference will highlight the main outcomes and key exploitable results of the OPUS project, with particular focus on how Open Science policies can be practically implemented at both institutional and national levels. Participants may attend either in person or online via Zoom.

More information and registration are available for online attendance only.

As a Horizon Europe project aiming to transform research assessment, OPUS has developed and piloted a new approach to recognising, incentivising, and rewarding Open Science practices. This includes the creation of new indicators, a comprehensive Researcher Assessment Framework (RAF), and tested interventions within Research Performing Organisations (RPOs) and Research Funding Organisations (RFOs).

Conference Highlights

Venue: UNESCO International Institute of Educational Planning (IIEP)
Address: 7–9 rue Eugène-Delacroix, 75116 Paris, France
Dates: 9–10 July 2025
Participation: Online (Zoom) – Register Here

Key Themes and Topics

Main Outcomes and Lessons Learned
Presentation of OPUS results, including the Researcher Assessment Framework (RAF), policy recommendations, and insights for RFOs and RPOs.

Panel Discussion – Research Assessment and Open Science
Moderated by Gareth O’Neill, OPUS Scientific Coordinator (TGB), with panellists:

  • Ana Persic (UNESCO)
  • James Morris (Science Europe)
  • Louise Bezuidenhout (Leiden University)
  • René von Schomberg (RWTH Aachen University)
  • Pil Maria Saugmann (Eurodoc)

Assessing Researchers and Incentivising Open Science
Explore new indicators covering open access, reproducibility, early sharing, open peer review, and stakeholder engagement.

The Researcher Assessment Framework (RAF)
A new evaluation system based on Proposal Development, Methods, Research Outputs, and Impact, with emphasis on Open Science contributions.

Piloting Research Assessment Reform
Showcasing pilot implementations at:

  • RPOs: NOVA University Lisbon, University of Cyprus, University of Rijeka
  • RFOs: UEFISCDI (Romania), Research Council of Lithuania

Supporting Interventions for Open Science Adoption
From policy shifts to training and repository development, OPUS interventions are designed to create sustainable Open Science cultures.

Progress and Policy Integration
The conference will address next steps, including:

  • Updated Open Science indicators
  • The Open Science Career Assessment Matrix (OS-CAM2)
  • Final policy briefs and recommendations for integrating Open Science into mainstream research evaluation.

About OPUS

The OPUS project is coordinated by The Oceanic Platform of the Canary Islands (PLOCAN) and implemented by a consortium of 18 organisations across Europe. These partners represent the full spectrum of Open Science expertise, spanning open access, inclusion, equity, and sharing, working together to reshape how research is assessed and valued.

Join us to explore how Open Science can be implemented not only as policy but as a lasting practice within the heart of research institutions and funding bodies.

More information and registration are available for online attendance only.

UEFISCDI webinar dedicated to Data Management Plans

On 16 June 2025, UEFISCDI organised a webinar dedicated to Data Management Plans within the framework of the OPUS project, with approximately 170 registered participants from Romania’s research, development, and innovation (RDI) communities.

OPUS Project: Reforming Research Assessment Through Open Science

The webinar opened with Dr. Alina Irimia, UEFISCDI’s Open Science Projects Coordinator, presenting the OPUS (Open and Universal Science) project and Romania’s strategic contribution to this European initiative. OPUS is a 36-month, €2 million Horizon Europe-funded project involving 18 partners across 11 European countries, specifically designed to reform research assessment systems at Research Performing Organisations (RPOs) and Research Funding Organisations (RFOs).

UEFISCDI’s role as a pilot Research Funding Organisation within OPUS cannot be overstated. The agency is actively testing the implementation of mandatory Data Management Plans across major national funding programmes, representing a €320 million budget allocation that will fund approximately 2,024 research projects over four to five years. This initiative represents the first systematic implementation of FAIR Data Management Plans in Romania’s research funding landscape.

The OPUS Researcher Assessment Framework

Dr. Irimia detailed the OPUS Researcher Assessment Framework (RAF), which introduces a paradigm shift in how researchers are evaluated. The framework encompasses five core dimensions:

  • Research: Including data management, methods, publications, and materials
  • Education: Covering courses, resources, teaching, and supervision
  • Leadership: Encompassing staff management, project coordination, and organisational roles
  • Valorisation: Focusing on knowledge transfer, engagement, and entrepreneurship
  • Proposals: Evaluating funding acquisition and project development

Significantly, each dimension includes both generic and Open Science-specific indicators, ensuring that researchers are incentivised and rewarded for adopting open science practices.
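To make this structure concrete, the sketch below models the RAF’s dimension-and-indicator layout as a simple data structure. This is a minimal illustration only: the five dimension names follow the list above, but the sample indicators and the Python representation are our own hypothetical simplification, not the official OPUS indicator set.

```python
# Illustrative sketch of the RAF layout: five dimensions, each carrying both
# generic indicators and Open Science-specific indicators.
# The indicator names below are invented examples, not the official OPUS set.
raf = {
    "research": {
        "generic": ["publications authored", "datasets produced"],
        "open_science": ["publications in open access", "datasets shared following FAIR principles"],
    },
    "education": {
        "generic": ["courses taught", "students supervised"],
        "open_science": ["open educational resources created"],
    },
    "leadership": {
        "generic": ["projects coordinated", "staff managed"],
        "open_science": ["open science initiatives led"],
    },
    "valorisation": {
        "generic": ["knowledge-transfer activities"],
        "open_science": ["public engagement around openly shared outputs"],
    },
    "proposals": {
        "generic": ["funding proposals developed"],
        "open_science": ["open science practices planned in proposals"],
    },
}

# An assessor can then read off both indicator types for any dimension:
for kind, indicators in raf["research"].items():
    print(kind, "->", ", ".join(indicators))
```

The point of the two-sided split is that any activity can be credited through either a generic or an Open Science-specific lens, so open practices are rewarded without ignoring the rest of a researcher’s work.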

Data Management Plans: From Theory to Implementation

The webinar’s central focus shifted to practical aspects of Data Management Plans, expertly presented by Ioana Trif and Raluca Ciobotaru, both Open Science experts from UEFISCDI’s OPUS team. Their presentation provided participants with guidance on creating effective DMPs that align with both European standards and Romanian legal requirements.

Understanding Research Data and FAIR Principles

The speakers emphasised the critical distinction between open data and FAIR data, clarifying that while all open data should be FAIR, not all FAIR data needs to be open. This nuanced understanding is essential for researchers navigating the complex landscape of data sharing, particularly when dealing with sensitive or confidential information.

The FAIR principles, making data Findable, Accessible, Interoperable, and Reusable, were positioned as the foundation for effective research data management. These principles emphasise machine-actionability, enabling computational systems to discover, access, and reuse data with minimal human intervention, thereby maximising the value and impact of research investments.
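As a small illustration of what machine-actionability means in practice, the snippet below assembles a minimal metadata record loosely modelled on the DataCite schema and serialises it to JSON: the kind of structured description that harvesters and search services can process without human intervention. The record and all its values are hypothetical.

```python
import json

# A hypothetical metadata record loosely modelled on the DataCite schema.
# Structured, standardised metadata like this is what makes a dataset
# machine-actionable: harvesters can find (F) it via a persistent identifier,
# access (A) it, interpret (I) its format, and check reuse (R) conditions.
record = {
    "identifier": {"identifierType": "DOI", "identifier": "10.1234/example-dataset"},
    "titles": [{"title": "Example ocean temperature measurements"}],
    "creators": [{"name": "Doe, Jane", "nameIdentifier": "0000-0002-1825-0097"}],  # ORCID iD
    "publicationYear": 2025,
    "rightsList": [{"rights": "CC BY 4.0"}],  # explicit licence enables reuse
    "formats": ["text/csv"],  # an open, standard format supports interoperability
}

# Serialised to JSON, the record can be exchanged between repositories and
# harvested by search services with minimal human intervention.
print(json.dumps(record, indent=2))
```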

Legal Framework and National Requirements

A significant portion of the presentation addressed Romania’s evolving legal landscape for open data and research data management. Law 179/2022, which transposes EU Directive 2019/1024 on open data and reuse of public sector information, establishes the legal foundation for Romania’s open data initiatives.

Key provisions of Law 179/2022 include:

  • Mandatory open access to publicly funded research data
  • Implementation of the “as open as possible, as closed as necessary” principle
  • FAIR principles compliance for all research data
  • Commercial and non-commercial reuse permissions
  • Creative Commons licensing requirements (CC BY 4.0 or CC0)
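As a minimal sketch of how a funder might operationalise the licensing provision above, the snippet below checks a dataset record against the two named licences. The record format and the function are illustrative assumptions, not part of UEFISCDI’s actual tooling.

```python
# Hypothetical compliance check reflecting the licensing provision above:
# publicly funded research data should carry a CC BY 4.0 or CC0 licence.
ALLOWED_LICENCES = {"CC BY 4.0", "CC0"}

def licence_compliant(record: dict) -> bool:
    """Return True if a dataset record's licence satisfies the open-data rule."""
    return record.get("licence") in ALLOWED_LICENCES

print(licence_compliant({"licence": "CC BY 4.0"}))     # True
print(licence_compliant({"licence": "CC BY-NC 4.0"}))  # False: NC restricts reuse
```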

Romania’s Strategic Framework: The White Book

The webinar highlighted Romania’s comprehensive strategic approach through the White Book on the Transition to Open Science (2023-2030). This strategic document, developed through extensive consultation with the research community, outlines eight strategic objectives for Romania’s open science transition.

Strategic Objective 2, specifically addressing research data management, includes ambitious actions:

  • Mandatory DMP development and updates for all publicly funded research
  • Open access to research data through trusted digital repositories
  • Open access metadata with CC0 licensing
  • FAIR compliance with persistent identifiers (PIDs)
  • Eligible RDM costs in research funding
  • Comprehensive management of other research outputs including software, models, and algorithms

Practical Implementation and Tools

The webinar provided concrete guidance on DMP creation, utilising the Science Europe template as the foundation. The template addresses six core requirements:

  1. Data Description and Collection: Detailing data types, formats, and sources
  2. Documentation and Quality Assurance: Ensuring comprehensive metadata and quality control
  3. Storage and Backup: Establishing secure data storage during research
  4. Legal and Ethical Requirements: Addressing GDPR, intellectual property, and ethical considerations
  5. Data Sharing and Long-term Preservation: Planning for data accessibility and archival
  6. Responsibilities and Resources: Defining roles and allocating necessary resources
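As a rough, hypothetical sketch, a DMP skeleton following these six headings could be kept alongside a project as a structured file and checked for completeness programmatically. The section names mirror the Science Europe template above; the field names and the validation logic are our own illustrative assumptions.

```python
# Hypothetical DMP skeleton mirroring the six Science Europe template sections.
# Field names are illustrative; a real DMP would elaborate each section in prose.
dmp = {
    "data_description_and_collection": {"data_types": ["survey responses"], "formats": ["CSV"]},
    "documentation_and_quality_assurance": {"metadata_standard": None, "quality_control": None},
    "storage_and_backup": {"storage_location": None, "backup_schedule": None},
    "legal_and_ethical_requirements": {"gdpr_applies": True, "ethics_approval": None},
    "data_sharing_and_preservation": {"repository": None, "licence": "CC BY 4.0"},
    "responsibilities_and_resources": {"data_steward": None, "estimated_rdm_costs_eur": None},
}

# A minimal completeness check: flag sections where no field has been filled in yet.
incomplete = [
    section
    for section, fields in dmp.items()
    if all(value in (None, [], "") for value in fields.values())
]
print("Sections still to complete:", incomplete)
```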

Digital Platforms and Resources

The speakers recommended several platforms for DMP development and data management:

For DMP Creation:

  • DMPonline
  • Argos (OpenAIRE)
  • RDMO
  • EasyDMP

For Data Repository Selection:

  • re3data & OpenDOAR (general repositories)
  • CESSDA (social sciences)
  • DARIAH (digital humanities)
  • ELIXIR (life sciences)

For Open Science Community Engagement:

  • BrainMap Open Science Community
  • UEFISCDI Open Science Network
  • Creative Commons resources
  • OpenAIRE platform

Impact on Romanian Research Funding

The webinar revealed that UEFISCDI is actively piloting DMP requirements across several major funding programmes:

  • Centres of Excellence (CoE)
  • Complex Frontier Research Projects (PCCF)
  • “Challenges – Change” Programme (PPS)

These programmes represent Romania’s most significant research investments, and the inclusion of mandatory FAIR DMPs signals a fundamental shift in how research quality and impact are evaluated.

Community Feedback and Future Directions

The final segment of the webinar facilitated crucial dialogue between UEFISCDI experts and the research community. Participants provided valuable feedback on DMP implementation challenges, including:

  • Technical infrastructure requirements for FAIR data management
  • Training needs for researchers and data stewards
  • Resource allocation for effective RDM implementation
  • Institutional support mechanisms

This feedback collection process exemplifies OPUS’s stakeholder-driven approach, ensuring that policy development remains grounded in practical research realities.

Romania’s Open Science Leadership

The June 16 webinar represented more than just an information session; it marked Romania’s commitment to leading Europe’s open science transition. By systematically implementing FAIR DMPs across its research funding portfolio, Romania is positioning itself as a model for other European countries navigating similar transitions.

The success of this initiative will largely depend on continued collaboration between funding agencies, research institutions, and the broader scientific community. UEFISCDI’s approach, combining policy development with practical implementation support, provides a blueprint for effective open science adoption.

As Romania moves toward its 2030 vision outlined in the White Book, the research community can expect continued evolution in assessment criteria, funding requirements, and support mechanisms. The foundation established through the OPUS project and initiatives like this webinar ensures that Romania’s researchers will be well-equipped to thrive in an increasingly open and collaborative global research environment.

The transition to open science is not merely a policy requirement. It represents a fundamental shift toward more transparent, collaborative, and impactful research that serves both the scientific community and society at large.

Final Lithuanian OPUS pilot event in Vilnius

On 19 June 2025, the final Lithuanian event of the OPUS (Open and Universal Science) project took place in Vilnius at the Research Council of Lithuania (LMT). The gathering brought together researchers from various fields, institutional representatives, policymakers, and experts to review the outcomes, challenges, and future actions of their shared nine-month pilot journey, aimed at strengthening open science culture in Lithuania.

Dr Vaiva Priudokienė, Deputy Chair of the LMT, who has supported the project from its earliest days, emphasised that OPUS is not merely a pilot initiative, but a platform for long-term change grounded in cross-sectoral collaboration, knowledge sharing, and personal leadership.

OPUS Results: Measurable Progress and an Invitation to Continue Together

According to project coordinator Julija Baniukevič, OPUS is more than a pilot project. “It was a community-building process that not only succeeded in promoting the implementation of open science, but also opened space for an open, respectful dialogue about fundamental challenges in researchers’ daily lives. From the very first steps, OPUS sought not only to educate and unite, but also to verify whether we are ready for change that requires both individual leadership and systemic support,” said J. Baniukevič.

Presenting the project’s progress and achievements, the coordinator emphasised that one of the most notable successes was the formation, in a short time, of an active community committed to open science. During discussions, the most pressing issues were identified and solutions sought together. This community is more than a group of participants: these are people who not only learnt but also took on the role of teachers themselves.

Education was one of OPUS’s fundamental pillars. During the project, 21 open science trainers were prepared: researchers who conducted training for colleagues in their institutions, thereby reaching and training more than 200 researchers across Lithuania. This model allowed not only the dissemination of knowledge but also the creation of a trust-based learning network.

“Community engagement exceeded the project’s boundaries: more than 80 per cent of participants initiated discussions in their institutions about data management, publication strategies, and ethical challenges. These discussions showed that when space is provided for dialogue, not only questions arise, but also ideas, courage, and motivation to act,” stated J. Baniukevič.

All of this demonstrates that OPUS succeeded not only in raising open science topics but also in showing that Lithuania has a mature, responsible, and action-ready community: one that, working together, can move mountains.

Diverse Partnership and Open Conversations

The final event brought together representatives from various fields, arriving not only from different Lithuanian cities but also from very different segments of the science and policy landscape.

“One of the OPUS project’s greatest achievements is the creation of interdisciplinary and inter-agency partnerships. In this way, the event not only shared practical insights but also reviewed different institutions’ experiences and challenges, thus creating an exceptionally rich field of shared learning,” said J. Baniukevič.

At the event, Dr Eglė Ozolinčiūtė from the Academic Ethics Control Service provided a particularly important ethical and contextual perspective on open science, data management plans, and citizen science initiatives, emphasising that openness is not merely a technical action but a cultural attitude closely linked to academic integrity, responsibility, and trust between science and society.

Dr Ieva Cesevičiūtė from Kaunas University of Technology (KTU) Library introduced participants to Lithuanian scientific data repositories and how to choose and use them, emphasising practical aspects that both researchers and institutions encounter.

The project coordination experience and its alignment with European open science initiatives were presented by OPUS project coordinator J. Baniukevič and Gerinta Raguckaitė from LMT, highlighting the importance of synergy between political decisions and practical changes.

Vilnius University scientists Dr Justas Dapkūnas and Dr Jonas Eimontas also shared their acquired experience and practical insights, discussing publication practices and research registration experience, emerging challenges, and possible solutions. Dr Jolita Vveinhardt from Vytautas Magnus University raised fundamental questions about researchers’ motivation: what do they gain from open science? What impact do changes have on their daily work and productivity?

Meanwhile, Dr Petras Prakas from the Nature Research Centre presented how data opening principles can be applied within institutions, discussing not only theoretical aspirations but also very concrete practices.

Other Lithuanian scientists actively participated in discussions: Dr Jurgita Lazauskaitė-Zabielskė, Dr Tomas Grinys (Vilnius University), Dr Lina Draudvilienė, Dr Rytis Maskeliūnas (Kaunas University of Technology), as well as Dr Karolina Barčauskaitė (Lithuanian Research Centre for Agriculture and Forestry) and Dr Giedrius Steponaitis (Lithuanian University of Health Sciences), whose insights complemented the overall picture with perspectives and experiences from different scientific fields. The discussions were significantly enriched by the OPUS advisory group members Gerinta Raguckaitė, Dr Justina Rukšnaitė, and Dr Eglė Celešienė, who consistently accompanied the project throughout the entire process and actively engaged in deliberations about indicator application, researcher evaluation, and future open science policy directions. Their participation strengthened the discussions’ connection with broader strategic positions and helped build bridges between practice and policy.

This diverse participant composition allowed discussions to be not only enriched but also brought closer to reality, clarifying the most pressing challenges faced by both different institutions and individual research groups. It was precisely this inclusive, open format that helped the OPUS event become not merely a presentation of results but an authentic, vibrant community meeting where ideas, experiences, and questions became shared action potential.

Challenges Identified by the Community Itself

Discussions about open science development in Lithuania revealed several key challenges that must be addressed to achieve sustainable and effective implementation of this initiative.

One of the main issues is uneven data sharing traditions across different scientific fields. What has long been customary for one discipline still remains novel for another. Therefore, it is important to create more flexible solutions adapted to specific scientific areas rather than applying uniform standards to everyone.

Participants also emphasised the necessity of clear evaluation guidelines: how will we assess the provision, use, and sharing of open data? Will this become a formal criterion for evaluating researchers’ work results?

Considerable attention was also given to the lack of knowledge about licences, data management, and secondary use. It was noted that consistent education is needed in these areas, involving not only researchers but also administrative staff and policymakers.

A broader question arising from the discussion concerned science and business interaction: how to ensure that open science values are properly utilised and not used unilaterally or inappropriately.

Finally, participants spoke about personal attitudes towards change. The most frequently asked and most important question, “What’s in it for me?”, showed that beyond systemic solutions, researchers’ daily lives must also be considered. Open science must not only provide new opportunities but also facilitate scientists’ work, increase their visibility, and provide additional value to society.

LMT believes that discussions such as these are very important and will continue, and the OPUS project has provided a solid foundation for further changes in the open science field in Lithuania.

The OPUS project is implemented from 1 September 2022 to 31 August 2025. The project consortium includes 18 organisations actively operating at the European level and representing the full spectrum of open science components. Among them is LMT, which conducts pilot activities involving scientists from 23 research projects at 6 research and higher education institutions (VU, KTU, LSMU, VDU, GTC, and LAMMC). The aim of this stage, running from 1 October 2024 to 30 June 2025, is to coordinate and support measures for researcher evaluation reform at RPOs and RFOs, transitioning to a model that encourages the active application of various open science practices.

Source: RCL

OPUS Consultation on Researcher Assessment

On 10 June 2025, the OPUS project brought together researchers, research managers, funders, and policy makers to collaboratively shape the revision of the OPUS Researcher Assessment Framework (RAF). The workshop aimed to engage in discussion and collect feedback from participants on the first version of the RAF. It was hosted by the Marie Curie Alumni Association and led by Gareth O’Neill from Technopolis Group, and resulted in a lively and engaging interaction with participants, who acknowledged the value of the RAF and provided critical suggestions for its revision.

OPUS Researcher Assessment Framework (RAF)

The RAF consists of a comprehensive framework of indicators which are designed to provide research-performing and research-funding organisations with options to expand their researcher assessment systems beyond the traditional focus on publications and citations to include a much wider diversity of activities conducted by researchers. The RAF is further divided into a generic set of indicators covering all possible activities by researchers and a more specific set of indicators to incentivise and reward Open Science.

The RAF is structured around four key domains of activities relevant for researchers:

  • The research domain covers writing proposals, developing methods, managing data, developing software, writing publications, creating research materials, and peer reviewing research outputs.
  • The education domain includes developing educational courses, creating educational resources, teaching students, supervising students, and developing personal and professional skills.
  • The leadership domain focuses on leading personnel and projects, taking management roles within the organisation, and being recognised as an expert and for significant contributions.
  • The valorisation domain lastly includes science communication to peer and general audiences, academic and intersectoral collaboration and engagement, and exploitation and entrepreneurship.

The RAF offers organisations an array of indicators from which to select and include in their researcher assessment systems. The indicators can be deployed qualitatively or quantitatively; the OPUS project encourages a combination of a qualitative and a responsible quantitative approach to assessment. The RAF divides the indicators into processes, outputs, and outcomes, since at the moment of assessment a researcher may be in the middle of a research activity, have realised an output, or have realised an outcome. The selection, refinement, and prioritisation of indicators is left to the organisations, which will tailor and implement the indicators according to their own strategic interests and needs and their existing assessment processes.
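To illustrate the process/output/outcome split and the idea that organisations select and tailor indicators, the sketch below tags a few hypothetical RAF-style indicators by domain and stage and filters them for a given assessment moment. The indicator names and the code are illustrative assumptions, not the published RAF.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """A hypothetical RAF-style indicator tagged by domain and stage."""
    name: str
    domain: str  # research | education | leadership | valorisation
    stage: str   # process | output | outcome

# Invented examples for illustration only.
catalogue = [
    Indicator("data management plan being maintained", "research", "process"),
    Indicator("dataset deposited in an open repository", "research", "output"),
    Indicator("dataset reused by another research team", "research", "outcome"),
    Indicator("open course under development", "education", "process"),
    Indicator("open educational resource published", "education", "output"),
]

def select(domain: str, stage: str) -> list:
    """An organisation tailoring the RAF might filter the catalogue like this."""
    return [i for i in catalogue if i.domain == domain and i.stage == stage]

# Assess a researcher on realised research outputs at a given moment:
for indicator in select("research", "output"):
    print(indicator.name)
```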

Highlights from the OPUS Consultation Meeting

The participants raised a number of important points in the consultation meeting, with regard to researcher assessment generally and to the RAF specifically:

  • Participants stressed the importance of qualitative assessment and cited the lack of homogeneity across countries and disciplines, as well as a failure to consider the global picture of skills development.
  • There was a call for clearer definitions of assessment criteria, especially in academic hiring, where unwritten rules and lack of transparency around the assessment procedure continue to create inequity.
  • We should acknowledge different types of personalities and contributions and embed these reforms into structural processes, such as habilitations, thesis evaluations, and postdoc funding decisions.
  • The framework should be applicable to all researcher levels from early-career researchers (R1-R2) to senior researchers (R3-R4) and be applied flexibly depending on the researcher’s career stage and context.
  • The term ‘valorisation’ is not well known to researchers and seems to include a mix of science communication, stakeholder engagement, and societal impact activities, although no better alternatives for the category were proposed.
  • Researcher contributions often grouped under ‘community service’ (such as peer review and mentorship) are not collected in a single category but are spread across the four main categories of activities in the framework.
  • Participants noted the distinction between supervising students (such as bachelor, master, and PhD students) which falls under the category of education versus supervising staff (such as PhD and other staff) which falls under the category of leadership in the framework.
  • The term ‘intersectoral engagement’ is not entirely clear and seems to consist of different types of engagement activities (such as interactions, mobility, and collaboration), yet it only applies across sectors, while intrasectoral engagement within the academic sector is also important.
  • The framework recognises actual effort by researchers and allows for the recognition of contributions by early-career researchers (such as proposal writing, project management, and peer reviews) which are often hidden.
  • The implementation of the framework and researcher assessment procedure should not be a burden on researchers and should be kept simple and automated as much as possible to support researchers in their assessment.
  • The framework is not directly linked to ResearchComp but could, in its implementation, be linked to the skills/competences recognised in ResearchComp, such as through the skills development activities recognised in the framework.
  • The framework should connect to the values and evolving priorities of the relevant organisations, so that the selection and refinement of indicators from the framework reflect these values and priorities.
  • Participants noted that the framework should directly link to the values and principles of the Coalition for Advancing Research Assessment (CoARA) as well as the evolving strategic interests and needs of the organisations.
  • The reform of research assessment may be impeded by national regulations which set predefined criteria for assessment and which must be followed and cannot be reformed by research-performing and research-funding organisations.
  • The framework could be supported by a set of guidelines to explain how to implement the framework as well as examples of possible implementation for research-performing and research-funding organisations.
  • The framework recognises a cross-cutting dimension for Open Science and could include a cross-cutting dimension to recognise teamwork and collaboration as well as equality, diversity, and inclusion.
  • The framework is not focused on industry but the inclusion of indicators which are relevant for industry could help researchers in developing themselves professionally and collaborating with or even transitioning to industry.
  • Candidates under assessment should be considered not only for the diversity of their research, education, leadership, and valorisation activities but also for their own specific skills/competences as well as their specific personalities.

Next Steps for OPUS

The first version of the RAF will be revised and updated according to the lessons learned by pilot organisations testing the RAF in the project as well as the feedback gathered during the consultation and will be published in the OPUS Project Zenodo Community.

Roundtable discussion on the Role of RPOs and RFOs in Open Science Monitoring

A Joint Session by: UNESCO, OPUS, OpenAIRE and OSMI

As Open Science frameworks proliferate across Europe and beyond, a critical question emerges: how do we translate monitoring policies into real, sustainable institutional practices?

In this high-stakes roundtable, five leading initiatives (each shaping the future of Open Science) will come together to explore the practical realities behind policy implementation. The session will address the pressures facing research-performing organisations (RPOs) and research-funding organisations (RFOs) as they navigate rising demands for accountability, transparency, and alignment.

Key topics include:

  • Institutional-level engagement with EU monitoring recommendations
  • Feasibility and cost concerns for implementation
  • Researcher engagement and resistance
  • Gaps between policy vision and operational capacity
  • Alignment with national and global monitoring efforts

The session will move beyond theory to surface lived experiences, assumptions, and barriers. Most importantly, active audience participation is strongly encouraged, especially from those in RPOs and RFOs. Your voice will help uncover what truly enables or hinders monitoring integration on the ground.

Event Details:

14:15 – 14:30
OPUS Introduction and Roundtable Discussion on the Role of RPOs and RFOs in Open Science Monitoring

14:30 – 16:30
The Role of RPOs and RFOs in Open Science Monitoring:
Connecting Institutional Practices to Policy Aggregation
Joint session by UNESCO, OPUS, OpenAIRE and OSMI
Moderated by: Gareth O’Neill (Principal Consultant on Open Science, Technopolis Group Belgium, an OPUS consortium partner)

Roundtable Contributors:

  • Manuel Ricardo Galindo Moreno, Science, Technology, and Innovation Policy, UNESCO
  • Natalia Manola, CEO, OpenAIRE AMKE
  • Laetitia Bracco, OSMI Co-coordinator, Université de Lorraine
  • Alina Irimia, Open Science Projects Coordinator at UEFISCDI (Romania), OPUS pilot RFO organisation
  • Nataša Jakominić Marot, Head of Research and Innovation at the University of Rijeka (Croatia), OPUS pilot RPO organisation

As Open Science monitoring frameworks multiply across Europe and globally, research-performing and funding organisations (RPOs and RFOs) are increasingly expected to implement, report on, and align with evolving expectations. However, the practical realities of translating policy into institutional practice remain complex.

This roundtable brings together contributors from international and European initiatives to reflect on institutional uptake of Open Science monitoring and researcher assessment frameworks. Each speaker will provide a brief input grounded in their context, followed by a moderated discussion addressing barriers, synergies, and lessons learnt.

16:30 – 17:00
Wrap-up and Closing
Presented by UNESCO

17:00
Networking & Cocktail Reception


Register also for the OPUS Final Conference: 9–10 July 2025

The OPUS (Open and Universal Science Project) Consortium will hold its Final Conference at the UNESCO International Institute of Educational Planning (IIEP) in Paris, bringing together key stakeholders to discuss the future of Open Science and responsible research practices in Europe.

For more on the OPUS Final Conference (9–10 July 2025), visit here.

Linking Open Science and Gender Equality: Report from the OPUS Interactive Workshop

Linking Open Science and Gender Equality: Comprehensive Analysis from the OPUS Interactive Workshop
This report examines the complex relationship between open science and gender equality based on insights from the OPUS online workshop “Gender Equality (GE) and Open Science (OS) in Research(er) Assessment (RA)”, organised by Vilnius University (VU) in cooperation with UNESCO on 27 May 2025.

This public online workshop was designed to raise awareness of the principal challenges surrounding OS and GE, and to explore potential solutions through open discussion with key experts, project partners, cohorts, and external participants: researchers, professionals, and experts in OS and/or GE, science policymakers, HR managers in research-performing organisations (RPOs), and programme managers in research-funding organisations (RFOs).

These objectives aligned closely with the broader goals of OPUS: to identify critical issues affecting OS and GE, and to propose actionable solutions at national, organisational, and individual/researcher levels.

The findings reveal that whilst open science can serve as a predictor for gender equality in organisations, the relationship is not always causal or unambiguous, being heavily dependent on cultural and institutional contexts. Key evidence demonstrates that when organisational culture is unfavourable to gender equality, open science initiatives may not predict positive changes and could even correlate negatively with equality outcomes.

Workshop Overview and Strategic Objectives
The workshop served as a platform for critical reflection on European Commission recommendations for research evaluation whilst identifying implementation barriers and co-developing practical recommendations.

A total of 136 stakeholders registered for the workshop, with 134 completing the accompanying questionnaire. Among the respondents, 103 were from RPOs, 14 from RFOs, and 17 represented other entities such as non-profit organisations, government ministries, or industry.

Over half of the participants (60%) reported experience with open access publishing, whilst around 49% had been involved in evaluating OS practices or publishing open data. However, a significant proportion, ranging from 21% to 58% depending on the specific activity, reported no experience with various aspects of OS.

Notably, 49% of respondents, many of whom came from RPOs, had no experience in developing open science policies. This finding is particularly concerning, as it highlights a substantial gap between the aspirational discourse surrounding OS, which emphasises democratisation, participation, and co-creation, and its actual implementation in practice.

Additionally, 58% of respondents lacked experience with OS tools (such as repositories), indicating a clear need for increased visibility, training, and support in these areas.

This finding is particularly relevant given that 35% of participants came from engineering and technology, and 19% from natural sciences — fields where gender disparities persist, as highlighted in the EU She Figures report. However, further analysis of the findings from a gender perspective was not possible, as 83% of the participants were female.

When asked about their plans to engage in OS activities in the near future, over 80% of respondents indicated an intention to continue with open access publishing, 75% planned to publish open data, and 68% aimed to participate in OS evaluation processes.

However, certain areas showed less engagement: 30% of respondents did not plan to create OS tools, 24% did not intend to develop OS policies, and 22% had no plans to conduct open and reproducible research.

Whilst these intentions reflect encouraging trends, without supportive research assessment frameworks in place, they may remain aspirational rather than actionable. Conversely, the “no” responses highlight areas where training, capacity building, and awareness-raising are needed to foster more balanced and comprehensive adoption of OS practices.

To address these gaps effectively, further investigation into the barriers at national and institutional levels is recommended. In addition, disaggregated analyses — considering gender and other social characteristics of institutional members (cohorts) — could help generate context-specific recommendations and inform targeted interventions.

Workshop Structure and Activities

The workshop began by presenting research findings and evidence for integrating GE into OS activities, discussing disciplinary and cultural variations in the area, and reflecting on global and European policy contexts.

Participants then joined three thematic groups where they shared their opinions and experiences related to ERAC recommendations on:

(a) How to foster the diversity of open research and innovation ecosystems, using a variety of evaluation methods and indicators to better recognise the diversity of research outputs and processes, as well as the diversity of researchers themselves;

(b) How to foster transparency in research evaluation as well as trustworthiness in the added value of OS and GE; and

(c) How to create a virtuous circle between training and evaluation in the context of OS and GE.

Primary Workshop Objectives
The workshop established four core objectives that guided all discussions and outcomes:
● Critical Reflection: Examining key recommendations for research evaluation to promote both open science and gender equality
● Barrier Identification: Mapping obstacles to implementation at policy, organisational, and individual levels
● Practical Development: Creating actionable recommendations for the European Commission and research organisations
● Integration Framework: Embedding openness, transparency, and equality into scientific practice

OPUS Project
The OPUS project, funded by the European Commission’s Horizon programme, employs a comprehensive three-tiered stakeholder engagement approach:
● Diverse Consortium: Comprising researcher organisations, research-performing organisations (RPOs), research-funding organisations (RFOs), industry organisations, and open science experts
● Community Validation: Extensive stakeholder engagement sessions to gather input and validate key deliverables
● Expert Oversight: Advisory Board providing specialist guidance and maintaining connections with the broader research community

Evidence Base: Complex Interactions Between Open Science and Gender Equality
Research Findings and Evidence Blocks
Raminta Pučėtaitė from Vilnius University presented three distinct evidence blocks that challenge simplistic assumptions about open science automatically promoting gender equality:

Block 1: Predictive Relationship
Open science can predict gender equality in organisations by enabling data-driven frameworks and creating more transparent processes for evaluating individual academic merits. This perspective emphasises how open science practices can reduce bias in evaluation systems through increased transparency and accountability.

Block 2: Context-Dependent Limitations
Cultural and institutional context plays a determining role in outcomes. When organisational or national culture is unfavourable to gender equality, open science initiatives do not predict positive changes; rather, they may be negatively interrelated with equality goals. This context-dependency suggests that open science alone cannot overcome deeply embedded cultural barriers.

Block 3: Limited Implementation Effects
Open science may have no effect on gender equality when implemented in restricted ways, particularly when understood solely as open access publishing without broader institutional transformation. This finding underscores the importance of holistic approaches addressing systemic issues.

Disciplinary and Cultural Variations
The workshop highlighted significant variations across different contexts:
● Disciplinary Patterns: Gender patterns in open science participation vary by disciplinary field and context
● Publication Behaviours: Women may publish more in open access journals in some settings, whilst men are more likely to share primary data in others
● Trust Factors: Willingness to engage in open science activities is shaped by interpersonal and cultural factors specific to each environment

Global and European Policy Contexts

UNESCO’s Global Perspective on Gender Equality

UNESCO representative Juliana Chaves Chaparro provided crucial global context, emphasising persistent gender equality challenges despite decades of policy attention:

Key Global Statistics:
● Only one-third of researchers worldwide are women
● Particularly severe underrepresentation in physics, artificial intelligence, and emerging technologies
● Women comprise only 12% of members in national science academies globally
● Women face greater challenges obtaining research grants and project coordination roles
● Severe underrepresentation in STEM fields across both developed and developing countries

The Persistent “Leaky Pipeline”:
The workshop highlighted how women’s participation in research decreases dramatically at higher career stages, with UNESCO data revealing concerning trends across the research ecosystem.

Human Rights Framework:
UNESCO’s approach frames participation in science as a fundamental human right rather than merely a policy preference, calling for:
● Ambitious measures to remediate past inequalities
● Equitable and open access to scientific literature and data
● Transparent, gender-sensitive performance appraisals considering all aspects of academic work
● Recognition of family-related career interruptions

European Union Policy Implementation

Aurelija Novelskaite from Vilnius University outlined the EU context, revealing significant implementation gaps:

Horizon Europe Gender Equality Framework:
● Mandatory gender equality plans introduced as eligibility criteria in 2020
● Integration of gender dimensions into research content and innovation processes
● Promotion of gender balance among researchers, expert groups, and decision-making bodies

Implementation Challenges:
● Many institutions treat requirements as “box-ticking exercises” lacking genuine implementation
● Gender equality plans often created as formal documents without substantive changes in organisations’ management practices
● Significant variation in implementation quality across member states

Critical Policy Gap:
Research conducted by the Gender Action project found that most European Commission policy documents focusing on open science fail to provide directions, recommendations, or guidelines for integrating gender equality considerations into open science activities.

Systemic Barriers and Implementation Challenges

Identified Barriers to Integration
Workshop participants identified several interconnected barriers preventing effective integration of open science and gender equality:

Policy Development Barriers:
● Low participation by researchers and funding organisation representatives in shaping open science policies
● Minimal involvement of practitioners in national and supranational research policy development
● Limited awareness of open science practices at early stages of the scientific life cycle

Institutional Implementation Barriers:
● Superficial compliance with formal gender equality plans without real action
● Cultural resistance in contexts where open science and gender equality are not mutually reinforcing
● Inadequate integration of gender equality roles in open science action plan development

Data and Evidence Barriers:
● Lack of disaggregated data hindering targeted interventions
● Insufficient sex- and gender-disaggregated indicators in assessment frameworks (a minimal example is sketched after this list)
● Limited monitoring of differential impacts across various groups and contexts
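
To make the notion of a disaggregated indicator concrete, here is a minimal sketch in Python using pandas. Everything in it is a hypothetical assumption for illustration: the records, the column names, and the choice of open access share as the indicator; it is not drawn from workshop or OPUS data.

```python
# Minimal sketch: a sex/gender-disaggregated open science indicator.
# All records, column names, and categories are hypothetical examples.
import pandas as pd

# One row per research output (hypothetical data).
records = pd.DataFrame({
    "gender":       ["F", "M", "F", "M", "F", "X"],
    "career_stage": ["ECR", "ECR", "senior", "senior", "ECR", "senior"],
    "open_access":  [True, False, True, True, False, True],
})

# Reporting the indicator per group, rather than as a single aggregate,
# is what makes differential impacts visible.
indicator = (
    records
    .groupby(["gender", "career_stage"])["open_access"]
    .mean()                      # share of open access outputs per group
    .rename("open_access_share")
)
print(indicator)
```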

Intersectional Challenges:
● Underrepresented groups beyond women face additional barriers
● Migrants, LGBTQI individuals, and people with disabilities encounter specific obstacles
● Multiple identity factors compound existing inequalities

Early Career Researcher Specific Barriers:
The workshop identified unique challenges facing early career researchers (ECRs):
● Vulnerability Concerns: Fear of retribution for critical open peer reviews
● Experience Limitations: Lack of experience in predicting research workflows for pre-registration
● Competitive Pressures: Concerns about exposure and vulnerability in competitive academic environments
● Resource Constraints: Prohibitively high time commitments required for open science practices

Comprehensive Recommendations Framework

Gender Action Project Integration Recommendations
The workshop examined specific recommendations developed by the Gender Action project, organised into five priority areas:

1. Gender Mainstreaming and Policy Synergies

  • Systematically integrate gender aspects across all prioritised policy areas
  • Involve more gender experts in open science activities
  • Ensure gender considerations are embedded in policy development processes

2. Knowledge and Awareness Building

  • Collect and analyse disaggregated data on participation by gender and other identities
  • Understand participation patterns and identify barriers
  • Develop targeted interventions to address inequalities

3. Evaluation and Assessment Practices

  • Explore how new metrics impact different genders across career stages, disciplines, and cultures
  • Adopt multi-dimensional, transparent evaluation criteria
  • Ensure transparency and openness in all evaluation practices and procedures

4. Research Practices Enhancement

  • Encourage sharing of preprints on gender-related research
  • Implement appropriate management of sex and gender data
  • Make gender-related research more visible whilst respecting privacy and ethical considerations

5. Innovation Process Development

  • Develop participatory innovation projects guaranteeing gender diversity
  • Integrate gender considerations into research content
  • Strive for equal distribution in research teams and leadership positions

Practical Implementation Strategies

Workshop breakout sessions identified concrete implementation approaches:

Institutional Integration Measures:

  • Involve gender equality officers in designing and implementing open science action plans
  • Develop sex- and gender-disaggregated indicators for all research assessment frameworks
  • Promote visibility of women’s achievements through targeted communication and success stories

Training and Support Programmes:

  • Provide targeted training for researchers and research evaluators on open science and gender equality
  • Address work-life balance and family-care reconciliation as key issues
  • Tackle self-confidence barriers among women and underrepresented groups

Collaboration and Networking:

  • Foster collaboration with civil society organisations, women’s groups, and policymakers
  • Use stakeholder mapping to identify and leverage supportive networks
  • Broaden the impact of gender equality initiatives through strategic partnerships

Project Management and Implementation Framework

Task Prioritisation Using Eisenhower Matrix

Based on workshop outcomes, implementation priorities can be categorised into four quadrants as follows (a brief illustrative sketch follows the lists):

Urgent and Important:

  • Addressing superficial compliance with gender equality requirements
  • Developing disaggregated data collection systems
  • Creating targeted support for early career researchers

Important but Not Urgent:

  • Building long-term cultural change initiatives
  • Establishing comprehensive training programmes
  • Developing intersectional policy approaches

Urgent but Less Important:

  • Updating existing policy documents to include gender considerations
  • Creating immediate communication campaigns
  • Establishing short-term monitoring systems

Neither Urgent nor Important:

  • Routine administrative updates
  • Non-essential documentation processes
  • Peripheral stakeholder engagement activities
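
To illustrate how such a prioritisation might be operationalised, here is a minimal Python sketch. The Task type, the example tasks, and their urgent/important flags are hypothetical simplifications of the lists above, not workshop outputs.

```python
# Minimal sketch: sorting implementation tasks into Eisenhower quadrants.
# Tasks and their urgent/important flags are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    urgent: bool
    important: bool

    @property
    def quadrant(self) -> str:
        if self.urgent and self.important:
            return "Urgent and Important (do first)"
        if self.important:
            return "Important but Not Urgent (schedule)"
        if self.urgent:
            return "Urgent but Less Important (delegate)"
        return "Neither Urgent nor Important (drop)"

tasks = [
    Task("Address superficial compliance with gender equality plans", True, True),
    Task("Build long-term cultural change initiatives", False, True),
    Task("Update policy documents with gender considerations", True, False),
    Task("Routine administrative updates", False, False),
]

# Most important and urgent work first.
for task in sorted(tasks, key=lambda t: (not t.important, not t.urgent)):
    print(f"{task.quadrant}: {task.name}")
```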

Collaboration Workflows and Governance

The workshop set out clear, role-specific responsibilities for implementation:

Research Performing Organisations (RPOs):

  • Review policies and implicit framings regarding competition versus cooperation
  • Implement employment policies addressing equity and equality
  • Create training opportunities and encourage institutional learning

Research Funding Organisations (RFOs):

  • Review the incentive structures built into funding mechanisms and procedures
  • Develop funding criteria recognising diverse research outputs

Policymakers:

  • Coordinate policy mechanisms supporting gender equality across all actions
  • Create strategies to counteract gender-based violence in research environments
  • Establish intersectional policy approaches addressing diverse backgrounds and career stages

Assessment Reform and Future Implications

Research Assessment Transformation

The workshop’s discussions have significant implications for ongoing research assessment reform efforts across Europe and globally. The Agreement on Reforming Research Assessment, signed by numerous European organisations since 2022, provides a framework for moving beyond traditional metrics-focused evaluation towards comprehensive approaches recognising diverse research outputs.

Evidence from OPUS Project Pilot Activities:

The OPUS project’s pilot activities in five organisations—three Research Performing Organisations (Nova University Lisbon, University of Rijeka, and University of Cyprus) and two Research Funding Organisations from Lithuania and Romania—are testing interventions and indicators integrating open science practices with broader assessment reform goals.

Critical Assessment Statistics:

Recent European Commission reports reveal urgent reform needs:

  • 98% of EU research publications fail to integrate a gender dimension
  • Women represent only 34% of EU researchers overall
  • Current assessment systems fail to incentivise comprehensive open science practices or inclusive research approaches

Moving Beyond Superficial Compliance
The workshop findings suggest that successful integration requires fundamental transformation rather than superficial compliance:

  • Cultural Transformation: Moving beyond “tick-box” compliance towards genuine institutional change
  • Structural Change: Addressing interconnected barriers facing researchers from underrepresented groups
  • Contextual Approaches: Recognising that different disciplinary and cultural contexts require tailored solutions

Future Directions and Policy Development

Immediate Policy Implications

As the OPUS project approaches completion in August 2025, the workshop outcomes will inform final policy briefs and recommendations:

Policy Framework Requirements:

  • Explicit attention to the intersection of open science and gender equality
  • A focus on cultural and institutional transformation rather than mere compliance
  • Assessment system redesign recognising the full spectrum of research activities

Implementation Support Focus:

  • Addressing systematic biases disadvantaging women and underrepresented groups
  • Moving beyond publication-focused metrics towards comprehensive evaluation approaches
  • Recognising collaboration, transparency, societal engagement, and diverse research contributions

Long-term Strategic Directions

Monitoring and Evaluation Systems:

  • Incorporate disaggregated data collection enabling understanding of differential impacts
  • Assess whether interventions achieve intended goals or exacerbate existing inequalities (a minimal check is sketched after this list)
  • Develop evidence-based approaches to policy refinement
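
A minimal sketch of such a check: compare a disaggregated indicator before and after an intervention and flag any widening gap. The grant success rates below are entirely hypothetical placeholders, not OPUS monitoring data.

```python
# Minimal sketch: did an intervention narrow or widen a gender gap?
# All figures are hypothetical placeholders.

# Hypothetical grant success rates (%), disaggregated by gender.
before = {"women": 18.0, "men": 25.0}
after = {"women": 21.0, "men": 26.0}

gap_before = before["men"] - before["women"]   # 7.0 percentage points
gap_after = after["men"] - after["women"]      # 5.0 percentage points

print(f"Gap before: {gap_before:.1f} pp; after: {gap_after:.1f} pp")
if gap_after < gap_before:
    print("The gap narrowed (intended effect).")
elif gap_after > gap_before:
    print("Warning: the gap widened; the intervention may be exacerbating inequality.")
else:
    print("No change in the gap.")
```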

Training and Support Programme Development:

  • Address specific needs of early career researchers and underrepresented groups
  • Recognise unequal distribution of barriers to open science participation
  • Implement targeted interventions addressing path dependencies limiting future participation

Assessment System Redesign:

  • Value the full spectrum of research activities whilst addressing systematic biases
  • Embed the move beyond publication-focused metrics, and the recognition of collaboration, transparency, societal engagement, and diverse forms of research contribution, as standard evaluation practice

Strategic Recommendations

The OPUS Interactive Workshop made clear that linking open science and gender equality requires substantially more than policy statements. The evidence presented demonstrates that genuine integration demands:

  • Context-Sensitive Action: Recognising that relationships between open science and gender equality vary significantly across cultural and institutional contexts
  • Genuine Implementation: Moving beyond superficial compliance towards fundamental cultural and structural transformation
  • Comprehensive Approaches: Addressing interconnected barriers rather than treating open science and gender equality as separate initiatives

Strategic Implementation Framework
Based on workshop findings, successful implementation requires a multi-level approach:

Macro Level Actions:

  • Embedding gender and diversity considerations at all stages of research assessment and policy development
  • Creating policy coordination mechanisms supporting gender equality across all research actions
  • Establishing evidence-based monitoring systems with disaggregated data collection

Meso Level Actions:

  • Recognising and rewarding a broader spectrum of research outputs and societal contributions
  • Implementing comprehensive institutional change rather than superficial compliance
  • Developing targeted support programmes addressing specific barrier patterns

Micro Level Actions:

  • Providing comprehensive training for both researchers and evaluators
  • Improving data collection and visibility of achievements across all groups
  • Supporting grassroots initiatives and peer-to-peer support networks

The workshop’s comprehensive analysis provides a robust foundation for advancing both open science and gender equality objectives through integrated, evidence-based approaches that address the complex realities of contemporary research environments whilst promoting meaningful institutional transformation.
