OPUS Final Conference: New Approach to Research Evaluation
The OPUS (Open and Universal Science) Consortium held its Final Conference at the UNESCO International Institute of Educational Planning (IIEP) in Paris on 9-10 July 2025, marking the culmination of a transformative journey to reform research assessment systems across Europe and beyond. This two-day event brought together researchers, policymakers, and institutional leaders to shape the future of Open Science and responsible research practices.
Welcome Remarks and Opening Perspectives
The conference started with welcome remarks from three distinguished leaders who established the importance of inclusive, responsible research ecosystems.
Corina Moya-Falcón, Senior R&D Project Manager at PLOCAN and overall Coordinator of the OPUS consortium, formally welcomed delegates to the closing event of the Open and Universal Science (OPUS) project. Her speech opened the first thematic block of the day and provided an authoritative recap of the consortium’s journey, achievements, and next steps.

Over the past three years, the OPUS consortium has worked tirelessly to rethink how research and researchers are assessed, promoting practices that are more open, transparent and responsible.
What began as an ambitious initiative has become a solid, evidence-based contribution to shaping the future of research assessment in Europe. Our general objective has been to coordinate and support measures that reform the evaluation of research outputs and careers within research-performing and research-funding organisations. To achieve this goal, we have brought together a diverse group of partners: universities, funding agencies, research institutes and specialist expert bodies from across the continent.
The project is organised around seven interlocking work packages, each addressing a critical facet of the open-science ecosystem. Work Package 1 mapped the state of the art, while Packages 2, 3 and 4 designed and tested interventions, defined indicators and metrics, and established robust monitoring processes within our pilot institutions.
Gustavo Merino, Director of UNESCO’s Social Policies Division, delivered keynote remarks positioning OPUS’s work within the global movement for equitable knowledge sharing.

Dr Merino, who has served as Director of Social Policies at UNESCO since January 2024, brought his extensive background in strategic planning, policy formulation, and international development to the conference. Previously serving in senior positions within the Mexican government, including Deputy Minister of Social and Human Development, and as Director of the Investment Center Division at the Food and Agriculture Organization (FAO), Merino emphasised UNESCO’s longstanding commitment to supporting Open Science as a driver of sustainable development and social progress.
Reflecting on 25 Years of the European Research Area: Insights from Rinske Van Den Berg
As the European Research Area (ERA) marks its 25th anniversary, it’s clear that the journey is far from over. Rinske Van Den Berg, Project Officer at the Research Executive Agency, shared her reflections on the evolving landscape of European research and the challenges that still lie ahead.

“The ERA agenda is not static. It evolves with the wishes of the Member States, who seek to further develop research, universities, research-performing organisations, and research funding organisations,” she explained. While some elements from previous ERA agendas remain, Van Den Berg emphasised that “these activities still require further implementation or alignment within the Member States. This is essential to facilitate researchers and higher education in Europe.”
Among the priorities, she highlighted two key areas: “Enabling open science and reforming research assessment remain central. We also continue to collaborate on connecting research infrastructures.” There’s a strong focus on gender equality and making research careers more attractive across the continent. “We are striving to connect activities from the European University Alliances with research and innovation, aiming to develop a fully aligned process within Europe. This involves linking local universities with regional and national authorities and connecting these to European policy decisions, forming a coherent triangle,” she said.
Looking ahead, Van Den Berg noted, “There are new actions on which the ERA will focus in the coming years. You will see further developments in areas such as equity in open science, which will require more attention, and in policy and science-for-policy ecosystems, as mentioned by Gustavo Merino of UNESCO. Another important area is the impact of artificial intelligence on the work and careers of researchers.”
She was quick to point out that none of this progress would be possible without the often-unsung heroes of research management. “All this work cannot be achieved without the support of research managers. Many of you are working hard on projects to improve the careers of research managers. From my experience in the Commission, about 80% of the people we met in meetings were not researchers themselves; they simply do not have the time. It takes a whole career to understand the policies developed in Brussels, the calls that are issued, and how to write proposals. This is not usually the work of the researcher alone, but of a dedicated team.”
“Behind every great researcher, in my view, there is a great team of people who assist, especially in matters of open science and research assessment. I am pleased that this remains a focus and is renewed on the agenda. We will continue to work together with our partners to improve the careers of research managers.”
Van Den Berg also touched on the fluidity of career paths in research. “Who in the room considers themselves a research manager rather than a researcher? I see that about 25% of you do, and that is telling. Researchers often move between roles during their careers: starting as master’s students, completing PhDs, teaching at university, perhaps taking time for family, managing teams, and sometimes returning to research. There is an overlap between the roles of researcher and research manager within individuals, and we must strive to make flexible careers possible, supported by the right assessment systems.”
On the policy front, she spoke about ongoing efforts to develop the EOSC Federation, which will “enable the exchange and reuse of data, increasing the amount and productivity of FAIR research data in Europe. It is important that results are accessible, archived, and reusable, so we do not reinvent the wheel.”
Copyright and data ownership are also on the agenda. “The ERA policy agenda also seeks to develop ideas on copyright, particularly as it relates to open data. Researchers should be able to own their research where necessary. The impact of open science policies and practices on platforms such as the Open Science Intelligence Platform is also being considered.”
Van Den Berg identified the reform of research assessment and open science as the two main topics in the current ERA policy agenda. “The focus for OPUS partners in the coming decade will be on these areas, especially as we approach the final years of Horizon Europe and move towards Framework Programme 10. We are supporting and mapping changes in national frameworks and connecting them with individual research institutions. It is important not only to develop policy but to ensure real connections between institutional, national, and European developments.”
She concluded by stressing the importance of policy input from the research community: “It is very important for us, and especially for my colleagues in the Commission, to receive your policy input. I could not have presented these slides without your contributions. Policy is developed from all regions and institutions involved in research, innovation, and assessment. We want to hear from projects like yours about your evidence, what you have demonstrated, and what measures you would like us to consider in future work programmes. I appreciate that your policy briefs are integrated into your communication materials for OPUS.”
“One of the aims when we started this project was to be practical, ensuring the policy brief would help us draw together all the threads and learnings, so we can provide feedback to you and our colleagues at the Commission and UNESCO.”
As the ERA continues to evolve, it’s clear that collaboration, adaptability, and input from across the research community will remain vital to shaping the future of European science.
Main Outcomes and Key Exploitable Results
OPUS Researcher Assessment Framework (RAF)
The centrepiece of the conference was the presentation of the OPUS Researcher Assessment Framework (RAF) by Gareth O’Neill from Technopolis Group Belgium (TGB) and Emma Day from Vitae. O’Neill, Principal Consultant on Open Science at Technopolis Group, specialises in the implementation of Open Science across Europe, particularly focusing on sustainable careers for researchers and research assessment reform.

Gareth O’Neill (TGB) on the OPUS Researcher Assessment Framework (RAF)
One of the main focuses was developing a framework of indicators to incentivise and reward open science. However, as we examined these indicators, it became clear that they are only one piece of the puzzle. Open science activities are not separate from other research activities; researchers are never solely engaged in open science, nor are they rewarded only for that. Therefore, we had to take a step back and consider the full range of activities researchers undertake, what they are currently assessed for, and what we would like them to be assessed for in the future—whether in project applications, job applications, or during ongoing employment. The open science component should play a major role, but it is not the only aspect.
We have not worked in isolation; there have been many steps to reach this point, and there are ongoing efforts. Comments on assessment and open science were present in the original Charter and Code for Researchers, but the revised version is more explicit. We have considered the DORA principles, the Leiden Manifesto for research metrics, and the work of the Open Science Policy Platform. Many colleagues, including Rinske and Rene, contributed to these efforts, particularly on indicators for research assessment.
There are numerous principles, such as the Hong Kong and COARA principles, so the question is which ones to use. The Coalition for Advancing Research Assessment is a major initiative aiming to reform research assessment and align organisations in a common direction. With over 750 members, it is challenging to reach agreement, but shared goals exist. We have monitored these principles and many available indicators as we developed our framework.
We established our own set of principles for the framework. We did not delve deeply into organisational values and principles, as these are usually already defined. Our focus was to provide indicators to support assessment and open science, essentially creating a toolbox. Our aim was to develop a comprehensive set of indicators, enough to capture all researcher activities, but not so many as to become unmanageable. We sought a balance, avoiding an overwhelming list of indicators.
The framework is designed to be applicable across countries, disciplines, and types of organisations, whether research-performing or research-funding. It is agnostic to context and can be applied at various levels. Importantly, we are not dealing with metrics per se, but with indicators that can be used qualitatively or quantitatively. Organisations are free to determine how to deploy them; we believe assessment should combine responsible quantitative metrics with clear qualitative guidance.
Our focus is predominantly on the assessment of individual researchers, not research quality or peer review. However, as research funding organisations often work on a project or principal investigator basis, the framework is also scalable to projects. It is not intended for monitoring large groups, faculties, or entire universities, though it could be adapted for such purposes if desired.
We sought to capture all main activities undertaken by researchers, not just publications or publication metrics such as journal impact factors, citations, or the h-index. While these remain important, we want to provide organisations with more options.
The research assessment framework considers all researcher activities, with a specific lens for open science; the two are sides of the same coin. Indicators are separated into processes, outputs, and outcomes. A researcher may be at different stages: undertaking an activity (process), producing a tangible result (output), or achieving short- to medium-term outcomes. We do not focus on long-term impact.
Indicators are phrased to be clear yet flexible, allowing organisations to interpret and adapt them. Organisations can select, refine, merge, or subdivide indicators as needed. We do not prioritise indicators; any ranking or weighting is left to the organisation. The result is a flexible framework.
Our final version identifies four key domains of researcher activities, tested with researchers:
- Research: Developing proposals, methods, methodologies, protocols, data (planning, collection, management, curation, archiving), software, publications, materials (laboratory specimens, artistic objects), and reviews (of publications, data, and proposals).
- Education: Developing course programmes, resources, teaching, supervision (bachelor’s, master’s, PhD students), and skills development.
- Leadership: Demonstrating leadership or management (staff, projects), holding organisational roles (team leader, institute director), participating in committees, advisory boards, panels, and receiving awards.
- Valorisation: Science communication and popularisation (written, physical, online, recorded), engagement (industry, public sector), exploitation of research results (intellectual property), and entrepreneurship.
The framework includes two cross-cutting dimensions:
- Openness: The Open Science Career Assessment Matrix (OSCAM2) applies an open-science lens across all activities.
- Teamwork/Collaboration: Recognising that most research is collaborative, though we have not created a separate framework for teamwork.
Equality, diversity, and inclusion is an overarching consideration, though not directly embedded in the indicators.
For example, in the domain of data, the process may involve developing a data management plan. Outputs are the completed plan or dataset. Outcomes include implementation, publication in repositories or journals, and indicators such as access, citation, or reuse. These can be measured quantitatively or qualitatively. Whether these steps are open depends on institutional policy.
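To make this structure concrete, here is a minimal sketch (our own illustration in Python, not code published by OPUS) that encodes the data example above as process, output, and outcome indicators. The domain and stage names come from the framework; the field names and the selection logic at the end are assumptions for illustration only.

```python
from dataclasses import dataclass
from enum import Enum

class Domain(Enum):
    RESEARCH = "research"
    EDUCATION = "education"
    LEADERSHIP = "leadership"
    VALORISATION = "valorisation"

class Stage(Enum):
    PROCESS = "process"   # activity underway
    OUTPUT = "output"     # tangible result produced
    OUTCOME = "outcome"   # short- to medium-term effect

@dataclass
class Indicator:
    domain: Domain
    stage: Stage
    description: str
    open_science: bool = False  # the OSCAM2 "openness" lens
    # How an indicator is read (quantitatively or qualitatively)
    # is left entirely to the deploying organisation.

# The data example from the text, expressed as three indicators.
data_indicators = [
    Indicator(Domain.RESEARCH, Stage.PROCESS,
              "Developing a data management plan", open_science=True),
    Indicator(Domain.RESEARCH, Stage.OUTPUT,
              "Completed data management plan or dataset", open_science=True),
    Indicator(Domain.RESEARCH, Stage.OUTCOME,
              "Dataset published in a repository; accessed, cited or reused",
              open_science=True),
]

# Organisations select, refine, merge or subdivide indicators as needed;
# the framework itself imposes no ranking or weighting.
open_subset = [i for i in data_indicators if i.open_science]
```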
These indicators are intended as building blocks; organisations already have frameworks and systems in place. Our indicators and ideas can help them revise and reform their systems to be more comprehensive and inclusive of open science. Deployment is left to each organisation.
Emma Day (Vitae) on the interventions
Our task was to consider interventions for implementing the framework. There are many principles and frameworks in this area, but they mean little without action.
We debated what interventions were needed and, in the end, developed ten general interventions for any organisation looking to implement the frameworks. These are starting points to consider before implementing either OSCAM2 or the RAF:
- Collect existing best practices in researcher assessment reform, highlighting existing initiatives and commitments.
- Ensure researchers understand the reasons for reforming researcher assessment.
- Ensure researchers, support staff, and policymakers are aware of the RAF and its implications, without overburdening them.
- Approve piloting or adoption of the RAF at the highest organisational level.
- Develop policies and procedures to implement, monitor, and evaluate the RAF, recognising that initial indicators may need adjustment.
- Develop or utilise tools and platforms to support RAF implementation, assessing whether existing infrastructure suffices.
- Ensure the RAF does not overburden researchers with additional tasks or reporting requirements.
- Revise organisational policies and procedures based on lessons learned, embedding the framework into broader policy.
- Engage policymakers to define and revise national policies on research assessment.
We identified four categories of specific interventions: policy, awareness, training, and resources.
Initially, we created a lengthy document linking interventions to every indicator, but pilot feedback indicated this was too burdensome. Instead, we now recommend policies and training that encompass all indicators at a higher level.
At the policy level, the process is to approve piloting or adoption of indicators and interventions at the highest level, define what will be piloted or adopted (as definitions vary between institutions), develop policies and procedures, and then monitor and evaluate.
Resource interventions concern the people and infrastructure required, financial input, platforms, repositories, and staff. Without adequate resources, implementation will not succeed.
Awareness interventions address whether people understand why these changes are happening, how to make them happen, and whom to approach for assistance. It is important that all relevant staff, not just researchers, are aware.
Training interventions ensure people know what to do, how to do it, and why. Training may be linked to frameworks such as ResearchComp or the Researcher Development Framework, and should focus on developing the necessary skills and competencies.
Gareth O’Neill (TGB) on work with pilot organisations
Our framework was tested by pilot organisations, who did significant work. We had three research-performing organisations (Nova University of Lisbon, University of Cyprus, and University of Rijeka) and two national funding agencies: UEFISCDI in Romania and the Research Council of Lithuania. Each created cohorts of researchers, selected indicators from the framework, and deployed them with their own interventions.
Within universities, implementation varied by institute. For example, at Nova, the framework was used in the Global Health and Tropical Medicine Research Centre and the Information Management Research Centre, which have different evaluation criteria and focuses. This demonstrates the need for flexibility. In Cyprus, it was piloted at the KIOS Research and Innovation Centre of Excellence; at Rijeka, at the Faculty of Law. The funding agencies applied it to specific programmes, which are complex and evaluated over many years. For example, one agency mandated data management plans and assessed how to implement these fairly and openly.
Work with Pilot Organisations through Pilot Action and Mutual Learning
Raquel Vega from the Young European Research Universities Network (YERUN) introduced the pilot action segment, framing it as an opportunity for mutual learning and practical experimentation. Vega, who serves as Policy and Project Officer at YERUN, emphasised the collaborative effort that characterised the OPUS pilots and the importance of context-specific solutions.

The pilot programme involved five organisations, three Research Performing Organisations (RPOs) and two Research Funding Organisations (RFOs), who tested the RAF through tailored action plans and co-created strategies for meaningful reform. Each pilot organisation brought unique perspectives and challenges to the implementation process.
Raquel Vega: Pilots have done all the work within this work package, working extremely hard
The main goal of WP4 was to design and implement action plans to pilot selected elements of the Research Assessment Framework (RAF), as well as the Open Science Career Assessment Matrix 2 (OSCAM2), through their accompanying interventions in five organisations. In addition, we organised mutual learning sessions to draw out results and lessons learned.
The main protagonists of WP4 are the pilot organisations themselves. They have done all the work within this work package, working extremely hard. We had three universities and two funding organisations participating as pilots.
Our work followed three main lines. First, we designed the action plans. Second, we implemented and coordinated these action plans over an 18-month period. Third, we collected lessons learned from mutual learning sessions held throughout these 18 months.
To begin with the design and implementation of the action plans, the pilots had to define, very early on, their cohorts and the units or funding programmes involved. Together with the leaders of Work Packages 2 and 3, we selected the indicators and interventions that best fit each pilot, both from a strategic and feasibility perspective. It is important to mention that, in Work Packages 2 and 3, we also carried out a baseline audit to determine the starting point for each pilot.
The full implementation period lasted 18 months. We began with the first version of the action plans, which we started implementing in January 2024. The pilots ran until June 2025. During the first nine months, we observed what was working and what was not, and we collected those learnings into a revised version of the action plans, which was ready by September 2024. This marked the end of the first stage of implementation.
In addition to the individual interventions chosen by each pilot, we also introduced some common interventions for all pilots. These focused on the links between open science and trust, open science and industry, and open science and gender equality.
Regarding mutual learning, the key process was the bi-monthly mutual learning sessions, which took place both online and in person. These sessions were essential for collecting challenges and recommendations from each pilot organisation regarding their implementation experiences.
From January 2024 until approximately September 2024, we collected all these learnings and, by the end of November 2024, produced the first version of the Mutual Learning collection of recommendations and challenges. Over the following nine months, the second stage of implementation, we added further learnings from this later period. We are currently working on the final mutual learning collection of lessons learned.
We monitored progress through a process we called co-monitoring, which we conducted together with the pilots. During the first part of the implementation period, the first nine months, pilots submitted monthly reports. During the second part, reports were submitted bi-monthly. These reports were reviewed by our partner Jisc, led by our colleague Helen Clare. The reports were also instrumental for the mutual learning sessions, as they informed the guiding questions for the various sessions we held.
Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI) – Pilot Action (Romania)
Alina Irimia, Open Science Projects Coordinator at the Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI), presented Romania’s comprehensive approach to Open Science implementation. Irimia showcased UEFISCDI’s strategic alignment with Romania’s National Open Science Strategy, including the implementation of FAIR Data Management Plans.

We tested the OPUS Research Assessment Framework and selected one indicator to implement and test. Ultimately, this turned into an actual implementation, specifically around introducing Data Management Plans (DMPs) into a major national funding instrument in Romania.
The timing of the project was very fortunate, as it coincided with the need to implement a national strategy on open science. These are the first steps in putting that strategy into practice, which states that DMPs should be mandatory for all publicly funded research.
We tested this requirement within a major funding programme, which has a substantial budget of 320 million euros. This programme is being launched for the first time in Romania. The evaluation phase has just concluded, and we expect to fund approximately 2,024 projects. These projects will have a duration of four to five years, with the possibility of extension by up to two years.
The main interventions focused on implementing these new requirements and evaluation criteria within this funding instrument. We had to introduce these requirements into the information packages for the funding instrument, which raised several challenges. At the institutional level, we were not accustomed to such requirements, and our staff were not fully prepared to implement open science-related obligations.
We needed to raise awareness among our staff, management, and also the Ministry. Although the Ministry had approved the national strategy, it was necessary to inform them about how these requirements would affect funding streams. As a national funding agency, we are not fully autonomous and depend on ministerial decisions regarding funding, so it was essential to have their support for implementation.
There were both policy and resource-related interventions. For example, we needed to develop or adapt a DMP template; in the end, we used the Horizon Europe template and the Science Europe guidelines, adapting them to our context. We provided resources to support both researchers and staff, including guidelines, a blueprint for researchers, and a reviewers’ guideline for staff, translated from the Science Europe guidance, to use when assessing DMPs. We also provided examples of open DMPs to support researchers, who, to be honest, are already overburdened by this new requirement. Considerable awareness-raising was needed.
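For illustration, a reviewer-facing checklist along these lines might be encoded as follows. The section headings are paraphrased from the public Horizon Europe DMP template that UEFISCDI adapted; the helper function and the pass/fail logic are purely our hypothetical sketch, not the agency’s actual tooling.

```python
# Hypothetical DMP review checklist, loosely following the section
# headings of the public Horizon Europe DMP template.
HE_DMP_SECTIONS = [
    "Data summary",
    "Making data findable (metadata, identifiers)",
    "Making data accessible (repositories, access conditions)",
    "Making data interoperable (standards, vocabularies)",
    "Increasing data re-use (licences, documentation)",
    "Other research outputs",
    "Allocation of resources",
    "Data security",
    "Ethics",
]

def review_dmp(answers: dict[str, bool]) -> tuple[bool, list[str]]:
    """Return (complete, missing_sections) for a submitted DMP.

    `answers` maps each section name to True if the reviewer judged
    it adequately addressed.
    """
    missing = [s for s in HE_DMP_SECTIONS if not answers.get(s, False)]
    return (not missing, missing)

# Example: a DMP that has not yet addressed interoperability.
ok, gaps = review_dmp({s: True for s in HE_DMP_SECTIONS
                       if "interoperable" not in s})
print(ok, gaps)  # False ['Making data interoperable (standards, vocabularies)']
```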
We organised a national conference on research assessment and open science, held institutional awareness sessions, and arranged hands-on meetings with staff to discuss specific challenges related to these new requirements. In June, we also held a nationwide webinar for researchers on DMPs. This training session attracted many engaged participants, demonstrating both interest and a need for support.
The indicator we achieved in OPUS was the development of FAIR DMPs. However, regarding the DMPs actually produced in the successful projects, we will only have those results after the projects commence, as the evaluation has only just concluded.
The key outcome of this pilot is that we have moved from testing to full implementation within a major funding instrument. The awareness-raising efforts have been a significant success, leading to the extension of these requirements, and more broadly, open science requirements, to other funding instruments. We are now discussing making these practices and requirements standard for future funding programmes.
There were several challenges. Integrating these requirements for the first time led to changes in our internal processes. We needed to raise awareness and work hands-on to update processes, including IT updates and discussions about future IT needs. There was also a challenge in understanding the open science indicator itself. My institution is fortunate to have an open science team, but many funders do not have such expertise. We supported staff in understanding open science and coordinated with other departments. We also had to understand the funding streams and the impact of open science requirements on these streams.
Policy and process integration presented further challenges, particularly in securing decisions to implement the changes. There was cultural resistance from policymakers, stakeholders, and researchers. There is a lack of capacity and skills related to open science and FAIR DMPs, and we still need to work on building institutional capacity and providing support and training sessions for researchers.
Staff upskilling is essential, as is providing tools and training for researchers regarding DMP adoption. The availability of European-level guidelines and templates, such as those from Science Europe, was very helpful, and we simply adapted them to our context.
A major challenge was coordinating timelines between project requirements and the timelines of the funding streams in our national programme. Delays occurred that were beyond our control, and we had only 18 months to complete all this work. However, these efforts will continue beyond the project’s duration.
Research Council of Lithuania (RCL) Pilot Action
Julija Baniukevic from the Research Council of Lithuania (RCL) presented an exemplary community-driven approach to Open Science implementation. Dr Baniukevic, who serves as OPUS Project coordinator in Lithuania, has extensive experience in bionanotechnology research, policy advocacy, and European science cooperation.
The Lithuanian pilot implemented a comprehensive training programme over nine months, reaching 21 principal investigators from various research fields who subsequently trained 211 researchers in their teams. The programme achieved remarkable engagement levels, with 91% attending on-site training and 20 out of 21 trainers conducting sessions for their teams. Participants showed strong commitment, with 68.2% of research outputs made openly accessible and 82.6% of participants initiating open science dialogues in their institutions.
Julija Baniukevic (RCL): Why did we achieve such good results in just nine months?
When we were reorganising our approach, we had only nine months to achieve our goals, so we had to consider carefully at which level to act. We decided to focus on principal investigators (PIs) and selected one call with 136 winners. All were invited to participate; 46 expressed interest and attended the introductory session in September. Out of these, 26 began the programme, 23 continued throughout the year, and 19 completed the full nine-month programme and received certificates.
I was pleased that six Lithuanian institutions participated: four universities and two research institutes. We covered almost all research fields, with psychology representing about 18% of the participants, which was particularly interesting, though other fields were also well represented.

Recognising that you cannot achieve anything alone, I formed a team with support from our vice-chair, Vaiva Prudokinia, and established an Advisory Board within the Research Council. I was joined by three colleagues, Justina, Agla, and Girinta, from different departments: the Research and Business Cooperation Unit, the Researcher Support and EU Investment Unit, and the Research and Higher Education Policy Analysis Unit. Having diverse perspectives was crucial, and without these colleagues, our achievements would not have been possible.
Let me turn to our results. Our researchers, along with others trained during the OPUS pilot, participated in a nine-month programme. We selected five indicators: two in education, two in research, and one in valorisation.
Starting with education, we identified a common misunderstanding among researchers, who often equate open science solely with open access. We wanted to broaden their understanding, so we began with courses on open science. Initially, I was unsure whether these would be well received, but 91% of participants attended the on-site training, which was a pleasant surprise and made me proud of our researchers.
Over two days, we covered many topics, which was a great success and helped us build stronger connections with participants. After the initial training, participants received certificates as trainers. For the next four months, these trainers delivered training sessions to their own teams and laboratories. In total, 20 of the 21 PIs trained 211 researchers, sharing knowledge and experiences. Only one person did not deliver the training, as she had not completed the full programme.
This group of researchers was enthusiastic about sharing their knowledge. Three trainers even organised a full-day conference on open science, which attracted 70 participants. We also involved one trainer in the Open Data Forum, organised by my colleague Girinta, where she represented both the OPUS project and open science in panel discussions. At our final event in June, eight cohort members shared their experiences, initiatives, and perspectives on open science, including the challenges they encountered.
These activities were a significant success, but we also identified some challenges. For example, some researchers mentioned a lack of compensation, so we are considering mechanisms for micro-rewards. There is also a lack of institutional recognition for public engagement activities, which we hope will improve. Limited understanding of the open science concept was evident, but over the nine months of our ambassador programme, we saw that researchers are eager to develop new skills and adapt open science practices to their own systems.
We also observed that personnel changes can impact pilot implementation, which is something to consider in future projects.
Turning to research indicators, we monitored the number of openly available publications. We had many discussions about what constitutes open access, including whether embargoes should be counted. In total, the cohort produced 85 publications, of which 58 (68%) were openly available. Interestingly, about 42% of researchers published exclusively in open access, while others published more than half of their work openly, though some published less. This is an interesting result, even if our sample size is small.
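For readers who want to reproduce the headline figure, the calculation is straightforward. In the sketch below, only the cohort totals (85 publications, 58 openly available) come from the text; the per-PI sample numbers are invented to show how one aggregate share can mask very different individual publishing profiles.

```python
# Cohort totals reported above; only these two numbers come from the text.
total_publications = 85
openly_available = 58
print(f"Open access share: {openly_available / total_publications:.1%}")  # 68.2%

# Invented per-PI sample (total, open) for illustration only.
sample = {"PI-A": (6, 6), "PI-B": (5, 3), "PI-C": (8, 2)}
for pi, (total, open_count) in sample.items():
    print(pi, f"{open_count / total:.0%} open access")
```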
The fifth indicator was openly available data management plans (DMPs). We prepared recommendations for researchers, which will be added to our website. In collaboration with the Ombudsperson and her team, especially Reda, we analysed what needs improvement and how to enhance our management plans. All ten researchers agreed to make their DMPs openly available, and by the end of July, these will be published on the Research Council’s website in a dedicated section.
However, we still face issues. For example, when researchers submit proposals to the Research Council, they are required to include a DMP, but there is currently no follow-up on quality and implementation. We are working on how to address this, especially now that researchers understand the value of DMPs.
Dedicated funding for open science tasks would be very beneficial. Some researchers still view open science as additional bureaucracy, and there can be a disconnect between open science and researchers’ daily work. We need to demonstrate the advantages and relevance of open science more clearly.
Recognition and rewards are important. At the start, researchers wanted to know what they would receive for their efforts. We awarded certificates for trainers and for completing the nine-month ambassador programme. We also wanted to give them a sign of their ambassador status, but internal bureaucracy has delayed this. Nevertheless, participants gained visibility, and they appreciated small gestures such as homemade cakes and personalised awards.
Reflecting on why we achieved such good results in just nine months, I believe it was due to a clear action plan, a strong team, relevant topics, and an engaged community, all supported by the RCL administration and leadership. We see that an open science culture is beginning to take shape in Lithuania. Researchers themselves are now engaging in constructive dialogue and helping to shape open science policy at the national level. Having RCL experts from different departments was a strategic and crucial step, and the OPUS community has become a key driver of change in open science within the Research Council and across Lithuania.
University of Rijeka Centre for Research and Innovation (UNIRI) Pilot Action
Nataša Jakominić Marot, Head of the University of Rijeka Centre for Research and Innovation (UNIRI), presented the Croatian pilot’s comprehensive approach to Open Science integration. With over two decades of experience in driving innovation and research within academia, Jakominić Marot coordinates teams to achieve impactful results whilst contributing to meaningful societal change.
The University of Rijeka pilot focused on implementing Open Science initiatives across research centres, emphasising practical applications in data management, software development, and publication practices. The university’s approach included developing an Open Science Guide, designating library staff as reference points, and selecting specific repositories for different research outputs. While achieving progress in areas like open access publications and software sharing, the pilot encountered challenges including delays in national policies and difficulty retaining early career researchers.
Nataša Jakominić Marot (UNIRI): We built on existing infrastructure, such as the newly established centres
At the University of Rijeka, we have open science experts at the university library, librarians and research managers, whom we consult when designing or implementing open science activities. These experts also contributed to the development of our open science policy. The university library has a dedicated unit, the Centre for Open Science and Information Data Management, which was crucial for OPUS and other initiatives. Slightly before the project, we established the Centre for Science Outreach, which played an important role in valorisation activities. We wanted to boost these activities and make them more central within the university.
Our pilot cohort comprised 17 early career researchers based at the Faculty of Law, focusing on social sciences. Although the pilot was designed for this specific group, all activities, except those provided specifically by the librarian at the Faculty of Law, were open to any early career or other stage researchers at the University of Rijeka. We saw no need to restrict participation, but when measuring success and tailoring activities, we focused on our project cohort.
The key actions we undertook were aligned with the four domains presented by Gareth and Emma: research, education, valorisation, and leadership. Our activities did not cover leadership. Some activities were relevant to more than one domain, and I will highlight what the three units, the Faculty of Law, the Science Outreach Centre, and the University Library Centre for Open Science, implemented. The circular symbol on our materials indicates activities that will be sustained in the future, as they have generated significant interest and will become regular university activities.

The Centre for Open Science at the University Library began organising Open Science Cafés. We have held six so far, with a seventh planned soon. These cafés address topics requested by early career researchers, and are delivered by our librarians or external experts. We decided to hold these events online, based on feedback from the cohort, to save travel time and allow greater participation. This was a change introduced during the project.
We also provide twice-yearly training on open science skills, which will continue in the future. These comprehensive sessions cover six modules on various aspects of open science and have received positive feedback. The University Library also created a targeted web page specifically for early career researchers, featuring information on training, calls for proposals, and opportunities relevant to open science. This resource has been highly valued by our cohort.
At the Faculty of Law, we have a dedicated librarian who is both a researcher and research manager. She complemented the university library’s training by providing individual sessions or consultations as needed. She delivered training on open science publication skills and, importantly, on repositories. Although we have national and institutional repositories, they can be difficult to navigate. She worked individually with early career researchers to train them in their use, saving them time and facilitating their work.
We originally planned for 65–70% of our cohort to be trained in repositories, but as the university issued a call for funding proposals that required repository use, all cohort members required this training. This demonstrates that when a skill becomes necessary, researchers prioritise it, and we achieved more than anticipated.
For valorisation, the Science Outreach Centre led two types of activities: awareness-raising for researchers and public speaking. We trained researchers on the importance of outreach and on using the project repository. We also provided training in outreach activities, recognising that researchers need to develop these skills. An expert delivered two outreach training sessions during the project.
We built on existing infrastructure, such as the newly established centres, to maximise their value for the university and to embed activities into long-term institutional goals. A key advantage was having a core team member, Sasha, who was formerly vice director. This ensured full support from university management, which is essential for success.
We also created a repository for outreach activities, though it was developed by non-experts due to limited funds. In future, we hope to involve more experts and allocate funds for such work. We used common tools such as YouTube and WordPress, as we could not afford dedicated IT support.
There were many challenges and lessons learned. For any organisation wishing to implement similar activities, I recommend careful consideration of these lessons. Collaboration with the three university centres was vital; without them, we would not have achieved our goals. We assigned partial working hours to library staff for OPUS and open science, but their main duties remained library work, making it difficult to dedicate sufficient time to the pilot. In future, it would be beneficial to have dedicated units and explicit time allocations in contracts, as well as support and incentives for staff. Enthusiasm is important, but it is not enough for sustained, long-term activities.
Open science is known in Croatia, but practices vary greatly between disciplines. Training is available, but not mandatory, so implementation often depends on the team leader or principal investigator. We aimed to change this by focusing on early career researchers, believing that bottom-up influence can be more effective than targeting senior staff.
From the outset, we involved all relevant stakeholders, including the cohort and centre staff, and designated a contact person in each unit. We held an initial meeting to explain roles and reporting, and maintained regular monthly contact, sometimes more frequently, to gather feedback and adjust as needed. This continuous risk mitigation strategy contributed to our success.
For early career researchers, we addressed the lack of tailored support by creating the Edu Doc web page, a one-stop shop for information, contacts, and resources. In terms of rewards, we could only offer certificates, but I strongly recommend more generous recognition in future projects. We hope to introduce more incentives, and our colleague Sasha has worked to influence national research assessment frameworks, advocating for the inclusion of both qualitative and quantitative measures and for open science. We are not yet certain of our impact, but we believe our voice is being heard.
Sustainability was a priority. The timing was ideal, as we had just joined CoARA and were developing our CoARA Action Plan and renewing our HR Strategy for Researchers, which required new analyses and action plans. We have also continued to apply for projects, including the CoARA Boost project with the University of Cyprus, and are now in the implementation phase.
In summary, even if activities are not mandated at the national level, institutions can take action. One should aim for evolution, not revolution: small steps can lead to meaningful change. Academic cultures are difficult to shift, but dialogue and communication are essential. Our activities have increased the university’s visibility nationally and internationally.
University Nova of Lisbon (UNL) – Pilot Action
Isabel L. Nunes, Vice-Rector of NOVA University Lisbon responsible for Strategic Planning and Quality Management, demonstrated how the RAF guides institutional policy design and provides concrete support to researchers.

NOVA University Lisbon’s implementation focused on translating the RAF into practical institutional policies and procedures. The university’s approach emphasised providing concrete support to researchers whilst ensuring alignment with European initiatives and national strategies.
Creating a comprehensive open science strategy for the University will be essential to ensure the sustainability of these practices, which was one of our main goals.
Isabel L. Nunes, NOVA University Lisbon: We tested and adapted open science practices in different research contexts
To meet our objectives, we implemented our pilot across two distinct research centres, as already mentioned by Gareth: the Global Health and Tropical Medicine Research Centre and the Information Management Research Centre. These centres have different disciplinary profiles, ensuring diversity in needs and perspectives.
Our cohort consisted of fifteen researchers, mostly early career, from these two research centres. The aim was to test and adapt open science practices in different research contexts, promoting broad adoption across the institution after the project’s conclusion. The pilot focused on four main categories: data, software, publications, and engagement.
The interventions related to policy development, human resource allocation, repository selection, awareness raising, and training workshops.
Some of the key actions we implemented over the eighteen months of the pilot included the following. In the absence of a national open science policy on data, software, and citizen science, NOVA developed and launched the NOVA Open Science Guide to support researchers and ensure alignment with EU standards. This was coordinated by my colleague, Isabel Hoch, Vice Rector for Research and Innovation.
Regarding human resources, library staff were designated as open science reference points, guiding researchers and delivering training and awareness sessions. This demonstrated that internal staff can effectively support open science practices.
NOVA also defined specific repositories for different research outputs: Zenodo for depositing research data, GitHub for managing and sharing software, and RUN, NOVA’s institutional repository, for open access publications, which is integrated with the national repository portal. PURE, our CRIS system, is used for internal monitoring so we can track open science research outputs.
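A minimal sketch of this routing, assuming one wanted to encode it in an internal tool, might look like the following. The mapping reflects the text; the function and its naming are our own illustration, not NOVA’s systems.

```python
# Output-to-repository routing as described above, as a lookup table.
REPOSITORY_BY_OUTPUT = {
    "dataset": "Zenodo",
    "software": "GitHub",
    "publication": "RUN",  # NOVA's institutional repository
}

def route_output(output_type: str) -> str:
    """Return the designated deposit repository for an output type.

    PURE (the CRIS) is deliberately absent: it is not a deposit target,
    but records all outputs for internal monitoring.
    """
    try:
        return REPOSITORY_BY_OUTPUT[output_type]
    except KeyError:
        raise ValueError(f"No designated repository for {output_type!r}")

print(route_output("dataset"))  # Zenodo
```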
To promote awareness and training, we organised awareness sessions focused on institutional expectations and open science values, as well as seven practical workshops on topics such as open science publication, data management, software sharing, citizen science, and the use of PURE.
We monitored several open science indicators through the OPUS pilot, comparing the starting point in January 2024 with the results achieved by June 2025. Regarding openly available data sets, the Global Health and Tropical Medicine Research Centre began with none and reached the target of three data sets. The Information Management Research Centre also started with none and managed to publish one data set, which, although short of the target of three, reflects their more software-driven focus.
For software outputs, the Information Management Research Centre achieved their full target of three software sets, demonstrating strong engagement with FAIR and open source practices. In terms of open access publications, both research centres started with 68% of their publications openly available, and the cohort successfully reached 100% open access publications.
Regarding citizen science, no new citizen science projects were launched during the project, but we did produce three materials and conduct eight citizen engagement activities. This shows partial progress and highlights the challenge of launching fully fledged citizen science projects within a short timeframe, particularly with limited resources and early career researchers. Overall, these results demonstrate NOVA’s progress in key open science areas, especially open access and software, and provide insight into where further institutional support and alignment are needed.
During this journey, we faced several challenges, which provided valuable insights. The first challenge was the delay in national policies for open science. To address this, NOVA produced an internal open science guide, which was essential in supporting researchers in the absence of a national framework.
Another challenge was the lack of dedicated funding and human resources for open science implementation. Library staff served as our open science reference points and did an excellent job, but more support is needed. Retaining early career researchers was also difficult due to contract instability; we learned that sustainable participation requires more stable career pathways and greater institutional support for early career researchers.
Limited awareness of institutional practices among researchers was another challenge. We believe that introducing awareness campaigns is key to embedding open science as a shared institutional value.
In the second phase of implementation, around the ninth month, we experienced low engagement in training due to academic overload, as it coincided with student evaluations and project grant applications. We learned that training should be aligned with the academic calendar and should also offer asynchronous options, such as short videos available online at any time.
Across our cohort, we identified inconsistent use of repositories. Standardising repositories, such as Zenodo and GitHub, would improve monitoring and visibility of open science practices. The absence of a clear repository for citizen science activities was another challenge, indicating that repository solutions must be planned early and supported by policy and infrastructure guidance.
At NOVA, we chose to reward researchers participating in the pilot by considering open science practices in their career progression. However, open science contributions are not yet fully integrated into researchers’ career evaluations. To overcome this, we believe that including open science indicators in the evaluation framework could promote the adoption of open science practices.
University of Cyprus (UCY) – Pilot Action
Panagiotis Moiras, MSc, BA (Hons), MCIM, Senior Research and Project Management Specialist at the University of Cyprus (UCY), presented the Cyprus pilot’s focus on integrating the RAF into technical practices. Moiras, who serves as Director of the Research and Innovation Support Service (RISS), brings more than 15 years of professional experience in project management and communications consultation.

The Cyprus pilot, centred on the KIOS Centre of Excellence, focused on integrating the RAF into practices around publication drafting, data management, and software development. The university has played a central role in promoting Open Science principles both institutionally and nationally, having become the first institution in Cyprus to sign the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities in 2008, and it has been instrumental in shaping national policy, contributing to the approval of Cyprus’s open science frameworks in 2016 and 2022.
Panagiotis Moiras, UCY: Theoretical reforms were put into practice, tested and refined.
The pilots were the operational backbone of the OPUS project, where theoretical reforms were put into practice, tested and refined. Their success was critical to ensuring that the broader goals of aligning research assessment are met.
This succinctly reflects our journey at the University of Cyprus over the past two years as one of the designated pilot institutions. As a mid-sized, research-intensive university, the University of Cyprus faced both the challenge and the opportunity of translating aspirational reform into functional change.
Our efforts were implemented through the KIOS Research and Innovation Centre of Excellence, the largest information and communication technology research centre in Cyprus.
Turning to the core of our pilot actions, we undertook a multidimensional pilot as part of OPUS. The pilot was not only an opportunity to contribute to European research assessment reform but also to test internally how open science principles could be gradually institutionalised in a research-intensive environment. Our focus was on developing a contextualised, evidence-based, and action-oriented plan aligned with the University’s broader research strategy and the objectives of OPUS.
At the heart of the University of Cyprus pilot was a defined cohort of twelve early-career KIOS researchers. The steps we followed illustrate a structured and collaborative approach to preparing for implementation, ensuring that key roles, criteria, and internal consensus were established early.
The ambition was to create a microcosm of institutional change. The pilot was based on the OPUS Research Assessment Framework. Of the four dimensions, our pilot involved the research category, and included actions relevant to all five assessment categories: policy, resources, repository infrastructure, awareness raising, and training. Specifically, we focused on the subcategories of data, software, and publications, and their corresponding indicator groups.
Policy interventions began with a kickoff meeting of the KIOS Open Science Committee to establish a shared vision and commitment. A significant milestone was the identification and commitment of the twelve early-career researchers forming our core pilot group. We reviewed and identified existing institutional procedures that required amendment to better align with open science principles and to facilitate the pilot’s implementation. To reinforce institutional backing, we held a key meeting with the University’s leadership team, securing their support for the pilot.
An important initiative was the adaptation of the Personal Career Development Plan (PCDP) forms to include explicit references to open science practices. In parallel, the annual Open Science Award at KIOS was restructured to better reflect meaningful open science contributions by incorporating open science indicators.
Resource interventions included the introduction of a new, voluntary role: the OPUS Champions, now called Open Science Champions. We trained two early-career researchers to act as focal points and ambassadors for open science within and beyond the organisation, providing guidance and support. We also assessed and allocated the necessary resources: time, tools, and institutional support. The Open Science Champions underwent a specifically tailored training programme.
Recognising the limitations of available resources, our action plan emphasised strategic planning and ad hoc resource allocation. This included identifying additional funding sources, submitting synergistic proposals, and developing low-cost, high-impact initiatives as proof-of-concept interventions.
Repository interventions focused on building bridges between existing structures. We established a communication channel between our institutional repository, GNOSIS, and those already used by KIOS. Specific actions aimed to improve the visibility and usability of current repositories and to align metadata practices with FAIR principles.
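What aligning metadata with FAIR principles can look like in practice is easiest to see in a concrete record. The sketch below checks a dataset record against the mandatory properties of the DataCite metadata kernel, a common interoperability target for repositories; the values and the validation helper are our assumptions, not actual KIOS or GNOSIS practice.

```python
# Illustrative only: a dataset record carrying the mandatory properties
# of the DataCite metadata kernel. All values are invented.
record = {
    "identifier": {"identifierType": "DOI", "value": "10.1234/example.5678"},
    "creators": [{"name": "Doe, Jane", "affiliation": "University of Cyprus"}],
    "titles": ["Sensor readings for fault-detection experiments"],
    "publisher": "GNOSIS",
    "publicationYear": 2025,
    "resourceTypeGeneral": "Dataset",
}

MANDATORY = ["identifier", "creators", "titles",
             "publisher", "publicationYear", "resourceTypeGeneral"]

def missing_mandatory(rec: dict) -> list[str]:
    """List mandatory DataCite properties absent or empty in a record."""
    return [field for field in MANDATORY if not rec.get(field)]

assert missing_mandatory(record) == []  # this record passes the check
```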
Awareness-raising interventions included several activities to foster broader institutional engagement. We launched an introductory seminar on open science to kick off the pilot, followed by targeted communications such as newsletters and social media campaigns to make open science principles more visible and accessible. We participated in thematic events, such as Open Science Week and Open Science Data Day, which also helped demystify open science for non-research staff and administrative stakeholders.
Training and capacity building involved a comprehensive set of activities: an introductory seminar, targeted training on tools and repositories, and a workshop for the Open Science Champions. This “train the trainer” intervention focused on strengthening communication and soft skills through interactive role-play scenarios. The training prepared participants to navigate diverse perspectives within the academic community. Our final training focused on developing an open science narrative CV. We also encouraged participation in other relevant training organised by stakeholders in the ecosystem, such as the UFA Open Science programme and online seminars and workshops held from November 2024 to January 2025.
Metrics and Outcomes
Marius Kyriakou, Early Career Researcher, KIOS Centre of Excellence
Thank you for the opportunity to join this project and present our results. For the publication drafting indicator group, we set initial targets for months 6, 12, and 18, and achieved our goals. For the data management indicator, we met the target for the number of downloads and came close to the targets for other metrics. For the software development indicator, we greatly exceeded the targets for two of the four metrics. Researchers became more familiar with open science tools, and we saw more active engagement in open practices. Overall, this was a valuable experience for us at the KIOS Centre of Excellence.
Challenges and Lessons Learned
As with any reform effort embedded in a working institution, we encountered a range of technical, cultural, and structural barriers. Resistance to policy change was one of the most notable challenges. While many approached the interventions with interest, others raised concerns. It became clear that, beyond formal policies, real progress requires open dialogue, relatable examples, and a foundation of trust. Reform is most successful when it involves the academic community from the outset; bottom-up participation and recognition of researchers’ concerns can greatly smooth transitions.
Our pilot highlighted the need for more structured mechanisms to collect and monitor relevant indicators. While there is increasing emphasis on metrics, existing infrastructure and internal policies do not yet fully support systematic tracking. The lesson learned is that metrics are powerful tools, but only if supported by clear policies, trained personnel, and digital systems that make their collection and interpretation routine and meaningful.
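To illustrate the kind of routine collection and interpretation the pilot found lacking, here is a minimal sketch of an indicator register with per-checkpoint targets; the month-6/12/18 checkpoints mirror the targets reported above, while the indicator names and all numbers are hypothetical.

```python
# Minimal sketch of systematic indicator tracking. Checkpoints mirror the
# month-6/12/18 targets reported above; all numbers are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Checkpoint:
    month: int
    target: float
    achieved: Optional[float] = None  # filled in as monitoring data arrives

    def met(self) -> bool:
        return self.achieved is not None and self.achieved >= self.target

indicators = {
    "publication drafting": [
        Checkpoint(6, 5, 6), Checkpoint(12, 10, 11), Checkpoint(18, 15, 15),
    ],
    "data management (downloads)": [
        Checkpoint(6, 100, 90), Checkpoint(12, 250, 260), Checkpoint(18, 400),
    ],
}

for name, checkpoints in indicators.items():
    summary = ", ".join(
        f"M{c.month}: {'met' if c.met() else 'pending/missed'}" for c in checkpoints
    )
    print(f"{name}: {summary}")
```

Even a register this simple makes gaps visible: an empty `achieved` field at month 18 is itself a signal that the collection process, not just the metric, needs attention.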
As with many pioneering efforts, our pilot unfolded in a landscape of evolving capacity and support. For open science to move beyond rhetoric, institutions must strategically plan for human and financial resources, including lobbying for national-level support and seeking synergistic projects.
Main Achievements
Despite these challenges, our pilot at the University of Cyprus produced a substantive set of achievements:
- Establishing a governance model: The KIOS open science function proved highly effective and can be replicated across departments and institutions.
- Institutional learning: We observed increased institutional literacy around open science within our academic community.
- Changes in research evaluation: By embedding open science criteria into PCDP forms and reinforcing open science with specific tangible and intangible awards, we fostered a process that rewards transparency and collaboration.
- Progress towards CoARA alignment: Our pilot positioned the University to align with CoARA principles, including a shift from proxy metrics to value-based assessments.
- Building momentum for cultural change: With the Open Science Champions leading the initiative, we created a new cohort of early-career researchers who are now both participants and advocates of open science. Their increased visibility, leadership, and reputational standing serve as powerful motivators and drivers of cultural change.
The OPUS pilot at the University of Cyprus was not just about testing tools or methodologies; it was about surfacing assumptions and revealing where institutional inertia lies and where innovation can thrive.
Looking ahead, we have identified five milestones for our ongoing journey:
- Institutionalisation: Embed successful interventions into permanent structures.
- National engagement: Engage with national agencies and funders to align reward systems.
- Broaden participation: Scale these practices to other departments within the university.
- Invest in capacity: Improve infrastructure, develop tools, and assess human capital requirements.
- Learning from others: Continue engaging with other OPUS pilots, building on synergies and mutual learning.
At the University of Cyprus, we are proud to have contributed to this transformation and to the OPUS project, and we look forward to the future with both pride and anticipation.
Beyond the Project Pilots: PLOCAN Action
Corina Moya-Falcón also presented PLOCAN’s action, which extended beyond the standard pilot framework: as project coordinator, she demonstrated how PLOCAN embedded Open Science principles directly into its core research infrastructure, marine data management, and software development for cutting-edge marine research.
Corina Moya-Falcón, PLOCAN: We have included open science practices in our strategic plan
During the development of this project, PLOCAN, as both a partner and coordinator, has learned a great deal and has also applied several interventions. I will share some of these interventions and how we have implemented them.
For those unfamiliar with PLOCAN, it is a research infrastructure located in the Canary Islands, devoted to marine observation. We offer various services to universities and other research organisations and are involved in numerous R&D projects. Our team consists of around seventy people, mainly research managers and technicians, many of whom wear two hats as both researchers and research managers, as Rinske mentioned earlier.
PLOCAN actively incorporates open science principles into its organisation, with a particular focus on marine data management and software development for innovative marine research solutions. We have applied interventions at different levels: policy alignment, awareness and training, and repository interventions.
To highlight some actions in terms of policy intervention: we have included open science practices in our strategic plan, a crucial step taken in alignment with PLOCAN’s senior management. This alignment is essential for applying these principles, particularly regarding data management and quality. We have developed protocols for data management, as we work with a significant amount of sensor data that is valuable for researchers worldwide. It is important to provide this data in accordance with FAIR principles so it can be widely used.
In terms of technological infrastructure, we have trained our staff to use cloud platforms for uploading marine data, which is vital for our research. Regarding data usability and accessibility, we have developed interactive visualisation tools to facilitate data access and analysis.
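As a toy illustration of the visualisation idea (far simpler than an interactive platform, but showing the underlying step), the sketch below plots a timestamped sensor series; the file name and column names are hypothetical.

```python
# Toy sketch: plot a timestamped marine sensor series so the data can be
# inspected visually. The file and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("sea_temperature.csv", parse_dates=["timestamp"])

fig, ax = plt.subplots(figsize=(8, 3))
ax.plot(df["timestamp"], df["temperature_c"], linewidth=0.8)
ax.set_xlabel("Time")
ax.set_ylabel("Sea temperature (°C)")
ax.set_title("Example sensor time series")
fig.tight_layout()
plt.show()
```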
On the innovation and community engagement front, we have developed open source software and shared it with the scientific community. We have also organised open science and training workshops for our staff during the execution of this project.
We have faced several challenges and learned important lessons. First, institutional commitment is a key driver of change. Continuous training for our team members is also essential. The development of interactive tools has improved accessibility to our marine data. We have learned a great deal by observing how other pilots have implemented interventions in their organisations, and we have realised that aligning with European initiatives is also very important for our institution.
Q&A Session
Maya Sundukova (Marie Curie Alumni Association):
How did the pilot organisations choose their indicators and interventions? Did you base your choices on your own audit or on knowledge of your community? Each organisation has its own priorities and values, so how did you decide, and what would you do differently now?
Alina Irimia (Executive Agency for Higher Education, Research, Development and Innovation Funding, Romania):
As I mentioned earlier, the timing was fortunate because we had to implement a national strategy that included objectives such as FAIR data, open access, and promoting citizen science. We chose our indicators based on these priorities, following the principle of “as open as possible, as closed as necessary” for data. We wanted to align research data management with FAIR principles and provide support on what to open and what not to open. In Romania, we do not have a national open access mandate, so we are taking a step-by-step approach, starting with DMPs and gradually implementing the national strategy.
Julija Baniukevic (Research Council of Lithuania):
We based our choices on the renewed national policy on open science, which was updated last September. Our actions were aligned with the national open science agenda.
Gareth O’Neill (TGB):
It is important to remember the limitations of the project. OPUS lasted three years, with the first year devoted to developing the framework. The pilots then had to select indicators, decide on relevant interventions, and embed these in their institutions. They also had to recruit cohorts of researchers to participate voluntarily, as participation could not be mandated. All of this had to be implemented and monitored within about 18 months, so choices were inevitably shaped by what was feasible within the project’s structure. The more indicators and topics you select, the more infrastructure and support are required. Achieving this within the timeframe, even with existing infrastructure, was a significant challenge.
Reda Cimmperman (Research Council of Lithuania):
As Julija mentioned, changes in regulation and personnel meant that we sometimes had to change our indicators during the project. Despite the short timeframe, a great deal was accomplished. Thank you to everyone who contributed and shared expertise.
James Morris (Science Europe):
What strikes me across all the pilot presentations is the underlying emphasis on training, guidance, and community building. Can these actions be repurposed or adapted for future interventions? Will it be easier to undertake further actions towards open science based on your experience?
Nataša Jakominić Marot (University of Rijeka):
Theoretically, yes, but we need resources and institutional alignment. We would need to involve more units and dedicate more resources to scale up these actions.
Isabel Hoch (NOVA University):
We share the same perspective. As we await national guidelines and implement our open science policy, we believe these practices can be institutionalised across all research units.
Panagiotis Moiras (University of Cyprus):
Previously, open science was perhaps seen as a buzzword by many in our academic community. Through specific actions in the project, it became more visible and tangible. These interventions must be repurposed and replicated, as change is gradual and requires repeated efforts.
Alina Irimia (UEFISCDI, Romania):
Our OPUS experience now extends to other funding programmes. Projects like OPUS provide evidence that supports implementation. For open science, we need this kind of evidence to show what works and to engage with other funders, exchanging expertise to impact the national level.
Lessons learned on how to better support Open Science policies at the institutional level: the OPUS Policy Brief
Supporting Open Science at Institutional Level
Juliana Chaves Chaparro, Senior Consultant at UNESCO’s Management of Social Transformations Programme (MOST), presented the OPUS Policy Brief on lessons learned for supporting Open Science policies at the institutional level. Dr Chaves Chaparro brings nearly 25 years of experience working at the science-policy-society interface, particularly in the Global South, with an advanced degree in Environmental Sciences and an ongoing PhD in Sociology.
Her presentation emphasised the importance of aligning policies across local, institutional, national, and European levels whilst integrating gender equality and diversity measures into Open Science initiatives. The policy brief concluded with recommendations for transforming barriers into policy actions, including developing coherent policies, securing resources, and supporting early career researchers and diversity.
Juliana Chaves Chaparro (UNESCO): Recommendations for transforming barriers into policy actions
The 2021 UNESCO Recommendation on Open Science extends the vision of equitable and accessible science. In this context, the OPUS Policy Brief focuses on how to transition to open science at the institutional level, drawing on lessons from five OPUS case studies. The brief addresses cultural, structural, and policy-related challenges, identifying key barriers and offering recommendations for local, institutional, national, and global levels.
Key Barriers and Challenges
- Fragmented Policies and Teams: Institutional change is often hindered by fragmented policies and lack of coordination between teams. Some pilots faced limitations due to underdeveloped national policies or restricted mandates from ministries.
- Resource Constraints: Limited resources, both financial and human, impede the effective implementation of open science.
- Resistance to Change: Both conscious and unconscious resistance persists, rooted in traditional assessment criteria and status quo biases.
- Precarity and Retention: High mobility, job insecurity, and low retention—especially among early career researchers and women—undermine long-term commitment to open science.
- Recognition and Incentives: Open science contributions are often undervalued in career progression, leading to increased unpaid workload and limited motivation.
- Technical and Capacity Gaps: Inconsistent technical expertise and limited capacity in social sciences and humanities restrict engagement with broader society.
- Cultural and Structural Inequities: Disparities in access to resources and support, along with gender and diversity gaps, persist across institutions.
Good Practices and Lessons from OPUS
- Participatory and Inclusive Processes: Early and active involvement of researchers, managers, and librarians in policy design and capacity building is crucial.
- Diversity and Representation: Efforts to include a mix of career stages, backgrounds, genders, and underrepresented groups enhance policy relevance and effectiveness.
- Leadership and Institutional Commitment: Visible support from top management, allocation of resources, and recognition of participants’ efforts are essential.
- Ambassadors and Champions: Initiatives such as open science ambassadors or champions, as seen at the University of Cyprus, foster community engagement and advocacy.
- Interim Policies: Where national frameworks are lacking, interim institutional policies help sustain open science practices and commitment.
- International Exchange: Cross-country collaboration and knowledge exchange motivate and reward participants, enriching the implementation process.
- Integration into Assessment: Embedding open science in research assessment frameworks, alongside capacity building and infrastructure, supports institutional change.
Remaining Challenges
- Partial Understanding of Open Science: Open science is often limited to open access and a few additional activities, rather than a holistic approach as recommended by UNESCO.
- Short Project Duration: The three-year timeframe of OPUS limited the scope for deeper transformation.
- Policy Coherence: Lack of alignment between national and institutional policies makes it difficult to sustain progress.
- Risk of Losing Momentum: Without coherent mandates and support, researchers may lose interest, jeopardising long-term change.
- Gender and Diversity Issues: Persistent gender bias, lack of disaggregated data, and insufficient support for caregivers and underrepresented groups remain significant barriers.
Policy Recommendations
- Develop and Align Policies: Ensure coherence across local, institutional, national, and European/global levels.
- Secure Resources: Allocate dedicated funding and staff to support open science implementation.
- Embed Open Science in Assessment: Integrate open science indicators into researcher evaluation and career progression frameworks.
- Build Capacity and Awareness: Invest in sustained training, infrastructure, and community engagement.
- Support Early Career Researchers and Diversity: Provide targeted support for early career researchers, caregivers, and underrepresented groups.
- Promote Inclusivity: Mainstream gender equality and diversity in open science strategies and practices.
- Centralise Infrastructure: Develop centralised, accessible infrastructure for data management, repositories, and monitoring.
- Foster International Collaboration: Encourage participation in open science working groups and international knowledge exchange.
Sustainable transformation towards open science requires comprehensive, coordinated action across all levels. Piecemeal or selective adoption of open science practices is insufficient for real change. Advancing together—integrating gender equality, diversity, and inclusivity—will foster greater trust in science and broader societal engagement. Institutions are encouraged to participate in open calls for action, contribute to national monitoring systems, and engage with ongoing working groups on financing, incentives, policy, and strategy.
Research Assessment and Open Science: Reflections from the OPUS Panel
The Tension at the Heart of Research Assessment
The OPUS Final Conference’s panel on “Research Assessment and Open Science” brought together a formidable line-up of policy architects, funders, academics, and early-career advocates to tackle the persistent disconnect between how research is assessed and how it ought to be done in the 21st century. While Open Science principles are gaining traction in policy, the reality remains fraught with resistance, inertia, and a striking misalignment between ideals and incentives.
Misalignment at the Core: Ana Persic’s Perspective
Ana Persic (UNESCO) opened with a candid diagnosis: the current incentive system is fundamentally out of sync with Open Science. Drawing from her experience with the 2021 UNESCO Recommendation on Open Science, Persic highlighted that while the values of transparency, equity, inclusiveness, and collective benefit are now enshrined in global policy, the rewards and awards systems lag behind. For many institutions, operationalising these values means overturning deeply entrenched practices—a daunting prospect.
She pointed out that while Europe, under the CoARA initiative, is making strides, many regions have yet to start this conversation. The Recommendation itself is the product of years of dialogue with thousands of stakeholders, underscoring the universality of these values, but also the scale of the challenge in putting them into practice.
Integration, Not Isolation: James Morris on the Funders’ View
James Morris (Science Europe) offered a pragmatic take from the perspective of research funders. For Morris, the key challenge is integration: Open Science cannot be bolted on as an afterthought. It must be woven into career progression, project assessment, and the broader fabric of research culture.
He shared findings from a recent survey of Science Europe’s members: while open access and data sharing are well-supported, newer elements like open peer review and citizen science remain largely aspirational, appearing in strategy documents but lacking operational support. Morris illustrated the difference between compliance and genuine culture change with a simple question: if a researcher continues to write data management plans after moving to an institution where they are not required, has the culture changed? If not, the system is still stuck in a compliance mindset.
Equity and Access: Louise Bezuidenhout’s Challenge
Louise Bezuidenhout, representing Leiden University and the UNESCO Chair on Diversity and Inclusion in Global Science, shifted the focus to equity. She argued that openness is not synonymous with accessibility or fairness. Years of research have shown that current research assessment infrastructures often exclude scholars from low and middle-income countries due to language barriers, digital divides, and irrelevant metrics.
Bezuidenhout dissected the UNESCO Recommendation’s four core values, noting that only transparency aligns with traditional academic norms. The others (equity, collective benefit, and inclusiveness) demand a shift from individual achievement to collective responsibility. This cultural transformation, she argued, is at the heart of the difficulty in reforming research assessment.
Quality and Societal Relevance: René von Schomberg’s Systemic Critique
René von Schomberg (RWTH Aachen University) delivered a robust critique of current assessment systems. He argued that the prevailing reward structures undermine both the quality of science and its relevance to society. Citing a former Nature editor’s claim that 60% of articles in the journal have irreproducible data, he illustrated the consequences of prioritising competition and output over reproducibility and societal value.
Von Schomberg was particularly critical of the gold open access model, which he said incentivises quantity over quality and has fuelled the rise of predatory publishing. His solution? Rethink funding structures entirely: shift from lump-sum allocations to mission-based funding, and assess researchers by their contribution to societal missions, not just publications or citations.
The Voice of Early Career Researchers: Pil Maria Saugmann
Pil Maria Saugmann (Eurodoc) spoke passionately about the realities faced by early and mid-career researchers. She reminded the audience that Europe’s research and higher education system has a democratic mission, yet the changes OPUS has achieved in pilot institutions remain fragile, vulnerable to shifts in leadership and policy.
Saugmann called for deep structural change, not just policy tweaks. She emphasised the need for reform to be visible in job calls, hiring, and funding processes, the very mechanisms that shape academic careers. Precarity remains a central issue, as most assessment happens at the early and mid-career stage, where the stakes are highest. She also highlighted the persistent exclusion of early career researchers from academic governance, despite significant changes in the researcher landscape over the past quarter-century.
Trust, Economics, and the Cost of Inertia
Other voices in the discussion, such as Pierre Winicki from TrustInside and Sasa Zelenika from the University of Rijeka, broadened the debate. Winicki introduced the concept of a “tree of trust,” arguing that much resistance to Open Science is rooted in a culture of distrust, manifested in competition over collaboration and a lack of shared rules. He challenged the community to quantify the cost of distrust in science, suggesting that only then might leaders be compelled to act.
Zelenika underscored the economic inertia in the system, pointing out that major publishers generate billions in revenue while researchers do the work. He was sceptical that short-term, project-based reforms could deliver lasting change, arguing that meaningful transformation requires sustained investment and commitment.
A Call for Structural Courage
The panel concluded with a clear message: The current research assessment system is not just outdated, but an active barrier to Open Science. The challenge is not a lack of ideas or values, but a lack of structural alignment, inclusive governance, and institutional courage.
Systemic problems such as exclusionary infrastructures, entrenched metrics, and misaligned incentives cannot be solved by simply adding Open Science to policy checklists. True progress will require coordinated, structural transformation: new ways of funding, assessing, and recognising research.
The OPUS project has proven that change is possible, but as the panellists reminded us, pilots and principles are not enough. What’s needed now is leadership, trust, and the collective will of the research community to stop waiting for permission and start shaping the future.
See the full report from the panel session here.
The OPUS Project: A Journey to Transform Research Assessment
After a day filled with presentations, panel discussions, and collaborative dialogue, three distinguished speakers took to the stage to reflect on what had been accomplished and what lay ahead for the future of research assessment.
A Foundation Built on Collaboration
Lamine Diagne, Senior Programme Officer at UNESCO’s Management of Social Transformations Programme (MOST), opened the closing remarks with a sense of both celebration and purpose. Standing before an audience of researchers, policymakers, and innovators from across the globe, he painted a picture of transformation that extended far beyond the conference room.
“Today we are not only celebrating the impressive outcomes of the OPUS project, but also reaffirming our shared commitment to advancing more open and responsible research and innovation,” Diagne declared, his words carrying the weight of years of collaborative effort.
The journey, he explained, had begun in 2017 with the RRING group projects, creating a foundation upon which OPUS had built something remarkable. This wasn’t simply about European achievement—it was about creating a global resource that would ripple outward, touching research institutions and policymakers worldwide.
What made Diagne’s message particularly compelling was his vision of practical impact. “The insights and deliverables from OPUS are not ends in themselves, rather they are stepping stones towards lasting change,” he emphasized, underlining how the project’s frameworks and tools would serve as invaluable resources for policymakers, research organisations, and funders well beyond Europe’s borders.
The Promise of Global Reach
The true power of OPUS, according to Diagne, lay not in its technical achievements alone, but in its potential to transform how science serves society. UNESCO’s commitment to disseminating the project’s results among delegations, partners, and policymakers, including at the Executive Board level, represented a commitment to scaling impact across continents.
“We aim to foster more pluralistic approaches, create new incentives for researchers worldwide and invent more societally inclusive scientific practices,” Diagne explained, connecting the project’s work to UNESCO’s broader mission of ensuring science serves all people, especially the most vulnerable groups in society.
His words carried particular resonance as he linked OPUS to UNESCO’s foundational recommendations on science and artificial intelligence ethics, demonstrating how the project aligned with broader efforts to build public trust in science and promote gender equality in research.
Bridging Policy and Practice
Rinske Van Den Berg, REA Project Policy Officer, brought a practitioner’s perspective to the closing session. Her brief but pointed remarks highlighted the practical challenges of translating conference insights into policy action.
“I’ve learned that dialogue and the connection between Member States and non-member States, or, let’s say, the whole world from UNESCO’s perspective, remains important,” Van Den Berg observed, acknowledging the complex web of relationships that must be navigated to achieve meaningful change.
Her emphasis on taking feedback “on paper to Brussels” for upcoming reviews with partners reflected the meticulous work of policy implementation—the unglamorous but essential task of ensuring that innovative ideas find their way into institutional frameworks and funding mechanisms.
Celebrating Achievement, Embracing Responsibility
Corina Moya-Falcón, OPUS Coordinator at PLOCAN, brought the day full circle with her comprehensive reflection on what had been accomplished. Her words painted a picture of rich dialogue and substantive progress—presentations of research assessment frameworks, experiences from pilot organisations, and panel discussions that had addressed both challenges and opportunities in research assessment within the context of open science.
“Your commitment and dedication have been fundamental to the success of this event,” Moya-Falcón told the assembled participants, her gratitude extending beyond mere politeness to recognition of the collaborative spirit that had made the project possible.
But perhaps her most important contribution was her acknowledgment of what the day represented: not an ending, but a beginning. “Thank you all for your participation and your commitment to transforming research assessment and open science,” she concluded, framing the work ahead as a shared responsibility.
The Road Ahead: Transforming European Research Assessment
The OPUS Final Conference represented more than a project conclusion; it marked the beginning of a new era in research assessment that prioritises Open Science practices, equity, and societal impact. The comprehensive framework developed through the project, tested across diverse institutional contexts, and refined through extensive stakeholder engagement provides a robust foundation for continued reform.
The conference highlighted how the OPUS Researcher Assessment Framework offers a practical, scalable model that helps reshape how we assess, support, and reward researchers in the 21st century. By addressing both research and non-research activities, combining quantitative and qualitative approaches, and providing flexibility for institutional adaptation, the RAF represents a significant advancement in research evaluation methodology.
As participants reflected on the journey from traditional metrics to comprehensive Open Science assessment, the conference demonstrated that meaningful reform requires sustained collaboration between researchers, policymakers, institutions, and funding organisations. The OPUS project’s legacy lies not only in the tools and frameworks developed but in the community of practice established to continue advancing Open Science principles across European research ecosystems.
The Paris conference thus served as both celebration of achievements and launching point for the next phase of research assessment reform, ensuring that Open Science becomes not merely an aspiration but the foundation of how research excellence is recognised and rewarded across Europe and beyond.