Research data management (RDM) is now established at many scientific institutions in Germany. It takes a variety of forms, most often a central point of contact such as an RDM support unit. In addition, there are various data services, for example through the institution’s own data centre or the local IT department. What happens next?
Evaluation, not Evolution
Instead of evolutionary development in small steps and iterations, an RDM evaluation offers a way to take a strategic look at research data management in one’s own institution. A number of models for such an assessment now exist. But for whom is this approach appropriate?
An assessment of an institution’s RDM is particularly useful when the institution wants to look at its domains, applications and service areas as a whole. A holistic view of the various offerings also reveals which aspects have not yet been considered, or have been considered only to a limited extent. Institutions also use such an RDM assessment to develop their own research data policies. In this way, establishing the status quo, developing future perspectives and setting normative frameworks are combined in a single process, which benefits both the understanding of the current situation and strategic development.
Of course, such an evaluation also has disadvantages. It is a linear process involving many actors and institutions, which can tie up considerable resources; the process simply takes time. It is also important to bear in mind that existing rivalries or disputes over responsibilities can be exacerbated by such a process. At the same time, it formalises knowledge and structures. Especially in the dynamic environment of research data and software, it can be an advantage to keep structures flexible in order to react quickly to new developments and needs. A long-term evaluation process that rigidifies parts of those structures can then be more of a hindrance than a help.
An internal RDM evaluation therefore has both advantages and disadvantages. The decision for or against such a procedure cannot be made in a blanket manner; it must be weighed case by case, together with the colleagues concerned on site. We as the RDM Support of the Max Planck Digital Library are happy to assist you in this, but we cannot make the decision for you. You can, however, benefit from examples within the MPG. One example of an RDM evaluation within the Max Planck Society is the Max Planck Institute for the History of Science, where the evaluation was carried out to improve the institute’s own data services.((Steffen Hennicke, ‘Strategieentwicklung für Institutionelles Forschungsdatenmanagement – Erfahrungsbericht zum Einsatz von RISE-DE’, 5. FDM-Workshop for the Max Planck Society, 2022, https://hdl.handle.net/21.11116/0000-000B-2B52-9.))
Which RDM evaluation models are available?
There are different types of evaluation models with different objectives, so it makes sense to choose the approach that matches your own goals. Three models are discussed below.
RISE-DE
The RISE-DE model((Niklas K. Hartmann, Boris Jacob, and Nadin Weiß, ‘RISE-DE – Referenzmodell für Strategieprozesse im institutionellen Forschungsdatenmanagement’, 2019, https://doi.org/10.5281/zenodo.2549343.)) is an adaptation of RISE v1.1 to the German science system. The model emphasises participatory processes and is therefore particularly suitable when the RDM evaluation is part of a journey towards an institutional policy. It is very helpful for gaining an overview of the status quo and identifying open issues. These as-is/to-be (IST/SOLL) analyses support the involvement of actors and stakeholders in the RDM process. Finally, the model provides many guidance documents and materials for developing and establishing RDM in one’s own institution.
UpDateFDM
Recently, the Computer and Media Service (CMS) of Humboldt University Berlin published a maturity model for evaluating institutional RDM.((Anna Lehmann, Malte Dreyer, Carolin Odebrecht, and Kerstin Helbig, ‘UpDateFDM – Evaluierung von Forschungsdatenservices und -infrastrukturen’, in: b.i.t. online 26 (4), 2023, pp. 332–41, https://www.b-i-t-online.de/heft/2023-04-fachbeitrag-lehmann.pdf.)) The model is aimed specifically at service-providing higher education institutions in German-speaking countries. It understands institutional RDM as the sum of various processes and is consistently oriented towards the requirements of IT service management. The evaluation model shows how research data management services can be selected and evaluated by service providers such as computing and media centres or university libraries.
DIAMANT
The DIAMANT model was developed at the University of Trier.((Lea Gerhards, Marina Lemaire, Stefan Kellendonk, and André Förster, ‘Das DIAMANT-Modell 2.0: Modellierung des FDM-Referenzprozesses und Empfehlungen für die Implementierung einer Institutionellen FDM-Servicelandschaft’, 2020, https://doi.org/10.25353/UBTR-XXXX-F5D2-FFFB. See also Marina Lemaire, ‘Wie komme ich zu einer Forschungsdatenmanagement-Strategie? Eine Antwort gibt das DIAMANT-Modell’, 2021, https://doi.org/10.5281/zenodo.5498068.)) Like RISE-DE, the DIAMANT model is built around an RDM reference process. This process provides for the establishment of a central RDM control unit – e.g. a central RDM support – which steers the flow of information between all RDM actors. In addition, the current status and future goals can be derived from an RDM competence matrix.
Conclusion
As has been shown, there are a number of ways to evaluate one’s own RDM within an institution, and there are arguments both for and against such a process. It is not always helpful, but it can often provide useful support. This simply needs to be weighed on a case-by-case basis.
We as MPDL RDM Support are happy to assist you as Max Planck colleagues. Please do not hesitate to contact us via rdm [at] mpdl [dot] mpg [dot] de.