“The use of journal impact factors and other metric indicators of research productivity, such as the h-index, has been heavily criticised for being invalid for the assessment of individual researchers and for fueling a detrimental “publish or perish” culture. Multiple initiatives call for developing alternatives to existing metrics that better reflect quality (instead of quantity) in research assessment.” (https://doi.org/10.23668/psycharchives.8162)

A number of themes run through the recommendations of initiatives such as the Declaration on Research Assessment (DORA, https://sfdora.org/) and the Coalition for Advancing Research Assessment (CoARA, https://coara.eu/):

• the need to eliminate the use of journal-based metrics, such as Journal Impact Factors, in funding, appointment, and promotion considerations;
• the need to assess research on its own merits rather than on the basis of the journal in which the research is published; and
• the need to recognise the diverse outputs, practices, and activities that maximise the quality and impact of research.

As of 12 January 2023, 441 organisations (including the European Commission, the League of European Research Universities, the European University Association, Science Europe, Deutsche Forschungsgemeinschaft (DFG) e.V., and the Deutsche Gesellschaft für Psychologie) have signed the Agreement on Reforming Research Assessment. The signatories agree to define an action plan within the year.

With these developments, it is evident that profound changes are coming in how academic achievements are evaluated, both in hiring and in funding decisions. But how can we practically implement and evaluate a reformed research assessment? What will be the consequences for researchers at signing institutions or for those applying to signatory funding agencies? How can we ensure a smooth transition for early-career researchers? And how can we reconcile diverse needs across disciplines?

To address these questions, the LMU Open Science Center organised a symposium with short presentations followed by a panel discussion.

• **Prof Dr Toma Susi (CoARA Steering Board Member, University of Vienna)** presented the Agreement and the Coalition for Advancing Research Assessment (https://coara.eu/agreement/the-agreement-full-text/).
• **Prof Dr Kristin Mitte (Vice President Research & Development, Ernst-Abbe-Hochschule Jena)** would have shared her views on how societal impact and translation of research constitute important evaluative dimensions in the applied sciences, but she had to cancel her participation.
• **Dr Tobias Grimm (Head of Division Life Sciences II, German Research Foundation (DFG) head office)** presented the DFG’s approach to reforming research assessment.
• **Dr Jess Rohmann (Scientific Strategic Advisor in the Scientific Director's Office at the Max Delbrück Center in Berlin)** presented the perspective of early-career researchers and an overview of recent actions taken in selected Berlin institutions (Charité/BIH, Max Delbrück Center).
• **Prof Dr Felix Schönbrodt (Managing Director of the LMU Open Science Center)** presented the Deutsche Gesellschaft für Psychologie’s working-group proposal for practical implementation in the field of psychology in Germany (see https://doi.org/10.23668/psycharchives.8162 and https://doi.org/10.31234/osf.io/5yexm).

The event was chaired by **Dr Heidi Seibold**, Open and Reproducible Data Science Consultant and Associate Member of the LMU Open Science Center.