<h2>Promoting Transparency Practices and Diminishing Journal Impact Factor</h2> <hr> <p>Co-hosts: Health Research Alliance, Laura and John Arnold Foundation, and National Academy of Sciences. Funded by the National Institute on Aging.</p> <p><strong>Overview</strong></p> <p>The purpose of this meeting is to define a strategy and product to meet two seemingly distinct goals: 1) promote the adoption of publishing and grant-making policies that increase transparency, openness, and reproducibility, and 2) reduce the dysfunctional influence of the journal impact factor on assessments of the quality of journals and individual articles.</p> <p>These two goals will be connected by developing and promoting alternative metrics for evaluating journals and funders based on the quality of their policies and practices rather than citation counts of the research they publish or fund. These Transparency Factors will assess journal and funder policies relating to good practices in openness and reproducibility, such as those recently adopted by general outlets like Science and Nature and by high-profile disciplinary outlets such as Psychological Science. This meeting is a follow-up to the Transparency and Openness Promotion meeting held in November 2014, which resulted in the TOP Guidelines (Nosek et al., 2015, Science), and it aligns with other prominent initiatives such as the DORA statement to reduce emphasis on impact factor. </p> <p>TOP defines eight standards that journals, funders, and institutions can adopt to encourage or require open and reproducible research practices. Since publication in 2015, TOP has garnered signatories from about 3,000 journals and dozens of organizations. DORA provides specific recommendations for reducing the influence of impact factor in hiring, promotion, and funding decisions. 
Created in 2012, it has 859 organizational signatories and has prompted, for example, many journals to remove mentions of impact factor from their advertising and promotional material.</p> <p>These and related initiatives have succeeded in raising awareness of dysfunctional incentives that can reduce the credibility of the published literature. There is strong community support for nudging the research culture toward incentives that align scholarly values with scholarly practices. And individual journals and funders have taken steps to implement new policies and practices that could have this positive impact if brought to scale.</p> <p>Nevertheless, the barriers to cultural change are formidable: inertia, the comfortable embrace of the status quo, and the decentralization of science. No one journal, funder, or institution can change the cultural incentives on its own. Solving this coordination problem requires giving all change agents the means, motive, and opportunity to take their own steps toward promoting openness and reproducibility.</p> <p>The purpose of this meeting is to build on these existing efforts and provide that stimulus to action.</p> <p><strong>Meeting Agenda</strong></p> <p>Despite being derided by many (most?) researchers and other stakeholders, the Journal Impact Factor retains significant influence over the behavior of researchers, journal editors, funders, and administrators. That is because it is easy to comprehend, universally applicable, and conveys a heuristic sense of quality or value, and because there are no comparable alternatives. </p> <p>It is easy to decry heuristics in general and citation impact in particular. Heuristics convey little information and are prone to error. But heuristics are also incredibly efficient, providing instant, first-pass insight during information search. More pointedly, humans simply cannot ignore heuristics, particularly in complex ecosystems. 
As a consequence, effective regulation of behavior requires creating the most effective heuristics possible, ones aligned with the community's and individuals' intentions and values. Citation impact has a role to play in the evaluation of scholarship. But to address the dominance of the journal impact factor, the community needs other easy, broadly applicable, heuristic indicators of quality or value. </p> <p>The contributors to this meeting will propose heuristics complementary to the impact factor that represent journal and funder policies and practices promoting openness and reproducibility. TOP offers a framework for developing and communicating such alternative heuristics, and the broad community support for TOP provides an opportunity to gain adoption of a credible, process-based indicator. A Transparency Factor would provide an easily understood mechanism to recognize journals and funders that are leading the community toward more open and reproducible research. An easy-to-understand indicator, combined with an effective coalition to help others understand how to improve their policies, will inspire journals and funders to follow.</p> <p>The goals of the meeting are to:</p> <ol> <li> <p><strong>Define</strong> a Transparency Factor metric based on compliance with TOP principles and possibly other indicators of good practice in scholarly communication for transparency, openness, and reproducibility</p> </li> <li> <p><strong>Build a coalition</strong> of stakeholders that support and promote the Transparency Factor as an alternative to the Impact Factor, one that evaluates the quality of a journal's process rather than its outcomes</p> </li> <li> <p>Define a <strong>marketing and adoption plan</strong> such that the Transparency Factor achieves high visibility and a sufficiently strong incentive to motivate journals, publishers, and funders to act on their expressed support for transparency</p> </li> <li> <p>Outline a community-based <strong>sustainability plan</strong> for continuous, iterative improvement of 
Transparency Factor based on feedback, verification of journal practices, evolving community norms, and applicability across research domains.</p> </li> </ol> <p>Over this two-day meeting, stakeholders from across the research community who are eager to increase openness and reproducibility will convene to achieve these goals.</p> <p><strong>Group Membership</strong></p> <table> <thead> <tr> <th>1 Content</th> <th>2 Scoring</th> <th>3 Valuation</th> <th>4 Sustainability</th> </tr> </thead> <tbody> <tr> <td><strong>Omni Left</strong></td> <td><strong>Omni Right</strong></td> <td><strong>COS Aperi</strong></td> <td><strong>COS Aberto</strong></td> </tr> <tr> <td>Simine Vazire</td> <td>Todd Carpenter</td> <td>Maryrose Franko</td> <td>Mark Parsons</td> </tr> <tr> <td>IJsbrand-Jan Aalbersberg</td> <td>Emilio Bruna</td> <td>Sowmya Swaminathan</td> <td>Solomon Mekonnen</td> </tr> <tr> <td>Eric Eich</td> <td>Stephen Curry</td> <td>Stefano Bertuzzi</td> <td>Tom Appleyard</td> </tr> <tr> <td>Len Freedman</td> <td>Chris Graf</td> <td>Heather Joseph</td> <td>Ginny Barbour</td> </tr> <tr> <td>Sean Grant</td> <td>Veronique Kiermer</td> <td>Roshan Kumar</td> <td>Sarah Brookhart</td> </tr> <tr> <td>Brooks Hanson</td> <td>Carole Lee</td> <td>Marcia McNutt</td> <td>Michael Clarke</td> </tr> <tr> <td>David Mellor</td> <td>Evan Mayo-Wilson</td> <td>Alison Mudditt</td> <td>Kay Dickersin</td> </tr> <tr> <td>David Moher</td> <td>Meredith McPhail</td> <td>Mark Patterson</td> <td>Bianca Kramer</td> </tr> <tr> <td>Belinda Orland</td> <td>Solange Santos</td> <td>Jerry Sheehan</td> <td>Alan Kraut</td> </tr> <tr> <td>Tim Parker</td> <td>Dan Simons</td> <td>Matt Spitzer</td> <td>Aki MacFarlane</td> </tr> <tr> <td>Caroline Shore</td> <td>Jeff Spies</td> <td>Victoria Stodden</td> <td>Maryann Martone</td> </tr> <tr> <td>Bobbie Spellman</td> <td>Alan Tomkins</td> <td>Deborah Sweet</td> <td>Anne Tsui</td> </tr> </tbody> </table>