
04. Unconference Sessions

<p><strong>SIPS 2017 UNCONFERENCE SESSIONS</strong></p> <p>Unconference sessions will be officially scheduled in the evening organizing session on July 29th. In preparation for that, you can share ideas you have for session topics here. The purpose of sharing these in advance is to identify people with shared interests so that they can be combined into a single session rather than producing lots of duplication in the scheduled unconference topics.</p> <p>Unconference sessions are intended to be conversations of people with similar interests. The organizer of an unconference session might seed the conversation with a short example, illustration, or idea, but they should NOT give a presentation. These are group participation events. To identify people with common interests, review the possible unconference topics below and either add your name to an existing topic or add a new topic with your name. This list of topics can be used as fodder for scheduling at the meeting. Expressing interest in a topic does not commit you to organizing or attending an unconference session on that topic.</p> <hr> <p>TOPIC: <strong>Can triggers be used effectively for promoting cultural change? (e.g., I will adopt open data for my papers when 500 other psychologists make the same commitment.)</strong></p> <p>INTERESTED: Brian Nosek (nosek@virginia.edu)</p> <hr> <p>TOPIC: <strong>How can we make psychology a more cumulative science?</strong> Many theories "belong" to an individual or group, which brings a host of difficulties. How can we change incentive structures in science to make it easier for the community to collaboratively test theories without becoming attached to them? 
</p> <p>INTERESTED: Brad Wyble (bwyble@gmail.com), Traci Mann (mann@umn.edu), Roger Giner-Sorolla (rsg@kent.ac.uk), Lee Jussim (leej12255@gmail.com), Simon Columbus (simon@simoncolumbus.com), Ben Levy (bjlevy3@usfca.edu)</p> <hr> <p>TOPIC: <strong>Efficient and inexpensive direct replications in Indonesia.</strong> An open psychology institute (<a href="https://igdore.org/indonesia/" rel="nofollow">Igdore Indonesia</a>) will soon open in Bali. Direct replications can be conducted remotely by foreign researchers at cost price: one full-time research assistant is &lt; 400 USD/month (a very good salary; insurance & taxes included). Bali attracts many visitors from WEIRD (Western, Educated, Industrialized, Rich, Democratic) populations, so we can recruit both Indonesians and foreigners. Let’s discuss how we can use Igdore Indonesia to get more replications done.</p> <p>INTERESTED: Rebecca Willén (rebecca.willen@igdore.org)</p> <hr> <p>TOPIC: <strong>What would an IRB and the review process look like if we could build a new IRB completely from scratch?</strong> This is on Igdore Indonesia’s to-do list at the moment. What should the application form include? What questions should it ask about open practices? How many board members should there be, and with what backgrounds? Interested in being a member of the <a href="https://igdore.org/indonesia/irb/" rel="nofollow">IRB</a>? How can virtual board meetings make the process efficient? What should be considered before publishing ethics applications and board decisions online?</p> <p>INTERESTED: Rebecca Willén (rebecca.willen@igdore.org)</p> <hr> <p>TOPIC: <strong>Promoting computational reproducibility in psychology</strong></p> <p>Computational reproducibility means that someone else can take your data and get the same numbers that you report in your paper. As journals move towards more data sharing, we need to think about how to share code and documentation of analyses as well, especially given the different platforms used by different labs.
<a href="https://twitter.com/mcxfrank/status/886281562399981568" rel="nofollow">Here's a tweetstorm on this topic, for inspiration.</a></p> <p>INTERESTED: MIKE FRANK (mcfrank@stanford.edu), Brad Wyble (bwyble@gmail.com), Rebecca Willén (rebecca.willen@igdore.org), Julia Rohrer (julia.rohrer@uni-leipzig.de), Mara Sedlins (mara.sedlins@duke.edu), Simon Columbus (simon@simoncolumbus.com)</p> <hr> <p>TOPIC: <strong>Facilitating scientific self-correction in psychology.</strong></p> <p>Psychology claims (or perhaps aspires) to be a "science," and one of the characteristics supposed to distinguish science from other approaches to understanding is that sciences (supposedly) self-correct. Strong self-correction (such as when ulcers were understood to be caused by bacteria, not stress) involves a field being ready, willing, and able to declare: 1. Our old belief was X. 2. X was completely wrong; the correct belief is not "X under some conditions," nor "well, a bit of X, a bit of Y, a bit of Z." 3. The correct belief is NOT X, but Y. What are the obstacles to strong self-correction in psychology, how can they be overcome, and how can effective self-correction (when warranted) be facilitated? (Possibly combinable with the cumulative science proposal above.)</p> <p>INTERESTED: LEE JUSSIM (leej12255@gmail.com), Roger Giner-Sorolla (rsg@kent.ac.uk).</p> <hr> <p>TOPIC: <strong>Statistical Estimation of Replicability and Publication Bias</strong></p> <p>One major problem in psychological science is that low statistical power combined with selection for significance leads to inflated effect sizes and low replicability. To address this problem, a number of statistical methods have been developed to assess the presence of publication bias and to estimate replicability.
The usefulness of these methods will be discussed.</p> <p>INTERESTED: Ulrich Schimmack (ulrich.schimmack@utoronto.ca); Marcel van Assen (m.a.l.m.vanassen@uvt.nl); Roger Giner-Sorolla (rsg@kent.ac.uk); Lee Jussim (leej12255@gmail.com)</p> <hr> <p>TOPIC: <strong>Examining and promoting quality of open data</strong></p> <p>Have you ever found that promised open data were not delivered, or that only summary data were provided? Having experienced this several times already, we at the meta-research centre at Tilburg University feel that now is the time to (i) examine the prevalence of promises of open data, (ii) examine the quality of promised open data in several journals, and (iii) provide guidelines for published open data, to be used by both authors and journal editors.</p> <p>INTERESTED: Marcel van Assen (m.a.l.m.vanassen@uvt.nl), Mara Sedlins (mara.sedlins@duke.edu), Jordan Axt (jordan.axt@gmail.com)</p> <hr> <p>TOPIC: <strong>Questionable Research Practices (QRP): A Board/Card Game</strong></p> <p>I (RGS) have finally combined my experience as a tabletop game player and designer with my interest in science reform. My game design, QRP, is meant to entertain and to teach about issues in research reporting and scientific careers. The basic play simulates questionable practices (selective reporting) and the possibility of fraud through bluffing. You draw cards from a deck to populate experiments with positive and negative results, which are then submitted for publication in a market that favors positive results. Your report is taken on faith but may be challenged as fraudulent. Players have secret scoring agendas: to advance their careers, to find the truth about the composition of the deck, or to support the "positive" or "negative" side of the argument. As the game progresses, more reforms are instituted and it becomes less necessary for the careerists to fudge data to get ahead.
I hope to bring a prototype and rules outline to the conference for discussion and a little guided playtest.</p> <p>INTERESTED: Roger Giner-Sorolla (rsg@kent.ac.uk), Brad Wyble (bwyble@gmail.com)</p> <hr> <p>TOPIC: <strong>Best practices for secondary data analyses, or "Replicable Research for Data Parasites"</strong><br> Many recent developments, such as pushes towards preregistration and data sharing, focus on PIs who run their own experiments and collect their own data. However, a considerable number of psychologists work with existing data sets, such as large-scale panel studies, and thus face (somewhat) different challenges.</p> <p>INTERESTED: Julia Rohrer (julia.rohrer@uni-leipzig.de), Mara Sedlins (mara.sedlins@duke.edu), Lisa Hoplock (lisa.hoplock@umanitoba.ca), Simon Columbus (simon@simoncolumbus.com), Jessica Flake (kayflake@gmail.com)</p> <hr> <p>TOPIC: <strong>Finding open materials once we share them: We need a central meta-database for experimental stimuli</strong></p> <p>The more we share materials online, the more of those materials there are. We have to grapple with the problem of making them findable and usable, but if we can solve it, we can open up whole new avenues for cumulative science: stimuli can be used across studies and populations (for replication & more), validated for their intended purposes, and used for novel ones. Creating appropriate stimuli is hard, and sometimes unnecessary: reusing them saves effort.</p> <p>We're the experts: we often know what is important to know about a stimulus or a set of them, but we don't have a common language or venue for sharing them with each other.
Let's make one!</p> <p>INTERESTED: Melissa Kline (melissa.e.kline@gmail.com), Rebecca Willén (rebecca.willen@igdore.org), Simon Columbus (simon@simoncolumbus.com)</p> <hr> <p>TOPIC: <strong>An organizational taxonomy for pre-prints on PsyArXiv</strong></p> <p>OSF and the PsyArXiv Steering Committee need your help creating a custom taxonomy for organizing manuscripts on PsyArXiv. With the help of SIPS attendees, we plan to map out a hierarchical framework of the disciplines of psychological science. This will improve the user experience on PsyArXiv by allowing more intuitive navigation of manuscripts by subject matter.</p> <p>INTERESTED: David Condon (david-condon@northwestern.edu), Ben Brown (bbrown6@ggc.edu), Nici Pfeiffer (nici@cos.io), Lisa Hoplock (lisa.hoplock@umanitoba.ca), Simon Columbus (simon@simoncolumbus.com), Melissa Kline (melissa.e.kline@gmail.com)</p> <hr> <p>TOPIC: <strong>Facilitating multi-site collaborations</strong><br> In this session, we plan to brainstorm ways that researchers can most effectively engage in a range of cross-lab collaborations and other crowdsourced research projects. Topics could include increasing sample sizes, concurrent replications, pre-publication independent replications, cross-cultural investigations, etc.</p> <p>INTERESTED: Christopher Chartier (cchartie@ashland.edu), Randy McCarthy (rmccarthy3@niu.edu)</p> <hr> <p>TOPIC: <strong>Where do people end up?</strong> Open science and SIPS are new enough, and the job market might fluctuate enough, that we probably can't make good guesses about the chances students and ECPs have of ending up in any given career path. Gathering data about what sorts of jobs/careers SIPS attendees have (and get, over time) could help newcomers to open science plot their careers or find people to help them get started in the field.</p> <p>Unsession Notes: <a href="http://tinyurl.com/ybr2oong" rel="nofollow">http://tinyurl.com/ybr2oong</a></p> <p>INTERESTED: Alex Uzdavines (xander211@gmail.com)</p>