
Contributors:
  1. Peter Graff



Category: Project

Description: While published linguistic judgments sometimes differ from the judgments found in large-scale formal experiments with naive participants, there is no consensus about how often such discrepancies occur or how often formal experiments should be used in syntax and semantics research. First, we present results of a large-scale replication of Sprouse, Schütze, and Almeida (2013) on 100 English contrasts randomly sampled from Linguistic Inquiry 2001-2010 and tested in both a forced-choice experiment and an acceptability rating experiment. Like Sprouse, Schütze, and Almeida, we find that the effect sizes of published linguistic acceptability judgments are not uniformly large or consistent but rather form a continuum from very large effects to small or non-existent effects. We then use these data as a prior in a Bayesian framework to propose a Small N Acceptability Paradigm for Linguistic Acceptability Judgments (SNAP Judgments). This proposal makes it easier and cheaper to obtain quantitative and statistically valid data in syntax and semantics research. Specifically, for a contrast of linguistic interest for which a researcher is confident that Sentence A is better than Sentence B, we recommend that the researcher obtain judgments from 7 unique participants, using 7 unique sentences of each type. If all 7 participants agree that Sentence A is better than Sentence B, then the researcher can be confident that the result of a full forced-choice experiment would likely be 75% or more agreement in favor of Sentence A (with a mean of 93%). We test this proposal by sampling from the existing data and find that it gives highly reliable performance.
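The beta-binomial arithmetic behind a recommendation of this kind can be sketched in a few lines of Python. This is only an illustrative sketch: the paper's prior is built from the replication data described above, whereas the snippet below assumes a uniform Beta(1, 1) prior for simplicity, so its numbers will not reproduce the 75% threshold or the 93% mean reported here.

```python
# Illustrative sketch of the Bayesian update underlying a SNAP-style
# recommendation. ASSUMPTION: a uniform Beta(1, 1) prior over the
# population agreement rate p; the paper instead uses an empirical prior
# estimated from its replication data, so these numbers are not the paper's.
from scipy.stats import beta

prior_a, prior_b = 1, 1      # assumed uniform prior on p
successes, trials = 7, 7     # all 7 participants prefer Sentence A

# Conjugate beta-binomial update: posterior is Beta(prior_a + k, prior_b + n - k)
post = beta(prior_a + successes, prior_b + trials - successes)

print(f"P(p >= 0.75 | 7/7 agree) = {post.sf(0.75):.3f}")   # ~0.90 under this prior
print(f"posterior mean of p      = {post.mean():.3f}")     # ~0.89 under this prior
```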

Files


Citation

