On the Acquisition of Shared Grammatical Representations in Bilingual Language Models
Category: Project
Description: While crosslingual transfer is crucial to contemporary language models’ multilingual capabilities, how it occurs is not well understood. In this paper, we ask what happens to a monolingual language model when it begins to be trained on a second language. Specifically, we train small bilingual models for which we control the amount of data for each language and the order of language exposure. To find evidence of shared multilingual representations, we turn to structural priming, a method used to study grammatical representations in humans. We first replicate previous crosslingual structural priming results and find that after controlling for training data quantity and language exposure, there are asymmetrical effects across language pairs and directions. We argue that this asymmetry may shape hypotheses about human structural priming effects. We also find that structural priming effects are less robust for less similar language pairs, highlighting potential limitations of crosslingual transfer learning and shared representations for typologically diverse languages.
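The description above measures shared grammatical representations via structural priming: a model should assign higher probability to a target sentence in a given construction after seeing a prime in the same construction, even across languages. The following is a minimal sketch of how such an effect could be quantified with a causal language model; it is not the authors' code, the model name is a placeholder (the paper trains its own small bilingual models), and the Dutch/English sentences are illustrative rather than items from the paper's stimuli.

```python
# Sketch: structural priming as a difference in target log-probability after a
# structurally congruent vs. incongruent prime. Assumes a Hugging Face causal LM.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; the paper uses small bilingual models trained from scratch
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

def target_logprob(prime: str, target: str) -> float:
    """Summed log-probability of the target tokens, conditioned on the prime."""
    prime_ids = tokenizer(prime, return_tensors="pt").input_ids
    full_ids = tokenizer(prime + " " + target, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(full_ids).logits
    # log_probs[i] is the distribution over the token at position i + 1.
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    n_prime = prime_ids.shape[1]
    target_positions = range(n_prime - 1, full_ids.shape[1] - 1)
    return sum(log_probs[i, full_ids[0, i + 1]].item() for i in target_positions)

# Illustrative Dutch primes (prepositional-object vs. double-object) and an English
# prepositional-object target.
congruent = "De chef geeft een boek aan de leraar."    # PO prime
incongruent = "De chef geeft de leraar een boek."      # DO prime
target = "The girl gives a ball to the boy."           # PO target

priming_effect = target_logprob(congruent, target) - target_logprob(incongruent, target)
print(f"Log-probability boost from the congruent prime: {priming_effect:.3f}")
```

A positive difference would indicate a crosslingual structural priming effect, i.e., evidence that the prime's structure is represented in a way that transfers to the other language.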