**Introduction**
----------------

Dynamic assessments, which emulate the learning process by having the examiner provide feedback and prompts to a child during the test, are a promising alternative to traditional static assessments, which measure only the learning product or acquired knowledge and involve no prompting or feedback (Grigorenko & Sternberg, 1998). Such tests have been developed, tested and used with a wide variety of populations across multiple domains, but they may be of particular use for at-risk or bilingual groups, who can be disadvantaged on static measures by their exposure to and experience with the language of testing (Vellutino et al., 1998; Samson & Lesaux, 2009). A review by Caffrey et al. (2008) explored the predictive validity of dynamic assessment in multiple academic skills with four distinct groups, merging at-risk and bilingual children together in its analysis. Results of this review and of the included studies indicated that this type of test demonstrated consistent predictive validity for later reading outcomes, but provided less conclusive evidence to support the use of dynamic tools with the combined at-risk bilingual group. We argue that grouping bilingual and at-risk populations together is problematic: bilingual children are not inherently at risk because of their bilingualism, and they should be separated from at-risk groups to determine the true value of these types of tests with each group. This idea is supported by a second review, from 2019, which explored the use of dynamic assessments for evaluating oral language skills in bilingual children and found that these tests are promising for identifying oral language difficulties in bilingual children (Orellana et al., 2019). Therefore, this project intends to update and expand on these two previous reviews by exploring the concurrent and predictive validity of dynamic assessments in six distinct populations, which separate at-risk, monolingual and bilingual populations, among others.

**Hypotheses**
--------------

We hypothesize that:

1. Regarding concurrent validity:

   A) Dynamic assessments of decoding skills, phonological awareness and sound-symbol knowledge will demonstrate strong concurrent validity with static tests of decoding, phonological awareness and sound-symbol knowledge.

   B) Dynamic assessments of decoding skills will demonstrate varying degrees of concurrent validity with static measures of decoding, phonological awareness and sound-symbol knowledge across the six population groups: (i) typically developing monolingual children, (ii) typically developing bilingual children, (iii) monolingual children at risk for reading challenges, (iv) bilingual children at risk for reading challenges, (v) monolingual children diagnosed with reading difficulties, and (vi) bilingual children diagnosed with reading difficulties (e.g., a higher degree of concurrent validity between measures in group (i) than in the other groups).

2. Regarding predictive validity:

   A) Dynamic assessments of decoding skills, phonological awareness and sound-symbol knowledge will demonstrate strong predictive validity for later reading achievement outcome measures (single word and nonword reading and writing, as well as passage-level decoding tasks).
   B) Dynamic assessments of decoding skills will demonstrate a similar degree of predictive validity with later reading achievement outcome measures across the six populations: (i) typically developing monolingual children, (ii) typically developing bilingual children, (iii) monolingual children at risk for reading challenges, (iv) bilingual children at risk for reading challenges, (v) monolingual children diagnosed with reading difficulties, and (vi) bilingual children diagnosed with reading difficulties (i.e., a similar degree of predictive validity across population groups).

**Methods**
-----------

Searches will be conducted in the Medline, Embase, PsycINFO and ERIC databases, as well as in grey literature sources such as Google Scholar, OpenGrey, PsyArXiv and medRxiv, using two concepts: (i) dynamic assessment and (ii) literacy. Only primary studies available in English, French or Spanish will be included. Studies must evaluate the concurrent and/or predictive validity of dynamic assessment with children up to the age of 10 in the domain of decoding. Study management, including title and abstract screening, full-text screening, data extraction and quality appraisal, will be conducted in Covidence.

**Projected Analyses**
----------------------

Preliminary searches indicate that selected studies are likely to report associations in the form of correlation coefficients. For this reason, a correlational meta-analysis of study results is planned (an illustrative sketch follows the Summary below). This analysis will answer the overall questions about the concurrent and predictive validity of these tests across groups. Should there be sufficient data, additional correlational analyses will explore the tests' validity within each group. Should there be insufficient data within a population group, a narrative review of the findings will be provided instead.

**Summary**
-----------

This systematic review and meta-analysis aims to evaluate the validity of dynamic assessments of decoding skills, both in terms of their concurrent validity with established static measures and their predictive validity for future reading achievement outcomes. Validity will be explored for each type of dynamic test (decoding, phonological awareness and sound-symbol knowledge) across all populations, and with combined tests, specifically for the six populations listed above: (i) typically developing monolingual children, (ii) typically developing bilingual children, (iii) monolingual children at risk for reading challenges, (iv) bilingual children at risk for reading challenges, (v) monolingual children diagnosed with reading difficulties, and (vi) bilingual children diagnosed with reading difficulties. The goal is to determine which types of dynamic assessments of early literacy are most consistently valid, and for whom dynamic assessments of decoding are most consistently concurrently and predictively valid.
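To make the planned correlational meta-analysis concrete, the sketch below pools study-level correlation coefficients using a Fisher's z transformation and a DerSimonian-Laird random-effects model. The correlations and sample sizes are placeholders, and the hand-coded numpy implementation and the particular between-study variance estimator are assumptions for illustration only; the protocol does not commit to a specific estimator or software.

```python
# Minimal sketch of a correlational meta-analysis, assuming each included
# study reports a Pearson correlation (r) between a dynamic assessment score
# and a static measure or later reading outcome, plus a sample size (n).
# The study values below are hypothetical placeholders, not extracted data.

import numpy as np

# Hypothetical extracted data: (r, n) per study
studies = [(0.55, 60), (0.42, 85), (0.61, 40), (0.38, 120)]

r = np.array([s[0] for s in studies], dtype=float)
n = np.array([s[1] for s in studies], dtype=float)

# Fisher's z transformation stabilizes the variance of r
z = np.arctanh(r)            # z_i = 0.5 * ln((1 + r_i) / (1 - r_i))
v = 1.0 / (n - 3.0)          # approximate within-study variance of z_i
w = 1.0 / v                  # fixed-effect (inverse-variance) weights

# DerSimonian-Laird estimate of between-study variance (tau^2)
z_fixed = np.sum(w * z) / np.sum(w)
Q = np.sum(w * (z - z_fixed) ** 2)
df = len(z) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)

# Random-effects pooled estimate and 95% CI on the z scale
w_re = 1.0 / (v + tau2)
z_pooled = np.sum(w_re * z) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
ci_z = (z_pooled - 1.96 * se, z_pooled + 1.96 * se)

# Back-transform to the correlation scale for reporting
r_pooled = np.tanh(z_pooled)
ci_r = (np.tanh(ci_z[0]), np.tanh(ci_z[1]))
print(f"Pooled r = {r_pooled:.2f}, "
      f"95% CI = ({ci_r[0]:.2f}, {ci_r[1]:.2f}), tau^2 = {tau2:.3f}")
```

In practice, the same pooling would more likely be carried out with a dedicated meta-analysis package (for example, metafor in R), with separate models fit for concurrent and predictive correlations and, data permitting, for each of the six population groups.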