In the fall of 2017, we conducted a [pre-assessment][1] of prior knowledge for EUREKA, but we were unable to compare the answers to anything on the [post-assessment][2]. We did not have a pre-assessment for PROPEL 2018. The post-assessments for EUREKA (see file [here][3]) and PROPEL (see file [here][4]) included self-reported prior knowledge and student attitudes on the value of the lessons.

In the fall of 2018, we decided to use a learning gains assessment that did not include the biases associated with self-reporting and attitude and value judgments. We collaborated with the Eberly Center, CMU's teaching and learning hub, to design a counter-balanced assessment tool for measuring learning gains from our instruction. The counter-balanced design controls both for differences in the difficulty of the questions in the pre- and post-assessments and for order effects. We first created two versions of the assessment, A and B, with questions that tested the same concepts but were worded differently. We then gave half of the students [Assessment A][5] at the beginning of the lecture, followed by [Assessment B][6] at the end of the recitation section later that week. The other half of the students took Assessment B first, followed by A. The first three questions are the same on both versions of the assessment and were meant to assess whether students retained information they learned over the summer in the required online module on computing skills, C@CM. The assessments were integrated into Canvas, the university's learning management system, and the students received immediate feedback.

There was no statistically significant difference in the percent of students correctly answering each question on Assessments A and B, suggesting that the two versions were of similar difficulty. We also mapped the assessment questions onto learning objectives to determine the learning gains for each learning objective. See the attached [spreadsheet][7] for the learning gains on our assessment questions (Tab 2), the learning gains for each learning objective (Tab 1), and how the assessment questions mapped onto the learning objectives (Tab 2).

[1]: https://osf.io/df85t/
[2]: https://osf.io/fkj6b/
[3]: https://osf.io/fkj6b/
[4]: https://osf.io/vmxtz/
[5]: https://osf.io/6udng/
[6]: https://osf.io/dcuj7/
[7]: https://osf.io/brvck/
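For readers who want to run the same kind of per-question check on their own data, a minimal sketch is below. It compares the percent correct on each question between versions A and B with a Fisher's exact test on a 2x2 table; the question labels and counts are illustrative placeholders, not our actual results, and you would substitute the correct/incorrect counts exported from Canvas.

```python
# Sketch: per-question comparison of Assessment A vs. B.
# Counts are placeholders; replace with observed data.
from scipy.stats import fisher_exact

# question -> (correct_A, incorrect_A, correct_B, incorrect_B)
counts = {
    "Q1": (18, 4, 17, 5),
    "Q2": (15, 7, 14, 8),
    "Q3": (20, 2, 19, 3),
}

for q, (ca, ia, cb, ib) in counts.items():
    # 2x2 contingency table: rows = version A/B, columns = correct/incorrect
    _, p = fisher_exact([[ca, ia], [cb, ib]])
    pct_a = 100 * ca / (ca + ia)
    pct_b = 100 * cb / (cb + ib)
    print(f"{q}: A {pct_a:.0f}% correct, B {pct_b:.0f}% correct, p = {p:.3f}")
```

Fisher's exact test is used here rather than a chi-square test because it remains valid for the small per-question counts typical of a single course section; a large p-value for every question is consistent with the two versions being of similar difficulty.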