# README

---

This repository contains files, documents, and information about the Statistics Teaching Inventory (STI).

![LASER Lab logo](https://files.osf.io/v1/resources/r4eag/providers/osfstorage/65cf7eae6d0cb8035e1a9707?mode=render =100x100)

---

### Description

The Statistics Teaching Inventory (STI) is an instrument designed to assess the instructional practices and beliefs of instructors of introductory statistics courses. The STI was initially developed as part of the NSF-funded project [Evaluation and Assessment of Teaching and Learning about Statistics (e-ATLAS; NSF DUE-1044812 & 1043141)](https://www.nsf.gov/awardsearch/showAward?AWD_ID=1044812).

### Development Process

The STI (v.1), which initially included 102 items, went through a rigorous development and evaluation process, including an online pilot administration with 101 voluntary USCOTS participants during the late spring and early summer of 2009. Cognitive interview data from 16 of the pilot respondents were also collected and analyzed as part of the validation process. For more detail regarding the development of the STI instrument, as well as analyses of the pilot administration data, see Zieffler et al. (2012).

Based on the pilot and interview data collected, the STI was revised to include 87 items; this became STI (v.2). At this time, items addressing the same content were grouped into seven sections: (1) Pedagogy; (2) Curricular emphasis; (3) Technology; (4) Assessment; (5) Beliefs; (6) Course characteristics; and (7) Instructor/institution characteristics.

#### STI (v.2)

Initially, the STI was developed for instructors teaching in a face-to-face format. Given the popularity of online and hybrid courses, restricting administration of the STI to face-to-face instructors would have severely limited its use in the type of large, national study proposed in the e-ATLAS project. In addition, items on the STI were written for instructors of courses that did not have a recitation section led by a teaching assistant, which also limited the potential sample. To overcome these limitations, four different forms of the STI were developed for varying instructional settings:

- Face-to-face course (no lab/recitation session led by a teaching assistant);
- Face-to-face course (with lab/recitation session led by a teaching assistant);
- Completely online course; and
- Hybrid course (mixture of face-to-face and online).

The first form developed was for instructors of face-to-face courses with no lab/recitation sessions (Form 1). The form was sent to an expert reviewer, a statistics education researcher with extensive experience in assessment research, and was revised several times based on this feedback. This form was then adapted for courses that included lab/recitation sessions (Form 2). The adaptations for this form primarily consisted of small changes to item stems. For example, items that asked about "the instructor" were changed to "the instructor or TA". Four additional items were added to the *Pedagogy* section of this form that asked about the time spent on certain teaching methods during the recitation section. Lastly, the response options for one item from the face-to-face form were expanded to allow for the potential of a broader TA role in these courses.

To create the forms for completely online courses (Form 3) and hybrid courses (Form 4), the initial face-to-face form was again modified. Most of the adaptations for these forms, again, came from changes to the item stems.
The largest modifications were made to an item asking about the frequency with which course content was presented primarily via lecture. For Form 3, this item was changed from "lecture" to "audio or video lectures" to better reflect practices in the online environment. An additional item was also added to Form 3 that asked whether the course content was presented primarily via readings. Form 4 kept the same item as the face-to-face form and also included the two additional items from Form 3.

![STI Table](https://osf.io/download/k4jvh/?direct%26mode=render)

In the summer of 2012, each of the four forms was administered during an online pilot study. A total of nine instructors (at three different institutions) piloted the instrument: three took the face-to-face (with no lab/recitation) form, one took the face-to-face with a lab/recitation form, three took the completely online form, and two took the hybrid course form. Each of these respondents also provided detailed comments and feedback on specific questions and on the instrument as a whole. Based on this feedback, revisions were made to each form to shorten the instrument and clarify some of the items. After these revisions were made, a statistics education expert, who had also taken part in the piloting process, reviewed the four forms. The feedback from this review led to a few more minor revisions of each form.

All four forms were finalized in August 2012 and formatted online using the survey platform Qualtrics. An initial item asking about the format of the course was also added; after the instructor indicated the course format, the online instrument directed them to the appropriate form. Data from the administration of STI (v.2) are presented in Fry et al. (2014).

#### STI (v.3)

The third version of the instrument (STI v.3) returned to a single form given to all instructors, regardless of course modality. Items asking about the course modality were included to capture differences. In 2019, the STI (v.3) was again adapted to focus only on the teaching practices of statistics instructors; all beliefs items were dropped from the instrument. Additional items related to computation and modern data practices were also included on the instrument, as well as items tapping recommendations in the 2016 GAISE document.

### Publications and Presentations

Fry, E. B., Garfield, J., Pearl, D., Zieffler, A., & delMas, R. (2014). [Statistics Teaching Inventory: Report of data collected in 2013.](https://osf.io/download/6610aa04219e712bc8f6ae0c/?direct%26mode=render) Unpublished executive report. University of Minnesota, Minneapolis.

Legacy, C., Le, L., Zieffler, A., Fry, E. B., & Vivas Corrales, P. (2024). The teaching of introductory statistics: Results of a national survey. *Journal of Statistics and Data Science Education.* <https://doi.org/10.1080/26939169.2024.2333732>

Vivas Corrales, P., Legacy, C., Le, L., Zieffler, A., & Fry, E. (2023, June). [Statistics Teaching Inventory: Exploring the alignment of introductory statistics instructors' teaching and assessment practices to professional recommendations.](https://osf.io/download/6610ad41e65c6029da7d9c83/?direct%26mode=render) Poster presented at the Research Satellite of the United States Conference on Teaching Statistics, State College, PA.

Zieffler, A., Park, J., Garfield, J., delMas, R., & Bjornsdottir, A. (2012). The Statistics Teaching Inventory: A survey on statistics teachers' classroom practices and beliefs.
*Journal of Statistics Education, 20*(1), 1-29. <http://jse.amstat.org/v20n1/zieffler.pdf>