Item analysis: How to calculate the relative information entropy of test and survey items in R?
Description: One way to examine the variability of a test or survey item and express it in a single value is to compute the relative information content. The relative information content (also called relative entropy) is a dispersion measure for variables measured at least on a nominal scale, but it can also be calculated for variables at higher scale levels.
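As a minimal sketch of this idea, the relative entropy of an item can be computed in R as the Shannon entropy of the observed response frequencies divided by its maximum, log(k), where k is the number of categories. The function name `relative_entropy` and the example responses below are illustrative, not part of any particular package.

```r
# Relative information content (relative entropy) of an item.
# H_rel = H / log(k), where H is the Shannon entropy of the
# response distribution and k the number of observed categories.
relative_entropy <- function(x) {
  p <- prop.table(table(x))   # relative frequency of each category
  H <- -sum(p * log(p))       # Shannon entropy (natural logarithm)
  H / log(length(p))          # normalize by the maximum entropy log(k)
}

# Example: responses to a 5-point Likert item
responses <- c(1, 2, 2, 3, 3, 3, 4, 4, 5, 5)
relative_entropy(responses)
```

The result lies between 0 and 1: it equals 1 when all categories are chosen equally often (maximum dispersion) and approaches 0 as responses concentrate on a single category.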