Item analysis: How to calculate the relative information entropy of test and survey items in R?

Contributors:

Date created: | Last Updated:



Category: Project

Description: One way to examine the variability of a test or survey item and express it in a single value is to compute the relative information content. The relative information content (also called relative entropy) is a dispersion measure for variables that are at least nominally scaled, but it can also be computed for variables on higher scale levels.

Wiki

A basic requirement for test and survey items is that they are able to detect variance with respect to a latent variable. To do this, item scales must discriminate between test subjects and must have a systematic, clear, and sufficiently strong relationship with the underlying construct. One possibility to examine the variability of an item and express it in a single value is the computation of the relative information content (relative entropy).
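A minimal sketch of this computation in R follows. The relative information content is the Shannon entropy of the observed category proportions divided by the maximum possible entropy, log2(k), reached when all k categories are equally frequent. The function name `rel_entropy` and the sample Likert item are illustrative assumptions, not taken from the project files.

```r
# Relative information content (relative entropy) of a nominal item.
# Assumption: k is taken as the number of *observed* categories; pass the
# number of possible response categories explicitly if some are unused.
rel_entropy <- function(x, k = length(unique(x))) {
  p <- prop.table(table(x))        # observed category proportions
  if (k < 2) return(0)             # a constant item carries no information
  H     <- -sum(p * log2(p))       # Shannon entropy in bits
  H_max <- log2(k)                 # maximum entropy: uniform distribution
  H / H_max                        # relative entropy in [0, 1]
}

# Example: responses on a 4-point Likert item
item <- c(1, 2, 2, 3, 3, 3, 4, 4)
rel_entropy(item)                  # approx. 0.953
```

A value near 1 indicates that responses spread almost uniformly across the categories; a value of 0 indicates no variability at all.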

