We experimentally test two information-theoretic measures of relevance: entropy reduction (van Rooy, 2004; Rothe et al., 2018) and KL-divergence (Nelson et al., 2010; Hawkins et al., 2015). Results show that KL-divergence fits introspective relevance judgments better than entropy reduction. However, neither measure is adequate on its own.
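As a minimal sketch of how the two measures can come apart, assuming discrete belief distributions over hypotheses (the distributions and the example numbers below are hypothetical, not from the study): entropy reduction scores an answer by how much it lowers uncertainty, H(prior) − H(posterior), while KL-divergence scores it by how much it moves beliefs, D_KL(posterior ‖ prior).

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def entropy_reduction(prior, posterior):
    """Relevance as uncertainty reduction: H(prior) - H(posterior)."""
    return entropy(prior) - entropy(posterior)

def kl_divergence(posterior, prior):
    """Relevance as belief change: D_KL(posterior || prior), in bits."""
    post = np.asarray(posterior, dtype=float)
    pri = np.asarray(prior, dtype=float)
    mask = post > 0
    return np.sum(post[mask] * np.log2(post[mask] / pri[mask]))

# Hypothetical case: an answer flips belief between two hypotheses
# without making the agent any more certain overall.
prior     = [0.8, 0.2]
posterior = [0.2, 0.8]
print(entropy_reduction(prior, posterior))  # 0.0  -- no net uncertainty reduced
print(kl_divergence(posterior, prior))      # ~1.2 -- yet beliefs changed a lot
```

Here entropy reduction judges the answer irrelevant (the posterior is exactly as uncertain as the prior), while KL-divergence registers a large belief change, illustrating one way the two measures can dissociate in relevance judgments.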