In multiattribute inferences, cross-validation studies of out-of-sample
accuracy have shown that fast and frugal heuristics such as take-the-best
often outperform weighted additive models such as Multiple Regression, Naïve
Bayes and Tally (Brighton & Gigerenzer, 2009). From a prescriptive
perspective, the results imply that decision accuracy may actually decrease
when attributes are searched beyond the most valid discriminating one. Lee
and Zhang (2012) showed that especially in redundant (positively correlated)
environments, the evidence gained from an exhaustive search of all
attributes will frequently predict the same option as the evidence gained
from the frugal search of the most valid discriminating attribute. In the
present study, we assessed how (1) simple and cognitively plausible
limitations of information search (search of only the 2 to 6 most valid attributes) and (2)
attribute redundancy, affect the relative out-of-sample performance of
strategies. The strategies were cross-validated on 15 real-world data sets
studied previously, and on simulated multivariate normal data with
redundancy manipulated via the covariance matrix. Our results show that
under high redundancy, Naïve Bayes and Tally with limited search achieve
accuracy similar to that of take-the-best, whereas the latter performs
relatively worse under low redundancy. We discuss the prescriptive implications for
information search, search termination and information combination in
multiattribute inferences.
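For binary cues, the two strategy families contrasted above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the tie-handling convention (returning `None` when no decision is reached), and the assumption that cues are coded 0/1 are all choices made here for clarity.

```python
import numpy as np

def take_the_best(a, b, validities):
    """Fast-and-frugal lexicographic strategy: inspect cues in
    descending validity order and decide on the first cue that
    discriminates between the two options."""
    order = np.argsort(validities)[::-1]  # most valid cue first
    for i in order:
        if a[i] != b[i]:
            return 0 if a[i] > b[i] else 1
    return None  # no cue discriminates: guess

def tally(a, b):
    """Unit-weight additive strategy: count positive cue values
    for each option and choose the option with the larger count."""
    sa, sb = int(a.sum()), int(b.sum())
    if sa == sb:
        return None  # tallies tie: guess
    return 0 if sa > sb else 1
```

Limiting search, as studied in the abstract, amounts to passing these functions only the k most valid cues instead of the full cue vector; take-the-best stops even earlier, at the first discriminating cue.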