Real-world visual search targets are frequently imperfect perceptual matches to our internal target templates. For example, the same friend is likely to wear different clothes, hairstyles, and accessories on different occasions, but some of these features may be more likely to vary than others. The ability to accommodate template-to-target variability is important for visual search in natural environments, yet we know relatively little about how the attentional system handles feature variability. In these studies, we test the hypothesis that top-down attentional biases are sensitive to the variance of target feature dimensions over time and prioritize information from less-variable dimensions.