One of the most widespread views in vision research is that top-down control over visual selection is achieved by tuning attention to a particular feature value (e.g., red/yellow). Contrary to this view, previous spatial cueing studies showed that attention can be tuned to relative features of a search target (e.g., redder): An irrelevant distractor (cue) captured attention when it had the same relative color as the target (e.g., redder) and failed to capture attention when it had a different relative color, regardless of whether the distractor was similar or dissimilar to the target. The present study tested whether the same effects would be observed for eye movements when observers had to search for a color or shape target and when selection errors were highly noticeable (resulting in an erroneous eye movement to the distractor). The results corroborated the previous findings, showing that capture by an irrelevant distractor does not depend on the distractor's similarity to the target but on whether it matches or mismatches the relative attributes of the search target. Extending previous work, we also found that participants can be pretrained to select a color target by virtue of its exact feature value. Contrary to the prevalent feature-based view, the results suggest that visual selection is preferentially biased toward the relative attributes of a search target. At the same time, however, visual selection can be biased toward specific color values when the task requires it, which rules out a purely relational account of attention and eye movements.