Academic journal article: Attention, Perception, & Psychophysics

Temporal Dynamics of Unimodal and Multimodal Feature Binding

Article excerpt

In two experiments, we studied the temporal dynamics of feature integration with auditory (Experiment 1) and audiovisual (Experiment 2) stimuli and manual responses. Consistent with previous observations, performance was better when the second of two consecutive stimuli shared all or none of the features of the first than when only one of the features overlapped. Comparable partial-overlap costs were obtained for combinations of stimulus features and responses. These effects decreased systematically with increasing time between the two stimulus-and-response events, and the rate of decrease was comparable for unimodal and multimodal bindings. Overall effect size reflected the degree of task relevance of the dimension or modality of the respective feature, but the effects of relevance and of temporal delay did not interact. This suggests that the processing of stimuli in task-relevant sensory modalities and on task-relevant feature dimensions is facilitated by task-specific attentional sets, whereas the temporal dynamics might reflect that bindings "decay" or become more difficult to access over time.

One of the challenges human perception poses is understanding how the brain binds codes of features within and across sensory modalities, despite these codes' being processed in various cortical areas (e.g., Goldstein, 2007; Wessinger et al., 2001; Zeki & Bartels, 1999). This so-called binding problem was investigated initially in the visual domain (see, e.g., Allport, Tipper, & Chmiel, 1985; Kahneman, Treisman, & Gibbs, 1992; Treisman & Gelade, 1980), then in the auditory domain (see, e.g., Dyson & Quinlan, 2004; Hall, Pastore, Acker, & Huang, 2000; Takegata et al., 2005), and more recently across modalities, such as vision, audition, and taction (Zmigrod, Spapé, & Hommel, 2009). The available evidence suggests that binding mechanisms operate both within and across modalities and seem to bind perceptual features, regardless of their origin.

Moreover, sequential-effects studies provide evidence that response-related features are also integrated with stimulus features into what Hommel (1998, 2004) has called event files: integrated episodic traces of all the perceptual and action features related to a particular event. In these sequential-effects studies, participants typically carry out two responses in a row (see Figure 1). First, they see a response cue that signals the first response (R1), which, however, is to be carried out only after a trigger stimulus (S1) is presented. After a short stimulus onset asynchrony (SOA) or response-stimulus interval (RSI), the second stimulus (S2) appears and calls for a binary-choice response to one of its features (R2). Similar to the findings from visual and auditory studies, main effects of stimulus-feature repetition were obtained. More interestingly, interactions between different stimulus-feature repetition effects and between stimulus-repetition and response-repetition effects were observed for visual features and responses (Hommel, 1998, 2005), auditory features and responses (Mondor, Hurlburt, & Thorne, 2003; Zmigrod & Hommel, 2009), and tactile features and responses (Zmigrod et al., 2009).
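To make the trial structure concrete, the following is a minimal sketch in Python of how a single two-event trial and its feature-repetition profile might be represented. All names, feature values, and the RSI value are illustrative assumptions for exposition, not the authors' actual materials or code.

```python
# Hedged sketch of the sequential-effects (event-file) trial structure
# described above: a precued response R1, a trigger stimulus S1, a
# response-stimulus interval, then S2 calling for a binary-choice R2.
# All concrete values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Trial:
    r1: str       # precued first response, e.g., "left" or "right"
    s1: dict      # features of the first stimulus
    s2: dict      # features of the second stimulus
    r2: str       # binary-choice response signaled by one S2 feature
    rsi_ms: int   # response-stimulus interval between R1 and S2 onset

def repetition_profile(trial: Trial) -> dict:
    """Label each stimulus feature, and the response, as repeated or alternated."""
    profile = {f: ("repeated" if trial.s1[f] == trial.s2[f] else "alternated")
               for f in trial.s1}
    profile["response"] = "repeated" if trial.r1 == trial.r2 else "alternated"
    return profile

t = Trial(r1="left",
          s1={"pitch": "low", "color": "blue"},
          s2={"pitch": "low", "color": "red"},
          r2="right",
          rsi_ms=500)
print(repetition_profile(t))
# {'pitch': 'repeated', 'color': 'alternated', 'response': 'alternated'}
```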

These observations suggest that stimulus and response features are spontaneously integrated into multimodal event files, which are retrieved whenever at least one feature repeats. Assume, for instance, that S1 and S2 consist of varying combinations of auditory pitch (low vs. high) and visual color (red vs. blue), as in the study by Zmigrod et al. (2009). Findings show that a complete repetition of both features (e.g., S1 = blue + low → S2 = blue + low) or a complete alternation (e.g., red + high → blue + low) produces better performance than do partial repetitions (e.g., red + low → blue + low, or blue + high → blue + low). This suggests that the combination presented as S1 is automatically integrated and retrieved upon repetition of any feature. …
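The partial-overlap pattern described above is commonly quantified as a partial-repetition cost: mean performance in the partial-overlap conditions minus mean performance in the complete-repetition and complete-alternation conditions. The sketch below illustrates this contrast; the reaction times are made-up placeholder numbers, not data from the article.

```python
# Sketch of a partial-repetition-cost contrast for two features
# (e.g., pitch and color). Positive values indicate the binding cost
# that partial overlap incurs relative to complete repetition or
# complete alternation. The RTs below are invented placeholders.

def partial_repetition_cost(rt):
    """rt maps (pitch_label, color_label) repetition pairs to mean RT in ms."""
    partial = (rt[("repeated", "alternated")] +
               rt[("alternated", "repeated")]) / 2
    complete = (rt[("repeated", "repeated")] +
                rt[("alternated", "alternated")]) / 2
    return partial - complete

example_rts = {
    ("repeated", "repeated"):     480.0,  # e.g., blue + low  -> blue + low
    ("alternated", "alternated"): 485.0,  # e.g., red  + high -> blue + low
    ("repeated", "alternated"):   510.0,  # e.g., blue + high -> blue + low (wait, pitch repeats)
    ("alternated", "repeated"):   515.0,  # e.g., red  + low  -> blue + low
}
print(partial_repetition_cost(example_rts))  # 30.0 ms binding cost
```

Note the key mapping is (pitch, color): ("repeated", "alternated") means pitch repeated while color changed, as in blue + low → red + low. The same contrast can be computed for stimulus-response pairs, which is how the partial-overlap costs for feature-response bindings mentioned in the abstract would be measured.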
