The Colavita visual dominance effect is a psychological observation named after Francis B. Colavita, the psychologist who first gathered evidence of its existence in 1974.
Colavita observed that when human adults are presented with visual stimuli and other sensory stimuli (e.g., tactile or auditory) at the same time, they respond more to the visual stimuli and often fail to respond to the other sensory stimuli entirely.
Colavita’s findings suggest that vision is the dominant sense for most sighted humans. While some studies suggest that, in certain situations (e.g., when facing potential threats), some animals and humans become more reliant on auditory stimuli, the occurrence of the Colavita effect in non-threatening, “emotionally neutral” situations is now well documented.
More recently, some psychologists have found that while adults tend to respond more to visual stimuli, the Colavita effect might not apply to children. In contrast to adults, children appear to rely more on auditory stimuli when experiencing the world around them.
Researchers at Durham University in the U.K. recently carried out a study investigating this effect, known as the reverse Colavita effect, in children of different ages. Their paper, published in Elsevier’s Journal of Experimental Child Psychology, reported interesting new findings, suggesting that when trying to grasp the emotional aspects of an experience, children focus more on auditory stimuli than on visual stimuli.
“In the ’70s, scientists found that when presented with simultaneous flashes of light and auditory tones, adults showed a visual dominance and reported the visual flashes, which is now known as the Colavita effect,” Dr. Paddy Ross, one of the researchers who carried out the study, told Medical Xpress.
“In children, the opposite was true – they showed an auditory dominance and reported the tones (known as the reverse Colavita effect).
This held for some more complex semantic stimuli (pictures of animals, noises, etc.), but we wanted to know if it would still hold when using emotional information.”
In their experiments, Dr. Ross and his colleagues used two datasets compiled by other teams of researchers and widely used in psychological research: the emotional body stimuli (BEAST) dataset and the emotional non-verbal vocalizations (MAV) dataset.
They recruited 139 participants and divided them into three groups based on their age: one group of children up to seven years old, one group of older children (eight to 11 years old) and one group of adults (18 or above).
The researchers presented all participants with pairs of audio recordings and images of body postures conveying four primary emotions (i.e., joy, sadness, anger and fear) and asked them to describe what emotion they perceived from the stimuli.
In some cases, an audio recording matching the emotion presented in the image was presented at the same time. In other cases, however, the two stimuli were incongruent (e.g., an image of a happy person was paired with the recording of a sad non-verbal vocalization).
When a pair of stimuli was incongruent, participants were asked to either ignore the image and base their response on the audio recording, or vice versa. Moreover, all participants were presented with exactly the same pairs of stimuli to improve the experiment’s validity and prevent individual stimuli from influencing the results.
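The design described above — every combination of body-posture emotion and vocal emotion, presented under both "attend to the voice" and "attend to the image" instructions, with the same pairs shown to every participant — can be sketched as a fully crossed trial list. The emotion labels and the structure below are a hypothetical reconstruction for illustration, not the authors' actual stimulus set or code:

```python
import itertools
import random

# The four primary emotions used in the study.
EMOTIONS = ["joy", "sadness", "anger", "fear"]

def build_trials(seed=0):
    """Build a hypothetical trial list: every (body image, vocalization)
    emotion pairing, each presented under both task instructions."""
    rng = random.Random(seed)
    trials = []
    for image_emotion, voice_emotion in itertools.product(EMOTIONS, EMOTIONS):
        for attend in ("voice", "image"):  # which modality to base the response on
            trials.append({
                "image": image_emotion,
                "voice": voice_emotion,
                "attend": attend,
                "congruent": image_emotion == voice_emotion,
            })
    rng.shuffle(trials)  # same pairs for every participant, order randomized
    return trials

trials = build_trials()
print(len(trials))                              # 32 trials: 4 x 4 pairings x 2 instructions
print(sum(not t["congruent"] for t in trials))  # 24 of them are incongruent
```

Crossing all pairings under both instructions is what lets the design separate "cannot ignore the voice" from "cannot ignore the image" on the same stimuli.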
“We found that all age groups (under 8, 8-11, 18+) could easily ignore the image and focus on the voice,” Dr. Ross explained. “However, children found ignoring the voice extremely challenging. They performed below chance several times, so they weren’t simply guessing; the emotion of the voice was influencing their perception of the emotional body posture.”
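The "below chance" reasoning in the quote above can be made concrete with a one-sided binomial test: with four emotion options, guessing yields 25% accuracy, and an accuracy reliably *below* that implies the voice was systematically pulling responses away from the correct body emotion. The trial counts here are invented for illustration; the study's actual analyses are not reproduced:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# Hypothetical example: 4 correct out of 40 incongruent attend-image trials.
n_trials, n_correct, chance = 40, 4, 0.25

# One-sided test: how likely is a score this low if the child were guessing?
p_value = binom_cdf(n_correct, n_trials, chance)
print(round(p_value, 4))  # well below .05: worse than guessing, i.e. systematic interference
```

A guesser scores low by accident sometimes, but not reliably below chance; systematic below-chance performance is evidence that the ignored modality is driving the response.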
Dr. Ross and his colleagues were the first to report evidence of auditory dominance in children in the context of emotional expression. Their findings could soon inspire new studies examining the extent of this effect further (i.e., how much auditory stimuli affect how a child understands what is happening in his/her surroundings).
“Our study has several important implications, as it suggests that when a parent is communicating with a child and trying to hide anger or frustration with a smile, it might not matter,” Dr. Ross said. “In other words, ‘putting on a happy face’ when one is sad, for example, is unlikely to convince a child unless your voice sounds happy, too.”
According to Dr. Ross, these new findings could also have implications for teaching and education. In fact, due to the COVID-19 pandemic, many children are currently studying from home, where they might be more exposed to auditory distractions.
The observations reported in the study hint at the possibility that emotion-related stimuli in a child’s home (e.g., programs about COVID-19 on TV, family members arguing, etc.) could influence how a child engages with or perceives his/her schoolwork.
“We have several studies lined up to see how far we can push the effect we observed,” Dr. Ross added. “For example, we will be adding emotional faces into the mix and running another version of the experiment using emotional music instead of vocalizations. It could be the case that any emotional stimuli could be sufficient to influence a child’s visual perception, it might not even need to be human.”
When confronted with stimuli coming from different sensory modalities, humans often rely on the modality that is most precise or accurate for the given task (see, e.g., Ernst & Bülthoff, 2004; Welch & Warren, 1980; but see also Battaglia, Jacobs, & Aslin, 2003).
Both everyday experience and the available empirical evidence support the widely held impression that vision is typically the dominant sensory modality for humans in many situations (Posner, Nissen, & Klein, 1976; Rock & Harris, 1967; Rock & Victor, 1964; see also Cooper, 1998; Hohnsbein, Falkenstein, Hoormann, & Blanke, 1991; Klein, 1977; Quinlan, 2000), although it is possible, under certain conditions, to demonstrate sensory dominance by the auditory and/or somatosensory systems (Ernst & Banks, 2002; Ernst, Banks, & Bülthoff, 2000; Lederman & Abbott, 1981; Morein-Zamir, Soto-Faraco, & Kingstone, 2003; Sekuler, Sekuler, & Lau, 1997; Shams, Kamitani, & Shimojo, 2000).
Posner et al. (1976) argued that visual dominance might represent a by-product of attentional processes, hypothesizing that humans have a strong tendency to actively (i.e., endogenously) attend to visual events as a means of compensating for the poor alerting properties of the visual system (in comparison with the auditory or tactile system; see also Klein, 1977; Spence, Nicholls, & Driver, 2001; Spence, Shore, & Klein, 2001).
The dominance of the visual modality is not confined to humans. Shapiro, Jacobs, and LoLordo (1980) suggested that many other species may also be visually dominant under normal (i.e., nonaroused) conditions, since a majority of biologically important information is received visually.
Indeed, visual dominance effects over audition have now been reported in cows (Uetake & Kudo, 1994), pigeons (Foree & LoLordo, 1973; Kraemer & Roberts, 1985; Randich, Klein, & LoLordo, 1978), and rats (Meltzer & Masaki, 1973) with the use of food acquisition procedures (operant conditioning; see also Partan & Marler, 1999).
Interestingly, both animal and human subjects appear to “switch” their attention more toward the auditory modality under conditions of high arousal in order to react more rapidly to potential threats. For example, audition appears to be dominant for controlling avoidance behaviors, such as preventing electric shocks (see, e.g., Foree & LoLordo, 1973; Gilbert, 1969; Shapiro et al., 1980). Thus, in the case of animal research, vision seems to be dominant for certain behaviors, such as appetitive behaviors, whereas audition appears to be more dominant for others, such as avoidance behaviors.
The fact that visual dominance in animals can be reduced under the appropriate behavioral conditions suggests a role of attention in explaining why vision appears to be the dominant sense under normal conditions, supporting Posner et al.’s (1976) original proposition.
Colavita (1974) described one of the most dramatic examples of visual dominance, or prepotency (see Posner, Nissen, & Ogden, 1978; Welch & Warren, 1986, for early reviews of the phenomenon of visual dominance). In Colavita’s study, participants were asked to press one button whenever they heard a tone, and another button whenever they saw a light.
In the majority of trials, only one stimulus (a tone or a light) was presented unpredictably, and, as one would expect, participants responded both rapidly and accurately. A few trials (5 out of 35) interspersed throughout the experiment were bimodal, consisting of the simultaneous presentation of the tone and the light. Strikingly, in these bimodal trials, participants almost always failed to respond to the sound, pressing only the visual response button on 49 of 50 trials, across all 10 participants tested in the experiment.
Furthermore, a number of the participants reported that they did not even hear the auditory stimulus on the bimodal test trials on which they had responded to the light (16 of 49 trials). This tendency to respond only to the visual event was particularly surprising given that response latencies for unimodal auditory and visual stimuli were equivalent (297 vs. 299 msec, respectively). Moreover, when presented in separate blocks of experimental trials (i.e., under conditions of focused attention), participants typically responded more rapidly to the auditory stimulus than to the visual stimulus (Experiment 1, 179 vs. 197 msec, respectively).
In his pioneering study, Colavita (1974) informed participants that the rare bimodal trials occurring during the course of the experimental session were “accidental” and gave no specific instruction about how participants should respond on such trials—even going so far as to apologize to the participants for their very occurrence!
Although this was done in an attempt to keep participants from realizing the true purpose of the experiment, it does raise the possibility that task demands and/or experimenter expectancy effects may have contributed to the pattern of results obtained (see, e.g., Intons-Peterson, 1983; Orne, 1962; Pierce, 1908).
Moreover, in subsequent follow-up experiments (see, e.g., Egeth & Sager, 1977), the auditory and visual stimuli were presented from different spatial locations. Consequently, spatial attention may have played a role in determining the results, given the possible preference of observers for responding to stimuli presented at or near fixation (usually the visual event), rather than to stimuli presented elsewhere, such as inside the head (when presented over headphones) or from peripheral loudspeakers.
The present study had two goals.
The first was to extend the Colavita (1974) visual dominance effect to more complex situations. The majority of studies that have investigated the Colavita effect have used comparatively simple events—such as brief auditory beeps and light flashes—and have required that subjects passively wait for the occurrence of a stimulus requiring an imperative response (see Egeth & Sager, 1977; Hohnsbein et al., 1991; Quinlan, 2000; Shapiro, Egerman, & Klein, 1984; Shapiro & Johnson, 1987).
We extended the investigation of visual dominance to situations involving the presentation of more complex stimuli in a task involving the search for predetermined targets (pictures and sounds) embedded amongst a stream of distractor stimuli. Our reasoning was that this procedure should increase the overall perceptual load of the task (Lavie, 2005) in comparison with the simple detection of beeps and flashes used previously (see Colavita, 1974), perhaps representing everyday multisensory contexts more accurately (see also Basil, 1994).
This brings us to the second goal of the present study, which was to address the role that attention may play in visual dominance. Posner et al. (1976) tried to explain visual dominance as a by-product of the attentional system’s compensating for the poor alerting abilities inherent in the visual system.
However, in a series of experiments, Colavita and colleagues (Colavita, 1974; Colavita, Tomko, & Weisberg, 1976; Colavita & Weisberg, 1979) claimed that the phenomenon actually has a sensory basis instead, occurring regardless of the allocation of a participant’s attention.
As an improvement over Colavita’s (1974) original experimental design, participants in the present study were provided with three response keys corresponding to auditory, visual, and bimodal targets. The participants in all previous studies of the visual dominance effect by Colavita (1974; Colavita et al., 1976; Colavita & Weisberg, 1979) were provided with just two response keys (auditory and visual), thereby leaving open the possibility of motor interference on bimodal trials, in which participants were instructed to press both response keys at the same time.
By providing a third key for bimodal responses, any possible role of motor conflict at the response selection stage was reduced. Finally, we also eliminated the potential confounds introduced by the spatial layout of the stimuli in previous studies by presenting our auditory and visual events from the same spatial location.
Somewhat confusingly, the exact measure of visual dominance in the paradigm introduced by Colavita has varied across subsequent studies. In Colavita’s (1974, p. 411; Colavita et al., 1976, p. 25) original studies, the visual dominance effect was clearly defined as the failure of participants to respond to a sound when it was paired with a visual stimulus that would otherwise elicit a response when presented in isolation.
Although Colavita (1974, p. 411) also reported a trend across all experiments for auditory responses to be faster than visual responses (185 vs. 197 msec, respectively) when presented in separate (unimodal) experiment blocks, the error data can be interpreted clearly as long as the visual response latencies are not significantly faster than the auditory response latencies.
Egeth and Sager (1977) introduced a second definition of visual dominance.
In their study (Experiment 2), response latencies were compared between erroneous auditory and visual responses to bimodal targets; because RTs for auditory errors on bimodal trials were longer, the presence of the visual stimulus was taken to interfere with the perceptual processing of the auditory stimulus.
Egeth and Sager (1977) also presented additional data in line with Colavita’s (1974): When presented in separate blocks—that is, under conditions of focused attention—response latencies were shorter for auditory targets than for visual targets, whereas when auditory and visual targets were mixed in the same block, response latencies did not differ statistically.
In all of the experiments reported here, unimodal auditory and visual targets were interspersed within the same block. Therefore, the extent of visual dominance will be gauged in terms of the percentage of visually based errors relative to auditory-based errors on bimodal trials.
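Under that definition, the dominance score reduces to comparing error types on bimodal trials: of the trials where a participant wrongly pressed a unimodal key instead of the bimodal key, what fraction were visual presses? A minimal sketch, with invented trial records rather than real data:

```python
def colavita_index(bimodal_trials):
    """Given responses to bimodal (audiovisual) targets, return the
    percentage of erroneous unimodal responses that were visual.
    A 'visual' error means pressing only the visual key on a bimodal trial;
    'bimodal' is the correct response and is excluded from the score."""
    visual_errors = sum(1 for r in bimodal_trials if r == "visual")
    auditory_errors = sum(1 for r in bimodal_trials if r == "auditory")
    errors = visual_errors + auditory_errors
    if errors == 0:
        return None  # no unimodal errors: no dominance either way
    return 100.0 * visual_errors / errors

# Hypothetical responses on 10 bimodal trials.
responses = ["visual", "visual", "bimodal", "visual", "auditory",
             "visual", "bimodal", "visual", "visual", "visual"]
print(colavita_index(responses))  # 87.5 -> strong visual dominance
```

A score near 50% would indicate no modality dominance; the Colavita effect predicts values well above 50%, and a reverse Colavita effect values well below it.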
In Experiment 1, we tested the basic Colavita visual dominance effect using stimuli that were more complex than the beeps and flashes used in previous studies. We controlled for the spatial location of the stimuli presented in the two sensory modalities, as opposed to using headphones (Egeth & Sager, 1977), informed the participants clearly about the possible occurrence of bimodal trials, and provided a three-alternative response set that included one response key for each of the unimodal targets plus another response key for their combination (for bimodal targets).
If the Colavita visual dominance effect extends to the processing of complex stimuli, we would expect to see an imbalance in errors for bimodal targets, with the visual response being given significantly more often than the auditory response.
- Bald, L., Berrien, F. K., Price, J. B., & Sprague, R. O. (1942). Errors in perceiving the temporal order of auditory and visual stimuli. Journal of Applied Psychology, 26, 382-388.
- Basil, M. D. (1994). Multiple resource theory: I. Application to television viewing. Communication Research, 21, 177-207.
- Battaglia, P. W., Jacobs, R. A., & Aslin, R. N. (2003). Bayesian integration of visual and auditory signals for spatial localization. Journal of the Optical Society of America A, 20, 1391-1397.
- Broadbent, D. E. (1958). Perception and communication. Elmsford, NJ: Pergamon.
- Colavita, F. B. (1974). Human sensory dominance. Perception & Psy- chophysics, 16, 409-412.
- Colavita, F. B., Tomko, R., & Weisberg, D. (1976). Visual prepotency and eye orientation. Bulletin of the Psychonomic Society, 8, 25-26.
- Colavita, F. B., & Weisberg, D. (1979). A further investigation of visual dominance. Perception & Psychophysics, 25, 345-347.
- Cooper, R. (1998). Visual dominance and the control of action. In M. A. Gernsbacher & S. J. Derry (Eds.), Proceedings of the 20th Annual Conference of the Cognitive Science Society (pp. 250-255). Mahwah, NJ: Erlbaum.
- Desimone, R., & Duncan, J. (1995). Neural mechanisms of selective visual attention. Annual Review of Neuroscience, 18, 193-222.
- Driver, J., & Frackowiak, R. S. J. (2001). Neurobiological measures of human selective attention. Neuropsychologia, 39, 1257-1262.
- Egeth, H. E., & Sager, L. C. (1977). On the locus of visual dominance. Perception & Psychophysics, 22, 77-86.
- Eimer, M. (2000). The time course of spatial orienting elicited by central and peripheral cues: Evidence from event-related brain potentials. Biological Psychology, 53, 253-258.
- Ernst, M. O., & Banks, M. S. (2002). Humans integrate visual and haptic information in a statistically optimal fashion. Nature, 415, 429-433.
- Ernst, M. O., Banks, M. S., & Bülthoff, H. H. (2000). Touch can change visual slant perception. Nature Neuroscience, 3, 69-73.
- Ernst, M. O., & Bülthoff, H. H. (2004). Merging the senses into a robust percept. Trends in Cognitive Sciences, 8, 162-169.
- Foree, D. D., & LoLordo, V. M. (1973). Attention in the pigeon: Differential effects of food-getting versus shock-avoidance procedures. Journal of Comparative & Physiological Psychology, 85, 551-558.
- Frassinetti, F., Bolognini, N., & Làdavas, E. (2002). Enhancement of visual perception by crossmodal visuo-auditory interaction. Experimental Brain Research, 147, 332-343.
- Giard, M. H., & Peronnet, F. (1999). Auditory–visual integration during multimodal object recognition in humans: A behavioral and electrophysiological study. Journal of Cognitive Neuroscience, 11, 473-490.
- Gilbert, R. M. (1969). Discrimination learning? In R. M. Gilbert & N. S. Sutherland (Eds.), Animal discrimination learning (pp. 455- 489). New York: Academic Press.
- Hohnsbein, J., Falkenstein, M., Hoormann, J., & Blanke, L. (1991). Effects of crossmodal divided attention on late ERP components: I. Simple and choice reaction tasks. Electroencephalography & Clinical Neurophysiology, 78, 438-446.
- Intons-Peterson, M. J. (1983). Imagery paradigms: How vulnerable are they to experimenters’ expectations? Journal of Experimental Psychology: Human Perception & Performance, 9, 394-412.
- Kanwisher, N. (2001). Faces and places: Of central (and peripheral) interest. Nature Neuroscience, 4, 455-456.
- Klein, R. M. (1977). Attention and visual dominance: A chronometric analysis. Journal of Experimental Psychology: Human Perception & Performance, 3, 365-378.
- Kraemer, P. J., & Roberts, W. A. (1985). Short-term memory for simultaneously presented visual and auditory signals in the pigeon. Journal of Experimental Psychology: Animal Behavior Processes, 11, 137-151.
- Kristofferson, A. B. (1967). Attention and psychophysical time. Acta Psychologica, 27, 93-100.
- Lachter, J., Forster, K. I., & Ruthruff, E. (2004). Forty-five years after Broadbent (1958): Still no identification without attention. Psychological Review, 111, 880-913.
- Lavie, N. (2005). Distracted and confused?: Selective attention under load. Trends in Cognitive Sciences, 9, 75-82.
- Lederman, S. J., & Abbott, S. G. (1981). Texture perception: Studies of intersensory organization using a discrepancy paradigm, and visual versus tactual psychophysics. Journal of Experimental Psychology: Human Perception & Performance, 7, 902-915.
- Macaluso, E., & Driver, J. (2001). Spatial attention and crossmodal interactions between vision and touch. Neuropsychologia, 39, 1304-1316.
- McDonald, J. J., Teder-Sälejärvi, W. A., & Hillyard, S. A. (2000). Involuntary orienting to sound improves visual perception. Nature, 407, 906-908.
- Meltzer, D., & Masaki, M. A. (1973). Measures of stimulus control and stimulus dominance. Bulletin of the Psychonomic Society, 1, 28-30.
- Morein-Zamir, S., Soto-Faraco, S., & Kingstone, A. (2003). Auditory capture of vision: Examining temporal ventriloquism. Cognitive Brain Research, 17, 154-163.
- Orne, M. T. (1962). On the social psychology of the psychological experiment: With particular reference to demand characteristics and their implications. American Psychologist, 17, 776-783.
- Pallier, C., Dupoux, E., & Jeannin, X. (1997). EXPE: An expandable programming language for on-line psychological experiments. Behav- ior Research Methods, Instruments, & Computers, 29, 322-327.
- Partan, S., & Marler, P. (1999). Communication goes multimodal. Science, 283, 1272-1273.
- Pierce, A. H. (1908). The subconscious again. Journal of Philosophy, Psychology, & Scientific Methodology, 5, 264-271.
- Posner, M. I., Nissen, M. J., & Klein, R. M. (1976). Visual dominance: An information-processing account of its origins and significance. Psychological Review, 83, 157-171.
- Posner, M. I., Nissen, M. J., & Ogden, W. C. (1978). Attended and unattended processing modes: The role of set for spatial location. In H. L. Pick, Jr. & E. Saltzman (Eds.), Modes of perceiving and processing information (pp. 137-157). Hillsdale, NJ: Erlbaum.
- Quinlan, P. T. (2000). The “late” locus of visual dominance. Abstracts of the Psychonomic Society, 5, 64.
- Quinlan, P. T., & Bailey, P. J. (1995). An examination of attentional control in the auditory modality: Further evidence for auditory orienting. Perception & Psychophysics, 57, 614-628.
- Randich, A., Klein, R. M., & LoLordo, V. M. (1978). Visual dominance in the pigeon. Journal of the Experimental Analysis of Behavior, 30, 129-137.
- Rees, G., Frith, C. D., & Lavie, N. (2001). Processing of irrelevant visual motion during performance of an auditory task. Neuropsychologia, 39, 937-949.
- Rees, G., Russell, C., Frith, C. D., & Driver, J. (1999). Inattentional blindness versus inattentional amnesia for fixated but ignored words. Science, 286, 2504-2507.
- Rock, I., & Harris, C. S. (1967, May 17). Vision and touch. Scientific American, 216, 96-104.
- Rock, I., & Victor, J. (1964). Vision and touch: An experimentally created conflict between the two senses. Science, 143, 594-596.
- Rodway, P. (2005). The modality shift effect and the effectiveness of warning signals in different modalities. Acta Psychologica, 120, 199-226.
- Sekuler, R., Sekuler, A. B., & Lau, R. (1997). Sound alters visual motion perception. Nature, 385, 308.
- Shams, L., Kamitani, Y., & Shimojo, S. (2000). Illusions: What you see is what you hear. Nature, 408, 788.
- Shapiro, K. L., Egerman, B., & Klein, R. M. (1984). Effects of arousal on human visual dominance. Perception & Psychophysics, 35, 547-552.
- Shapiro, K. L., Jacobs, W. J., & LoLordo, V. M. (1980). Stimulus-reinforcer interactions in Pavlovian conditioning of pigeons: Implications for selective associations. Animal Learning & Behavior, 8, 586-594.
- Shapiro, K. L., & Johnson, T. L. (1987). Effects of arousal on attention to central and peripheral visual stimuli. Acta Psychologica, 66, 157-172.
- Sinnett, S., Costa, A., & Soto-Faraco, S. (2006). Manipulating inattentional blindness within and across sensory modalities. Quarterly Journal of Experimental Psychology, 59, 1425-1442.
- Snodgrass, J. G., & Vanderwart, M. (1980). A standardized set of 260 pictures: Norms for name agreement, image agreement, familiarity, and visual complexity. Journal of Experimental Psychology: Human Learning & Memory, 6, 174-215.
- Spence, C., Nicholls, M. E. R., & Driver, J. (2001). The cost of ex- pecting events in the wrong sensory modality. Perception & Psycho- physics, 63, 330-336.
- Spence, C., Shore, D. I., & Klein, R. M. (2001). Multisensory prior entry. Journal of Experimental Psychology: General, 130, 799-832.
- Spence, C., & Squire, S. (2003). Multisensory integration: Maintaining the perception of synchrony. Current Biology, 13, R519-R521.
- Titchener, E. B. (1908). Lectures on the elementary psychology of feel- ing and attention. New York: Macmillan.
- Turatto, M., Benso, F., Galfano, G., & Umiltà, C. (2002). Nonspatial attentional shifts between audition and vision. Journal of Experimental Psychology: Human Perception & Performance, 28, 628-639.
- Uetake, K., & Kudo, Y. (1994). Visual dominance over hearing in feed acquisition procedure of cattle. Applied Animal Behavior Science, 42.
- Welch, R. B., & Warren, D. H. (1980). Immediate perceptual response to intersensory discrepancy. Psychological Bulletin, 88, 638-667.
- Welch, R. B., & Warren, D. H. (1986). Intersensory interactions. In K. R. Boff, L. Kaufman, & J. P. Thomas (Eds.), Handbook of perception and human performance: Vol. 1. Sensory processes and perception (chap. 25, pp. 1-36). New York: Wiley.
- Whipple, G. M., Sanford, E. C., & Colegrove, F. W. (1899). On nearly simultaneous clicks and flashes: The time required for recognition: Notes on mental standards of length. American Journal of Psychology, 10, 280-295.
More information: Paddy Ross et al. Children cannot ignore what they hear: Incongruent emotional information leads to an auditory dominance in children, Journal of Experimental Child Psychology (2021). DOI: 10.1016/j.jecp.2020.105068