For decades, research has shown that our perception of the world is influenced by our expectations.
These expectations, also called “prior beliefs,” help us make sense of what we are perceiving in the present, based on similar past experiences.
Consider, for instance, how a shadow on a patient’s X-ray image, easily missed by a less experienced intern, jumps out at a seasoned physician.
The physician’s prior experience helps her arrive at the most probable interpretation of a weak signal.
The process of combining prior knowledge with uncertain evidence is known as Bayesian integration and is believed to widely impact our perceptions, thoughts, and actions.
Now, MIT neuroscientists have discovered distinctive brain signals that encode these prior beliefs.
They have also found how the brain uses these signals to make judicious decisions in the face of uncertainty.
“How these beliefs come to influence brain activity and bias our perceptions was the question we wanted to answer,” says Mehrdad Jazayeri, the Robert A. Swanson Career Development Professor of Life Sciences, a member of MIT’s McGovern Institute for Brain Research, and the senior author of the study.
The researchers trained animals to perform a timing task in which they had to reproduce different time intervals.
Performing this task is challenging because our internal sense of time is imperfect and can run too fast or too slow.
However, when intervals are consistently within a fixed range, the best strategy is to bias responses toward the middle of the range.
This is exactly what animals did.
Moreover, recording from neurons in the frontal cortex revealed a simple mechanism for Bayesian integration:
Prior experience warped the representation of time in the brain so that patterns of neural activity associated with different intervals were biased toward those that were within the expected range.
MIT postdoc Hansem Sohn, former postdoc Devika Narain, and graduate student Nicolas Meirhaeghe are the lead authors of the study, which appears in the July 15 issue of Neuron.
Ready, set, go
Statisticians have known for centuries that Bayesian integration is the optimal strategy for handling uncertain information.
When we are uncertain about something, we automatically rely on our prior experiences to optimize behavior.
“If you can’t quite tell what something is, but from your prior experience you have some expectation of what it ought to be, then you will use that information to guide your judgment,” Jazayeri says. “We do this all the time.”
In this new study, Jazayeri and his team wanted to understand how the brain encodes prior beliefs and puts those beliefs to use in the control of behavior.
To that end, the researchers trained animals to reproduce a time interval, using a task called “ready-set-go.”
In this task, animals measure the time between two flashes of light (“ready” and “set”) and then generate a “go” signal by making a delayed response after the same amount of time has elapsed.
They trained the animals to perform this task in two contexts.
In the “Short” context, intervals varied between 480 and 800 milliseconds, and in the “Long” context, between 800 and 1,200 milliseconds.
At the beginning of the task, the animals were given the information about the context (via a visual cue), and therefore knew to expect intervals from either the shorter or longer range.
Jazayeri had previously shown that humans performing this task tend to bias their responses toward the middle of the range.
Here, they found that animals do the same. For example, if animals believed the interval would be short, and were given an interval of 800 milliseconds, the interval they produced was a little shorter than 800 milliseconds.
Conversely, if they believed it would be longer, and were given the same 800-millisecond interval, they produced an interval a bit longer than 800 milliseconds.
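This context-dependent bias is exactly what Bayesian integration predicts. As a rough illustration, here is a minimal sketch of Bayesian interval estimation, assuming a flat prior over the context’s range and Gaussian measurement noise; the noise level and grid resolution are illustrative choices, not values from the study.

```python
import numpy as np

def bayes_estimate(measured_ms, prior_lo, prior_hi, noise_sd=80.0):
    # Candidate "true" intervals allowed by the context (flat prior).
    grid = np.linspace(prior_lo, prior_hi, 1000)
    # Gaussian likelihood of the noisy measurement for each candidate.
    like = np.exp(-0.5 * ((measured_ms - grid) / noise_sd) ** 2)
    post = like / like.sum()   # flat prior, so posterior is the normalized likelihood
    return float(post @ grid)  # posterior mean = Bayes-optimal estimate

short_est = bayes_estimate(800, 480, 800)    # "Short" context
long_est = bayes_estimate(800, 800, 1200)    # "Long" context
```

With the same 800-millisecond measurement, the estimate falls below 800 ms in the Short context and above it in the Long context, mirroring the animals’ behavior.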
“Trials that were identical in almost every possible way, except for the animal’s belief, led to different behaviors,” Jazayeri says.
“That was compelling experimental evidence that the animal is relying on its own belief.”
Once they had established that the animals relied on their prior beliefs, the researchers set out to find how the brain encodes prior beliefs to guide behavior.
They recorded activity from about 1,400 neurons in a region of the frontal cortex, which they have previously shown is involved in timing.
During the “ready-set” epoch, the activity profile of each neuron evolved in its own way, and about 60 percent of the neurons had different activity patterns depending on the context (Short versus Long).
To make sense of these signals, the researchers analyzed the evolution of neural activity across the entire population over time and found that prior beliefs bias behavioral responses by warping the neural representation of time toward the middle of the expected range.
“We have never seen such a concrete example of how the brain uses prior experience to modify the neural dynamics by which it generates sequences of neural activities, to correct for its own imprecision. This is the unique strength of this paper: bringing together perception, neural dynamics, and Bayesian computation into a coherent framework, supported by both theory and measurements of behavior and neural activities,” says Mate Lengyel, a professor of computational neuroscience at Cambridge University, who was not involved in the study.
Researchers believe that prior experiences change the strength of connections between neurons.
The strength of these connections, also known as synapses, determines how neurons act upon one another and constrains the patterns of activity that a network of interconnected neurons can generate.
The finding that prior experiences warp the patterns of neural activity provides a window onto how experience alters synaptic connections.
“The brain seems to embed prior experiences into synaptic connections so that patterns of brain activity are appropriately biased,” Jazayeri says.
As an independent test of these ideas, the researchers developed a computer model consisting of a network of neurons that could perform the same ready-set-go task.
Using techniques borrowed from machine learning, they were able to modify the synaptic connections and create a model that behaved like the animals.
These models are extremely valuable because they provide a substrate for detailed analysis of the underlying mechanisms, a procedure known as “reverse-engineering.” Remarkably, reverse-engineering the model revealed that it solved the task the same way the monkeys’ brains did.
The model also had a warped representation of time according to prior experience.
The researchers used the computer model to further dissect the underlying mechanisms using perturbation experiments that are currently impossible to do in the brain.
Using this approach, they were able to show that unwarping the neural representations removes the bias in the behavior.
This important finding validated the critical role of warping in Bayesian integration of prior knowledge.
The researchers now plan to study how the brain builds up and slowly fine-tunes the synaptic connections that encode prior beliefs as an animal is learning to perform the timing task.
Funding: The research was funded by the Center for Sensorimotor Neural Engineering, the Netherlands Scientific Organization, the Marie Sklodowska Curie Reintegration Grant, the National Institutes of Health, the Sloan Foundation, the Klingenstein Foundation, the Simons Foundation, the McKnight Foundation, and the McGovern Institute.
How does the brain use mental concepts to modulate and determine what we think we see?
While it is not yet clear exactly how this occurs, a growing body of research offers clues. One important fact is that the brain works in some ways like television transmission: it processes stable backgrounds with little attention and moving parts more intensely and differently. Expectations come from the context of the scene, from previous experiences, and mostly from what has happened recently and repetitively. Many brain mechanisms evaluate which of a variety of probable events is actually occurring. When a decision is made as to which expected event is chosen, the brain will “see” that particular image. But how does expectation affect perception in the visual system?
Unexpected events have a particularly important role to play and have their own elaborate pathways. The unexpected and the successful expectation pathways interact and can suppress each other. Sometimes errors of expectation occur and sometimes, a completely new event occurs that has to be analyzed, synthesized and compared to our repertoire of past experiences. Recent research in babies shows that they respond most to unexpected events and use these to evaluate the environment and learn. In fact they will perform experiments with objects to learn about their properties. The unexpected is critical in the adult brain as well.
All types of sensory signals barrage the brain simultaneously. The human brain is especially geared for visual signals, devoting a large fraction of its cortex to their analysis. It is well known that there are far more neuronal signals coming down (top-down) from the higher cortex than coming in (bottom-up) from the senses. These top-down signals modulate and massage the incoming signals, ultimately determining what is gleaned from the sensory information.
Two different brain processes are at work: making predictions and taking in information over time to make a decision. It is not a simple linear progression of information being analyzed. Rather, it is a complex series of loops and feedback in which a large amount of top-down signals (expectations and possibilities) meet and modulate bottom-up sensory information, with back-and-forth communication. The brain signals travel both through wired neuronal connections and through wireless synchronous oscillations between regions.
When there are competing expectations, a longer process is triggered, with specific neuronal activity. When looking at a stable scene, brain activity occurs in multiple regions related to different possibilities and probabilities. As the decision making process continues, other parts of the brain become involved.
There is no generally accepted definition of the words “consciousness” and “attention.” For this discussion, attention means focusing on one particular part of the scene. Later, the complex question arises as to the relationship between “expectation” and “attention.” Attention adds meaning (salience) to the expectation that is being considered to form the perception.
Expectation and Perception
Perceptions are quite different for various individuals.
Color is manipulated by the brain. Animals need to know whether a particular food is dangerous or not (is it a ripe tomato or not), and therefore the brain constantly adjusts color balance, much like the program Photoshop. It ensures that the perceived color of an object doesn’t change even when the ambient light source changes.
Another filter of the sensory information occurs before the data reaches the cortex.
This emotional filter responds very rapidly to perceived threats, such as someone about to throw a punch.
The flinch occurs so rapidly that there is no time for cortical analysis; the signals are instead routed to the much closer emotional centers for a rapid reflex.
But the overarching analysis of visual signals depends on what is expected. In a famous experiment, people are told to concentrate on the activity of a group of people on a stage, such as passing a basketball.
Then a man in a gorilla suit walks across the stage. Surprisingly, most people do not “see” this man and don’t remember that it happened.
This is because the visual system is focused on a task and doesn’t expect a gorilla.
Top-Down Influence of the Cortex on Perception
Recent research shows that a large number of neurons come down from the cortex, analyzing, synthesizing, and altering the incoming sensory signals.
The top-down neurons from the cortex far outnumber the incoming sensory neurons.
Therefore, the influence of the brain and its expectations is far greater than that of the raw data.
There are many rarely noticed examples in daily life of top down effects on perceptions:
- A picture of a bright light causes the pupils to react, as if it were a real light.
- Thinking of good deeds makes a room appear brighter.
- Thinking of bad deeds makes it appear darker.
- When hungry, people perceive words related to food as brighter.
- Good hitters in baseball perceive the ball as larger.
- Poor children perceive coins as larger.
- Breathing patterns alter the perception of emotions and of different physical activities, for example changing fear to relaxation.
- Hunters holding a gun are more likely to think others are holding a gun, even when they aren’t.
- Large people judge a doorway as narrower than others do.
Words and thoughts alter sensory information:
- “She kicked the ball” or “grasped the subject” stimulates the leg or arm brain regions related to kicking or grasping.
- “Wet behind the ears” triggers brain regions of the sense of wetness and the ears, as well as the language centers.
- In experienced observers of ballet or classical Indian dance who have never danced themselves, watching a performance activates the brain regions controlling the specific muscles used in the dance.
- Imagining a scene before looking at it alters the perception; feeling an emotion before looking at the scene does the same.
For the brain to make decisions about what will occur next, it forms a bias.
This bias is based on past experience, especially on very recent and repeated events. Having just seen a friend several times, we assume a knock on the door is that friend.
It is very difficult to study how the brain accomplishes this.
Most studies of the visual system isolate specific stimuli, whereas in life, the visual scene is very complex.
Analysis of context allows expectation to form.
The brain will focus on known elements such as trees or buildings to form an impression of what should come next.
Movement is especially important in forming expectations.
A great surprise, however, such as a desk in the garden, might not even be “seen”.
In the brain, the unexpected produces slowing and errors, and often is suppressed.
Although experiments try to isolate the appearance and change of single objects, real-life scenes contain many familiar, everyday objects.
Hearing a bark usually means we will soon see a dog, not a gorilla. Children learn how to expect certain things from sounds and sights.
The brain uses the frequent objects, the probability of other objects and the recent scene before searching for answers somewhere else.
We assume certain colors will continue throughout a floor or wall. Context fills in dark areas we don’t fully understand, though not necessarily correctly.
And a person seen multiple times recently will be the first guess for a figure glimpsed in a dark corner.
One type of research studies the relative likelihood of visual evidence arising from a given stimulus.
Another tries to analyze information as it comes in over time, adding each new view to the decision making until there is a threshold of a decision.
Research has consistently shown that when observing random-dot motion, activity builds up in motor cortex areas over time.
Each alternative is analyzed and given some weight and probability in different brain regions.
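The second approach described above, accumulating evidence over time until a decision threshold is reached, can be sketched with a toy accumulator. The function name and all parameter values here are illustrative assumptions, not taken from any particular study.

```python
import random

def accumulate_to_threshold(p_up=0.6, threshold=20, seed=7):
    # Toy evidence accumulator: each noisy sample nudges a decision
    # variable up or down; a choice is made when it crosses a bound.
    rng = random.Random(seed)
    dv, n_samples = 0, 0
    while abs(dv) < threshold:
        dv += 1 if rng.random() < p_up else -1
        n_samples += 1
    return ("up" if dv > 0 else "down"), n_samples

choice, n = accumulate_to_threshold()  # decision plus time-to-decision
```

Noisier or more ambiguous evidence (p_up near 0.5) takes longer to reach the bound, matching the intuition that decisions about vague scenes unfold more slowly.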
There are different kinds of errors for fast information and slow information.
If a definite clue is given in the research trials, then there is a bias in that particular direction even before a stimulus is given.
During a visual learning task, regions of the brain are active before the next stimuli, both where the object is identified and in the entire scene where movement is picked up.
This brain activity occurs before the actual scene is given to the subject as a perception.
If cues are given that are usually followed by a visual event, there is a buildup of blood flow in the brain region for the expected object to appear or move.
If an image is shown that gradually morphs into a face, the region of the brain for faces (fusiform gyrus) is signaling long before it becomes a face.
There are many regions that alternate between activity and inactivity during the formation of the concept of what is being seen.
When a visual image is missing from a sequence, then the brain region of that missing image is active.
In illusions where the visual scene alters between a vase and a face, these brain regions are alternating high and low activity.
Prior Information and the Unexpected
When a particular object is expected, it dampens the blood flow in many regions of competing possibilities.
When a sudden unexpected event occurs, it requires a rapid recalculation and decision.
These sudden unusual sights rule out previous expectations of the most recent regular scenes.
Neurons that pick up the unexpected signal have the greatest effect on changing the expectation.
If, instead of a face, the image gradually forming in the experiment is a building, strong signals dampen the face center, overcoming the bias.
When unusual faces are presented, face centers suppress the expectation.
This alters the neural signals both before and during the act of seeing.
As the signal is processed, there is more elaborate analysis of the signal.
This happens especially when there are many possible different interpretations detected.
Different possibilities are presented at different levels in the analysis hierarchy.
Higher brain regions have more complex answers.
The probabilities of different options change over time. These different possibilities are presented and honed over and over and finally one is chosen as the proper interpretation.
Errors Comparing Predictions with Data
The visual cortex not only sees, but times its responses.
Updating of the expected and observed information occurs over time.
The repeated refinement eventually picks the winner.
But this choice can be in error. When error signals accumulate, they trigger a rapid recalculation, and a particular brain region is silenced by retrograde neural signals.
This resets the expectation and the level of expectation suppression.
When expectation of an object doesn’t pan out, then specific regions are inhibited and others are increased.
When a higher level is suppressed, it disrupts the lower processes. When these regions are deactivated by transcranial magnetic stimulation, the prediction doesn’t work.
Two separate pathways occur for predictions that are successful, and those that are errors with unexpected events.
Both have different top-down and bottom-up neurons.
These pathways use multiple neuron types and connections and, also, brain waves to form specific different circuits.
One region is the medial temporal lobe, which is active while perception decisions are made. But the striatum is also critical. The orbitofrontal cortex is involved in decisions, and the parahippocampal cortex is involved in context. In fact, decisions involve massive, millisecond-scale activity throughout the brain that is not currently measurable.
Most energy goes to calculating the most probable expectations. The regions that are active for the expectations can have increased activity, while the systems related to unexpected events can be suppressed.
Expectations actually reduce the amount of general activity, but specific regions are heightened. They inhibit regions representing competing expectations and repress new information, and therefore increase only the activity of the expected interpretation.
Attention and Expectation
Selective attention occurs when the brain finds some parts of a scene more relevant or important and these are selected for special processing.
Expectation often is the reason for the selective attention. Looking for a lost object (a missing car key) the brain focuses on the usual places for it. It is difficult to think of looking in the unusual locations where the lost object is hiding in plain sight.
Along with the impossible-to-define word “consciousness,” attention is also not definable, except in context.
Therefore, the specific brain regions involved in each are not really defined. Definitions of “attention,” “consciousness,” and “expectation” are logical definitions in a system of thought, rather than inherent biological facts.
For this discussion, expectation focuses on those aspects of the scene likely to be present.
Selective attention further focuses the brain’s search to understand the scene by choosing those sensory inputs that are deemed to be more important.
Some research teases apart the modulations that reflect the probability of a specific sensory input occurring from those that reflect the importance the observer places on it.
When the data are vague, prior probability operates more strongly.
When the data are very specific and strong, the meaningfulness of the input dominates the decision-making.
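One standard way to formalize this trade-off between prior and data, offered here as an illustrative sketch rather than a mechanism claimed by the text, is precision weighting of Gaussian beliefs: the noisier the data, the more the prior dominates the estimate.

```python
def combine(prior_mean, prior_sd, obs, obs_sd):
    # Precision-weighted average of a Gaussian prior and a Gaussian
    # observation; the weights are inverse variances (precisions).
    w_prior = 1.0 / prior_sd ** 2
    w_obs = 1.0 / obs_sd ** 2
    return (w_prior * prior_mean + w_obs * obs) / (w_prior + w_obs)

vague = combine(0.0, 1.0, 10.0, obs_sd=5.0)  # noisy data: estimate stays near the prior
sharp = combine(0.0, 1.0, 10.0, obs_sd=0.5)  # precise data: estimate follows the data
```

The same observation pulls the estimate far from the prior only when it is precise; vague data leave the prior in control.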
Attention naturally ignores very unexpected or extremely vague information.
But, also, attention makes the prediction occur more rapidly.
Attention and expectation interact in many ways.
Expectations can be about specific parts of the scene and attention decides how important it is.
Errors occur in predictions and these are constantly being altered and updated.
Attention can alter the amount of errors. Attention affects whether there is more needed information.
Expectations and their probabilities are more important for determination of the response.
Attention makes more details apparent. Both can increase and alter the perceptual decision.
In one study, when there was little attention, expectation limited sensory information and reactions.
With attention, the opposite occurred: there were more data and more reaction.
Attention, by focusing on errors, can keep the top-down neurons from suppressing unexpected information.
Attention makes errors more available for analysis. Research shows that attention strongly affects errors in prediction.
Repetition of a scene reveals movement and change and is highly related to expectation and attention.
The default of repetition is stability, not change.
When a neuron fires repetitively, the sensory signal decreases.
This could be nerve fatigue. Or it could be the suppression of expectation.
When the repetitions are more expected, the signals are greater. Repetition suppression begins at 50 ms and expectation suppression at 100 ms.
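The decline in response with repetition can be caricatured with a simple adaptation model; the decay factor below is an arbitrary illustrative value, not a measured quantity.

```python
def adapted_responses(n_repeats, r0=1.0, decay=0.7):
    # Toy repetition-suppression model: each presentation scales the
    # neural response by a fixed adaptation factor.
    responses, r = [], r0
    for _ in range(n_repeats):
        responses.append(r)
        r *= decay
    return responses

resp = adapted_responses(4)  # each repetition evokes a weaker response
```

Whether such a decline reflects simple fatigue or active expectation suppression is exactly the distinction the timing data above (50 ms versus 100 ms) are used to probe.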
How Does Expectation Affect Perception?
The brain gives us a continuous panorama, accompanied by a sound track, by piecing together millions of bits of information in a very narrow range of available light waves and sound waves.
The brain never really sees or hears anything.
It just responds to and interprets electrical signals.
It makes us believe we are seeing what is “really out there” and creates in us an assumption that everyone else sees exactly the same thing.
Meaning and emotions are attached to specific electrical and sensory signals.
The brain has many interacting pathways and loops that create expectations with different probabilities from our previous experiences.
The context and movement in the scene stimulate the brain to prioritize possible future events. It gathers information and then, at some point, guesses at what is happening and creates the visual scene for our perception.
This can be correct or an error. The unexpected event is more difficult to process, because of the bias toward what has occurred recently and repetitively.
These competing loops can have many different effects.
The complexity of perception formation is more evidence of the way the brain operates as a whole rather than in modules.
Along with the vast array of neuroplasticity mechanisms that occur simultaneously all over the brain for learning, isn’t this perception process best viewed as the brain interacting with mind?
Anne Trafton – MIT
The image is credited to Christine Daniloff, MIT.
Original Research: Closed access
“Bayesian Computation through Cortical Latent Dynamics” by Hansem Sohn, Devika Narain, Nicolas Meirhaeghe, and Mehrdad Jazayeri in Neuron.