The concept of implicit bias has made its way into the general consciousness, most often in the context of racial bias. More broadly, however, implicit biases can affect how people think of anything – from their thoughts about cookies to those about white men.
“All the little ways in which our everyday thinking about social stuff is unconscious or uncontrollable,” wrote Calvin Lai, assistant professor of psychology in Arts & Sciences at Washington University in St. Louis, in an article in DCist. “The stuff that we don’t realize is influencing us when we make decisions.”
Along with broader cultural awareness of implicit bias has come the idea that the actions it influences can be changed by eliminating the bias itself.
Change the bias, the thinking goes, and changes in behavior will follow. It seems logical enough.
In a meta-analysis of research papers published on the subject of implicit bias, however, Lai found that the evidence does not show this kind of causal relationship.
The research is published in the Journal of Personality and Social Psychology.
Lai worked with Patrick Forscher, of the University of Arkansas, to systematically review 492 studies that dealt with changing people’s “automatic mental processes,” the uncontrollable, unconscious mental processes that have come to be known in particular contexts as “implicit bias.”
The studies contained more than 87,000 participants.
After crunching the numbers, Lai and Forscher saw that studies suggest biases can, in fact, be changed—although not dramatically.
When they homed in, looking at 63 studies that explicitly considered a link between changes in bias and changes in actions, however, they found no evidence of a causal relationship.
“We definitely didn’t expect this,” Lai said. “And it challenges assumptions about the relationship between implicit bias and behavior.”
Lai suggested four possible reasons that a link was not established in the meta-analysis:
- Measurement errors: The way outcomes were measured may have picked up on changes unrelated to the underlying bias. For example, Lai said, such a measurement would be analogous to “moving the mercury around within a thermometer rather than changing the heat in the room.”
- Confounds: Something unrelated to the measured bias may have happened between the intervention and the behavioral test, and that confound, rather than the change in bias, altered subjects’ behavior.
- Mismatch in specificity: The bias measure and the behavioral measure appeared to assess the same associations, but one may have been too general to capture a change in the other. For example, the implicit bias measured concerned broad attitudes toward white vs. Black people, while the behavior measured concerned treatment of a specific person of a particular race. In that case, the attitude measured may simply have been too general.
- No causal relationship: Implicit bias doesn’t affect behavior at all.
This last option doesn’t sit well with Lai. “It would open a theoretical can of worms because there are decades of experiments in other lines of research showing evaluation without conscious intention or control,” he said.
However, Lai said there is a more effective way to change these behaviors, one that doesn’t rely on changing people’s implicit biases: ridding society of the features that cause people to act in a biased way.
For example, reducing subjectivity makes it more difficult for a person’s biases to affect decision-making.
Instead of relying on a “gut feeling” for a hiring decision, for example, lay out the requirements first, and stick to them.
Or, in the cookie realm, don’t have any on hand—not at home or at the office—and don’t drive past the bakery on the way home.
On an individual level, Lai said, the goal is to “equip people with strategies to resist the environment’s biasing influence.”
“The power of counterstereotypes is not to be underestimated,” Lai wrote in a paper describing possible ways to counteract implicit biases. “And if counterstereotypical encounters become typical, shifts in attitudes and beliefs will follow.”
Lai points out that this study was heavily constrained by the available literature. The studies they included focused on brief interventions and assessments and were heavily skewed toward a certain demographic: university students.
“At this time, we can’t distinguish between these potential explanations for the results,” Lai said. “Carefully controlled studies need to be conducted to rule out—or rule in—these explanations.”
Implicit Bias in Education
Research on implicit bias has identified several conditions in which individuals are most likely to rely on their unconscious System 1 associations.
These include situations that involve ambiguous or incomplete information; the presence of time constraints; and circumstances in which our cognitive control may be compromised, such as through fatigue or having a lot on our minds.7
Given that teachers encounter many, if not all, of these conditions through the course of a school day, it is unsurprising that implicit biases may be contributing to teachers’ actions and decisions.
Let’s consider a few examples in the context of school discipline.
First, classifying behavior as good or bad and then assigning a consequence is not a simple matter. All too often, behavior is in the eye of the beholder.
Many of the infractions for which students are disciplined have a subjective component, meaning that the situation is a bit ambiguous.
Thus, how an educator interprets a situation can affect whether the behavior merits discipline, and if so, to what extent.
Infractions such as “disruptive behavior,” “disrespect,” and “excessive noise,” for example, are ambiguous and dependent on context, yet they are frequently provided as reasons for student discipline.8
That is not to say that some form of discipline is unwarranted in these situations, or that all disciplinary circumstances are subjective, as certainly many have objective components. However, these subjective infractions constitute a very large portion of disciplinary incidents.
There are no standardized ways of assessing many infractions, such as disobedient or disruptive behavior, though schools do attempt to delineate some parameters through codes of conduct and by outlining associated consequences.
Yet subjectivity can still come into play.
Teachers’ experiences and automatic unconscious associations can shape their interpretation of situations that merit discipline, and can even contribute to discipline disparities based on a student’s race.
One study of discipline disparities9 found that students of color were more likely to be sent to the office and face other disciplinary measures for offenses such as disrespect or excessive noise, which are subjective, while white students were more likely to be sent to the office for objective infractions, such as smoking or vandalism.
(For more about discipline disparities, see “From Reaction to Prevention” by Russell J. Skiba and Daniel J. Losen.)
Thus, in disciplinary situations that are a bit ambiguous (What qualifies as disrespect?
How loud is too loud?), educators should be aware that their implicit associations may be contributing to their decisions without their conscious awareness or consent.
Second, implicit attitudes toward specific racial groups can unconsciously affect disciplinary decisions.
For example, extensive research has documented pervasive implicit associations that link African Americans, particularly males, to stereotypes such as aggression, criminality, or danger, even when explicit beliefs contradict these views.10
In education, these implicit associations can taint perceptions of the discipline severity required to ensure that the misbehaving student understands what he or she did wrong.
In short, these unconscious associations can mean the difference between one student receiving a warning for a confrontation and another student being sent to school security personnel.
In the words of researcher Carla R. Monroe, “Many teachers may not explicitly connect their disciplinary reactions to negative perceptions of Black males, yet systematic trends in disproportionality suggest that teachers may be implicitly guided by stereotypical perceptions that African American boys require greater control than their peers and are unlikely to respond to nonpunitive measures.”11
A recent study from Stanford University sheds further light on this dynamic by highlighting how racial disparities in discipline can occur even when black and white students behave similarly.12
In the experiment, researchers showed a racially diverse group of female K–12 teachers the school records of a fictitious middle school student who had misbehaved twice; both infractions were minor and unrelated.
Requesting that the teachers imagine working at this school, researchers asked a range of questions related to how teachers perceived and would respond to the student’s infractions.
While the student discipline scenarios were identical, researchers manipulated the fictitious student’s name; some teachers reviewed the record of a student given a stereotypically black name (e.g., Deshawn or Darnell) while others reviewed the record of a student with a stereotypically white name (e.g., Jake or Greg).
Results indicated that from the first infraction to the second, teachers were more likely to escalate the disciplinary response to the second infraction when the student was perceived to be black as opposed to white.
Moreover, a second part of the study, with a larger, more diverse sample that included both male and female teachers, found that infractions by a black student were more likely to be viewed as connected, meaning that the black student’s misbehavior was seen as more indicative of a pattern, than when the same two infractions were committed by a white student.13
Another way in which implicit bias can operate in education is through confirmation bias: the unconscious tendency to seek information that confirms our preexisting beliefs, even when evidence exists to the contrary.
The following example explored this dynamic in the context of employee performance evaluations, but relevant parallels exist for K–12 teachers evaluating their students’ work.
A 2014 study explored how confirmation bias can unconsciously taint the evaluation of work that employees produce.
Researchers created a fictitious legal memo that contained 22 different, deliberately planted errors.
These errors included minor spelling and grammatical errors, as well as factual, analytical, and technical writing errors. The exact same memo was distributed to law firm partners under the guise of a “writing analysis study,”14 and they were asked to edit and evaluate the memo.
Half of the memos listed the author as African American, while the other half listed the author as Caucasian.
Findings indicated that memo evaluations hinged on the perceived race of the author. Evaluators who believed the author was African American found more of the embedded errors and rated the memo lower in quality than did those who believed the author was Caucasian.
Researchers concluded that these findings suggest unconscious confirmation bias; despite the intention to be unbiased, “we see more errors when we expect to see errors, and we see fewer errors when we do not expect to see errors.”15
While this study focused on the evaluation of a legal memo, it is not a stretch of the imagination to consider the activation of this implicit dynamic in grading student essays or evaluating other forms of subjective student performance.
Confirmation bias represents yet another way in which implicit biases can challenge the best of explicit intentions.
Finally, implicit biases can also shape teacher expectations of student achievement. For example, a 2010 study examined teachers’ implicit and explicit ethnic biases, finding that their implicit—not explicit—biases were responsible for different expectations of achievement for students from different ethnic backgrounds.16
While these examples are a select few among many, together they provide a glimpse into how implicit biases can have detrimental effects for students, regardless of teachers’ explicit goals. This raises the question: How can we better align our implicit biases with the explicit values we uphold?
Mitigating the Influence of Implicit Bias
Recognizing that implicit biases can yield inequitable outcomes even among well-intentioned individuals, a significant portion of implicit bias research has explored how individuals can change their implicit associations – in effect “reprogramming” their mental associations so that unconscious biases better align with explicit convictions.
Thanks to the malleable nature of our brains, researchers have identified a few approaches that, often with time and repetition, can help inhibit preexisting implicit biases in favor of more egalitarian alternatives.
With implicit biases operating outside of our conscious awareness and inaccessible through introspection, at first glance it might seem difficult to identify any that we may hold. Fortunately, researchers have identified several approaches for assessing these unconscious associations, one of which is the Implicit Association Test (IAT).
Debuting in 1998, this free online test measures the relative strength of associations between pairs of concepts. Designed to tap into unconscious System 1 associations, the IAT is a response latency (i.e., reaction time) measure that assesses implicit associations through this key idea: when two concepts are highly associated, test takers will be faster at pairing those concepts (and make fewer mistakes doing so) than they will when two concepts are not as highly associated.*
To illustrate, consider this example. Most people find the task of pairing flower types (e.g., orchid, daffodil, tulip) with positive words (e.g., pleasure, happy, cheer) easier than they do pairing flower types with negative words (e.g., rotten, ugly, filth).
Because flowers typically have a positive connotation, people can quickly link flowers to positive terms and make few mistakes in doing so. In contrast, words such as types of insects (e.g., ants, cockroaches, mosquitoes) are likely to be easier for most people to pair with those negative terms than with positive ones.17
While this example is admittedly simplistic, these ideas laid the foundation for versions of the IAT that assess more complex social issues, such as race, gender, age, and sexual orientation, among others. Millions of people have taken the IAT, and extensive research has largely upheld the IAT as a valid and reliable measure of implicit associations.18
There are IATs that assess both attitudes (i.e., positive or negative emotions toward various groups) and stereotypes (i.e., how quickly someone can connect a group to relevant stereotypes about that group at an implicit level).
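The IAT’s response-latency logic can be illustrated with a toy calculation: compare how quickly a test taker pairs concepts in a “congruent” block versus an “incongruent” block, scaled by the variability of their responses. The sketch below is a simplified stand-in for IAT scoring, not the published algorithm (the actual D-score procedure of Greenwald and colleagues adds error penalties and trial exclusions), and the latencies are invented for illustration:

```python
# Simplified IAT-style scoring sketch. The reaction times below are
# hypothetical, and this omits the error penalties and trial exclusions
# used in the real scoring algorithm.
from statistics import mean, stdev

def d_score(congruent_ms, incongruent_ms):
    """Difference in mean latencies, scaled by the pooled standard deviation."""
    pooled_sd = stdev(congruent_ms + incongruent_ms)
    return (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd

# Hypothetical latencies in milliseconds: pairing flowers with pleasant
# words (congruent) is faster than pairing flowers with unpleasant words.
flower_pleasant = [620, 580, 640, 600, 610]    # congruent block
flower_unpleasant = [780, 820, 760, 800, 790]  # incongruent block

score = d_score(flower_pleasant, flower_unpleasant)
print(round(score, 2))  # a larger positive score = stronger flower–pleasant association
```

A score near zero would indicate no difference in association strength between the two pairings.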
Educators can begin to address their implicit biases by taking the Implicit Association Test. Doing so will enable them to become consciously aware of some of the unconscious associations they may harbor.
Research suggests that this conscious awareness of one’s own implicit biases is a critical first step for counteracting their influence.19 This awareness is especially crucial for educators to help ensure that their explicit intentions to help students learn and reach their full potential are not unintentionally thwarted by implicit biases.
By identifying any discrepancies that may exist between conscious ideals and automatic implicit associations, individuals can take steps to bring those two into better alignment. One approach for changing implicit associations identified by researchers is intergroup contact: meaningfully engaging with individuals whose identities (e.g., race, ethnicity, religion) differ from your own.
Certain conditions exist for optimal effects, such as equal status within the situation, a cooperative setting, and working toward common goals.20 By getting to know people who differ from you on a real, personal level, you can begin to build new associations about the groups those individuals represent and break down existing implicit associations.21
Another approach that research has determined may help change implicit associations is exposure to counter-stereotypical exemplars: individuals who contradict widely held stereotypes. Some studies have shown that exposure to these exemplars may help individuals begin to automatically override their preexisting biases.22
Examples of counter-stereotypical exemplars may include male nurses, female scientists, African American judges, and others who defy stereotypes.
This approach for challenging biases is valuable not just for educators but also for the students they teach, as some scholars suggest that photographs and décor that expose individuals to counter-stereotypical exemplars can activate new mental associations.23
While implicit associations may not change immediately, using counter-stereotypical images for classroom posters and other visuals may serve this purpose.
Beyond changing cognitive associations, another strategy for mitigating implicit biases that relates directly to school discipline is data collection.
Because implicit biases function outside of conscious awareness, identifying their influence can be challenging. Gathering meaningful data can bring to light trends and patterns in disparate treatment of individuals and throughout an institution that may otherwise go unnoticed.
In the context of school discipline, relevant data may include the student’s grade, the perceived infraction, the time of day it occurred, the name(s) of referring staff, and other relevant details and objective information related to the resulting disciplinary consequence.
Information like this can facilitate a large-scale review of discipline measures and patterns and whether any connections to implicit biases may emerge.24 Moreover, tracking discipline data over time and keeping implicit bias in mind can help create a school- or districtwide culture of accountability.
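As a minimal illustration of this kind of data collection, the sketch below tallies hypothetical referral records and computes what share of them involve subjective infractions; the field names and records are invented for illustration, not drawn from any real school’s data system:

```python
# Hypothetical discipline-referral records; the schema is illustrative only.
from collections import Counter

referrals = [
    {"grade": 7, "infraction": "disrespect", "time": "13:45", "staff": "Ms. A"},
    {"grade": 8, "infraction": "vandalism", "time": "10:10", "staff": "Mr. B"},
    {"grade": 7, "infraction": "excessive noise", "time": "09:30", "staff": "Ms. A"},
]

# Infractions the article flags as ambiguous and context-dependent.
SUBJECTIVE = {"disrespect", "disruptive behavior", "excessive noise"}

counts = Counter(r["infraction"] for r in referrals)
subjective_share = sum(c for inf, c in counts.items() if inf in SUBJECTIVE) / len(referrals)
print(counts, subjective_share)
```

Aggregations like this, broken down further by referring staff member or student demographics, are what make otherwise invisible patterns reviewable.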
Finally, in the classroom, educators taking enough time to carefully process a situation before making a decision can minimize implicit bias. Doing so, of course, is easier said than done, given that educators are constantly pressed for time, face myriad challenges, and need crucial support from administrators to effectively manage student behavior.
As noted earlier, System 1 unconscious associations operate extremely quickly.
As a result, in circumstances where individuals face time constraints or have a lot on their minds, their brains tend to rely on those fast and automatic implicit associations.
Research suggests that reducing cognitive load and allowing more time to process information can lead to less biased decision making.25
In terms of school discipline, this can mean allowing educators time to reflect on the disciplinary situation at hand rather than make a hasty decision.26
While implicit biases can affect any moment of decision making, these unconscious associations should not be regarded as character flaws or other indicators of whether someone is a “good person” or not. The ability to use our System 1 cognition to make effortless, lightning-fast associations, such as knowing that a green traffic light means go, is crucial to everyday functioning.
Rather, when we identify and reflect on the implicit biases we hold, we recognize that our life experiences may unconsciously shape our perceptions of others in ways that we may or may not consciously desire, and if the latter, we can take action to mitigate the influence of those associations.
In light of the compelling body of implicit bias scholarship, teachers, administrators, and even policymakers are increasingly considering the role of unconscious bias in disciplinary situations. For example, the federal school discipline guidance jointly released by the U.S. departments of Education and Justice in January 2014 not only mentions implicit bias as a factor that may affect the administration of school discipline, it also encourages school personnel to receive implicit bias training.
(For more information on that guidance, see “School Discipline and Federal Guidance.”) Speaking not only to the importance of identifying implicit bias but also to mitigating its effects, the federal guidance asserts that this training can “enhance staff awareness of their implicit or unconscious biases and the harms associated with using or failing to counter racial and ethnic stereotypes.”27
Of course, teachers who voluntarily choose to pursue this training and explore this issue on their own can also generate interest among their colleagues, leading to more conversations and awareness.
Accumulated research evidence indicates that implicit bias powerfully explains the persistence of many societal inequities, not just in education but also in other domains, such as criminal justice, healthcare, and employment.28
While the notion of being biased is one that few individuals are eager to embrace, extensive social science and neuroscience research has connected individuals’ System 1 unconscious associations to disparate outcomes, even among individuals who staunchly profess egalitarian intentions.
In education, the real-life implications of implicit biases can create invisible barriers to opportunity and achievement for some students – a stark contrast to the values and intentions of educators and administrators who dedicate their professional lives to their students’ success.
Thus, it is critical for educators to identify any discrepancies that may exist between their conscious ideals and unconscious associations so that they can mitigate the effects of those implicit biases, thereby improving student outcomes and allowing students to reach their full potential.
7. Marianne Bertrand, Dolly Chugh, and Sendhil Mullainathan, “Implicit Discrimination,” American Economic Review 95, no. 2 (2005): 94–98.
8. See, for example, Cheryl Staats and Danya Contractor, Race and Discipline in Ohio Schools: What the Data Say (Columbus, OH: Kirwan Institute for the Study of Race and Ethnicity, 2014).
9. Russell J. Skiba, Robert S. Michael, Abra Carroll Nardo, and Reece L. Paterson, “The Color of Discipline: Sources of Racial and Gender Disproportionality in School Punishment,” Urban Review 34 (2002): 317–342.
10. Jennifer L. Eberhardt, Phillip Atiba Goff, Valerie J. Purdie, and Paul G. Davies, “Seeing Black: Race, Crime, and Visual Processing,” Journal of Personality and Social Psychology 87 (2004): 876–893.
11. Carla R. Monroe, “Why Are ‘Bad Boys’ Always Black? Causes of Disproportionality in School Discipline and Recommendations for Change,” The Clearing House: A Journal of Educational Strategies, Issues and Ideas 79 (2005): 46.
12. Jason A. Okonofua and Jennifer L. Eberhardt, “Two Strikes: Race and the Disciplining of Young Students,” Psychological Science 26 (2015): 617–624.
13. Okonofua and Eberhardt, “Two Strikes.”
14. Arin N. Reeves, Written in Black & White: Exploring Confirmation Bias in Racialized Perceptions of Writing Skills (Chicago: Nextions, 2014).
15. Reeves, Written in Black & White, 6.
16. Linda van den Bergh, Eddie Denessen, Lisette Hornstra, Marinus Voeten, and Rob W. Holland, “The Implicit Prejudiced Attitudes of Teachers: Relations to Teacher Expectations and the Ethnic Achievement Gap,” American Educational Research Journal 47 (2010): 497–527.
17. This example is from Anthony G. Greenwald, Debbie E. McGhee, and Jordan L. K. Schwartz, “Measuring Individual Differences in Implicit Cognition: The Implicit Association Test,” Journal of Personality and Social Psychology 74 (1998): 1464–1480.
18. Brian A. Nosek, Anthony G. Greenwald, and Mahzarin R. Banaji, “The Implicit Association Test at Age 7: A Methodological and Conceptual Review,” in Social Psychology and the Unconscious: The Automaticity of Higher Mental Processes, ed. John A. Bargh (New York: Psychology Press, 2007), 265–292.
19. Patricia G. Devine, Patrick S. Forscher, Anthony J. Austin, and William T. L. Cox, “Long-Term Reduction in Implicit Bias: A Prejudice Habit-Breaking Intervention,” Journal of Experimental Social Psychology 48 (2012): 1267–1278; and John F. Dovidio, Kerry Kawakami, Craig Johnson, Brenda Johnson, and Adaiah Howard, “On the Nature of Prejudice: Automatic and Controlled Processes,” Journal of Experimental Social Psychology 33 (1997): 510–540.
20. Gordon W. Allport, The Nature of Prejudice (Cambridge, MA: Addison-Wesley, 1954). Allport also recognizes a fourth condition for optimal intergroup contact, which is authority sanctioning the contact.
21. Thomas F. Pettigrew and Linda R. Tropp, “A Meta-Analytic Test of Intergroup Contact Theory,” Journal of Personality and Social Psychology 90 (2006): 751–783.
22. Nilanjana Dasgupta and Anthony G. Greenwald, “On the Malleability of Automatic Attitudes: Combating Automatic Prejudice with Images of Admired and Disliked Individuals,” Journal of Personality and Social Psychology 81 (2001): 800–814; and Nilanjana Dasgupta and Shaki Asgari, “Seeing Is Believing: Exposure to Counterstereotypic Women Leaders and Its Effect on the Malleability of Automatic Gender Stereotyping,” Journal of Experimental Social Psychology 40 (2004): 642–658.
23. Jerry Kang, Mark Bennett, Devon Carbado, et al., “Implicit Bias in the Courtroom,” UCLA Law Review 59 (2012): 1124–1186.
24. Kent McIntosh, Erik J. Girvan, Robert H. Horner, and Keith Smolkowski, “Education Not Incarceration: A Conceptual Model for Reducing Racial and Ethnic Disproportionality in School Discipline,” Journal of Applied Research on Children: Informing Policy for Children at Risk 5, no. 2 (2014): art. 4.
25. Diana J. Burgess, “Are Providers More Likely to Contribute to Healthcare Disparities under High Levels of Cognitive Load? How Features of the Healthcare Setting May Lead to Biases in Medical Decision Making,” Medical Decision Making 30 (2010): 246–257.
26. Prudence Carter, Russell Skiba, Mariella Arredondo, and Mica Pollock, You Can’t Fix What You Don’t Look At: Acknowledging Race in Addressing Racial Discipline Disparities, Disciplinary Disparities Briefing Paper Series (Bloomington, IN: Equity Project at Indiana University, 2014).
27. U.S. Department of Education, Guiding Principles: A Resource Guide for Improving School Climate and Discipline (Washington, DC: Department of Education, 2014), 17.
28. For more on implicit bias and its effects in various professions, see the Kirwan Institute’s annual State of the Science: Implicit Bias Review publication.
More information: Forscher, P. S., et al. (2019). A meta-analysis of procedures to change implicit measures. Journal of Personality and Social Psychology. Advance online publication. dx.doi.org/10.1037/pspa0000160
Lai, Calvin K., and Mahzarin R. Banaji. 2019. “The Psychology of Implicit Intergroup Bias and the Prospect of Change.” PsyArXiv. April 16. DOI: 10.31234/osf.io/bv2pq
Provided by Washington University in St. Louis