Language and social identity have been making headlines recently. Last month, Air Canada’s CEO Michael Rousseau faced scrutiny over not knowing French; his language deficit has helped build support for Bill 96 in Québec (which seeks to amend the Canadian Constitution to affirm Québec as a nation and French as its official language).
Meanwhile, the Indian retail chain Fabindia had to withdraw advertisements that gave its festive Diwali clothing line an Urdu name, in order to appease Hindu nationalist politicians.
Language can evoke a strong social and emotional response. But the dominant theory of language in linguistics, thanks to Noam Chomsky (and the one in which I was trained), fails to consider these aspects.
In linguistics, and in cognitive science in general, the human mind is conceived of metaphorically as a computer with different algorithms for different procedures — with no reference to emotion or social context.
A better understanding of language and its neuroscientific basis would help us handle linguistic issues throughout our lives.
My new research underlines how emotional context affects how we understand and use language at the neural level. It also identifies a piece of the human language puzzle that has, up until now, been missing.
What is human language?
The components of this puzzle are hard to define because the big picture, “language,” is difficult to specify.
When I ask students at the start of term, “What is human language, anyway?” they typically fall silent. So we start the discussion by separating out communicative systems (like those of plants and bees, which communicate but do not have language); asking whether language has to be auditory (no: think of sign language); and considering the difference between dialect and language.
We then discuss sentences like “Colourless green ideas sleep furiously” to show that human language is governed by a grammatical system — a sentence can be grammatical without meaning. Finally, another big question: Why do we have language?
Other mammals have sophisticated communicative systems (chimps, elephants, whales) but cannot generate an infinite number of sentences. For example, Koko the gorilla could not say, “Tomorrow, I might eat one or two bananas.”
Why not? Seemingly, it is due to the structure of her brain compared with ours.
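The “infinite number of sentences” point above can be made concrete with a toy recursive grammar. This is an illustrative sketch of my own invention, not an analysis of any natural language: the names, words, and rules below are all hypothetical.

```python
import random

# A toy context-free grammar (invented for illustration). The rule
# S -> "I think that" S re-introduces S, so the set of sentences the
# grammar can generate is unbounded, even though the rules are finite.
GRAMMAR = {
    "S": [["NP", "VP"], ["I think that", "S"]],   # recursion lives here
    "NP": [["the gorilla"], ["one banana"], ["two bananas"]],
    "VP": [["sleeps"], ["eats", "NP"]],
}

def generate(symbol="S"):
    """Expand a symbol by randomly choosing one of its rewrite rules;
    anything not in GRAMMAR is a terminal (a literal word string)."""
    if symbol not in GRAMMAR:
        return symbol
    expansion = random.choice(GRAMMAR[symbol])
    return " ".join(generate(s) for s in expansion)

print(generate())
```

Because the recursive rule can apply any number of times, the grammar licenses “I think that I think that the gorilla sleeps” and so on without bound: a finite rule system yields infinitely many grammatical sentences, which is the property other animals’ communication systems appear to lack.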
Neuroscientist Suzana Herculano-Houzel has pointed out that our brains are different because of the number of neurons packed into our skulls — it’s less about the size of our brains. The density of that packing, and the ensuing neuronal connections this density allows for, gives rise to our ability to acquire language from birth and use it till death.
But let’s leave aside the neuroanatomical differences between our brains and those of gorillas for others to solve. That still doesn’t help us resolve the issue of defining language and its essential components.
In contrast to my Chomskyan training, recent results from my lab show that social identity is not, in fact, a supplemental feature of language, but a feature that is part of every level of linguistic knowledge and use.
This seems highly counterintuitive, especially given that the first formal grammar, the Ashtadhyayi (circa 550 BCE) by the Sanskrit grammarian Panini, established the idea that language is a system of abstract rules, and that these grammatical rules make no reference to emotion or social context.
In contrast to this age-old idea, my recent work using EEG technology — which measures brain wave activity — has shown that the affective state of a person (how someone feels) while they read non-emotional sentences in English changes the nature of the brain response.
I was stunned by these results. What does it mean if basic sentence comprehension is tied to emotion?
Just the superficial gist
Psychologist Lisa Feldman Barrett paves the way to understanding these findings.
She assumes that the main function of the brain is to regulate our bodies as we move through life. That means that at every given moment, our brains assess our hunger, threat levels, etc. to figure out how much energy we need to get through the day. Thinking and cognitive perception are secondary products of how our brain responds predictively to our environment.
If she’s right (and I think she is), I would say that linguistic function, which must include a grammatical system, can also be understood as an “add-on” feature of the brain.
If the context of a comment requires deep attention to meaning (because the sentences are difficult), then our grammatical system can become engaged. Otherwise, it is likely that many people interpret only word meaning, getting the superficial gist of a sentence before moving on to the next.
This is comparable to psychologist Daniel Kahneman’s take on how the mind works, so perhaps it’s not surprising that these general principles also work for language.
If the grammatical system is a resource that the brain uses depending on context, then our emotions and identity can also affect how we use grammar. This is precisely what we have found.
Funding: Veena D. Dwivedi receives funding from the Canada Foundation for Innovation, the Social Sciences and Humanities Research Council and Brock University.
Common sense suggests that language has little to do with emotion. Surely, the things that people say affect our emotions, and we can describe our emotions (or the emotions we see in others) with words after the fact. However, it is typically assumed that this is the extent of the relationship between language and emotion.
Many contemporary psychological models of emotion agree with this common sense perspective. In these views, emotions are physical types that are essentially distinct from linguistic or conceptual processing (Ekman and Cordaro, 2011; Panksepp, 2011; Shariff and Tracy, 2011; Fontaine et al., 2013). Yet growing psychological research suggests that the role of language may run deeper in emotions than either laypeople or researchers previously thought.
In this paper, we introduce a psychological constructionist model of emotion that explains the mechanisms by which language plays a fundamental role in emotion. We begin our article by first providing a brief primer on the psychological constructionist approach we take in our own work called the Conceptual Act Theory (CAT; cf., Barrett, 2006b).
We outline the CAT’s predictions for the role of language in emotion and discuss early evidence that language does indeed play a role in emotion. To understand the ultimate and proximate mechanisms by which language plays a role in emotion, we next explore evidence from developmental and cognitive science, demonstrating that language helps humans acquire and then use concept knowledge to make meaning of their experiences and perceptions. We close by exploring the implications of language’s role in emotion concept acquisition and use for emotional experiences and perceptions.
Psychological Construction and the Conceptual Act Theory
The idea that language goes beyond describing emotion after the fact is consistent with psychological constructionist theories of emotion. Psychological construction is a family of theories that conceives of emotions as psychological “compounds” resulting from the combination of more basic psychological “elements” that are not themselves specific to emotions (Russell, 2003; Barrett, 2006b, 2013; Clore and Ortony, 2008, 2013; Cunningham et al., 2013; Lindquist, 2013; see Gendron and Barrett, 2009 for a historical account of psychological constructionist views).
All constructionist theories of emotion predict that psychological compounds such as anger, disgust, fear, etc. emerge when more basic psychological elements such as representations of the body, exteroceptive sensations (e.g., visual sensations; auditory sensations) and concept knowledge about emotion categories combine. Just as chemical compounds (e.g., NaCl) emerge from more basic elements and possess attributes that their constitutive elements do not—NaCl (sodium chloride, or commonly, table salt) has properties that are not reducible to either sodium, which is a member of the alkali metal family, or chlorine, which is a type of halogenic gas—psychological compounds such as emotions are more than the sum of representations of the body, exteroceptive sensations, and concept knowledge. Most psychological constructionist views agree that a person experiences an emotion when concept knowledge (e.g., knowledge about “fear”) and exteroceptive sensations (e.g., the sights and sounds of being in a dark alley) are used to make meaning of body states (e.g., a beating heart, sweaty palms, and feelings of startle) in a given instance.
A person sees someone else as emotional when concept knowledge (e.g., knowledge about “fear”) and exteroceptive sensations (e.g., the sights and sounds of riding a roller coaster) are used to make meaning of someone else’s affective bodily and facial muscle movements (e.g., a person’s wide eyes, gaping mouth, and white knuckles). Our own psychological constructionist approach, the CAT (Barrett, 2006a, 2009, 2012; Wilson-Mendenhall et al., 2011; Lindquist and Barrett, 2012; Lindquist, 2013) specifically predicts a role for language in this process, insofar as language supports the acquisition and use of concept knowledge (e.g., the concept of “fear”) that is used to make sensations meaningful as emotions.
Language and the Acquisition of Concept Knowledge in Adults
In an embodied account of concept knowledge, adults continue to update and refine categories based on on-going experiences of the perceptual world throughout their life (Schyns et al., 1998; Vigliocco et al., 2009; Barsalou, 2012). Growing evidence suggests that words play as much, if not more, of a role in adults’ acquisition of novel visual categories, even when words are redundant with other cues for learning. For instance, in one study documenting the role of language in adult category learning (Lupyan et al., 2007), participants learned to categorize novel “alien” stimuli as things to be approached or things to be avoided and received feedback on the accuracy of each response.
As participants received feedback about the accuracy of their judgment, participants in the label condition also saw a nonsense word; participants in the control condition received no word. Even though words were not necessary for the task, those participants who saw nonsense words while learning to categorize the stimuli were later better able to differentiate between members of different categories than were individuals who did not. Redundant words facilitated learning regardless of whether they were presented visually or played aurally during learning.
Despite research on the role of words in general adult concept acquisition, very little work has specifically assessed how words help adults learn novel emotion concepts. Indeed, it is hard to conduct this research because most healthy adults (who are not alexithymic) already possess substantial knowledge about the feelings, situations, behaviors, and bodily changes that accompany the emotion categories encoded by their acquired language.
However, one study addressed the role of language in the perception of emotion in a category-learning task involving novel chimpanzee affective facial actions that were unfamiliar to most participants (Fugate et al., 2010). In the first phase of the experiment, adults simply viewed pictures of unfamiliar chimpanzee facial actions (e.g., a “bared teeth” or “scream” face), or viewed the faces while learning to associate them with nonsense words. Participants were later shown two images taken from a continuous morphed array of two facial expressions (e.g., an image of a face containing a percentage of both the bared teeth expression and the scream expression) and were asked to indicate whether the two faces, drawn from random points throughout the array, were similar to one another or different.
This was a classic measure of “categorical perception” (Goldstone, 1994), the ability to perceive categories within a continuous dimension of sensory information. On some trials, participants compared faces that did not cross one of the learned category boundaries (e.g., they compared an 86% bared teeth, 14% scream expression with a 71% bared teeth, 29% scream expression), whereas on others, they compared faces that did cross a learned category boundary (e.g., compared a 43% bared teeth, 57% scream expression with a 29% bared teeth, 71% scream expression). If participants demonstrated categorical perception, they would see the first set of faces as similar but the second set of faces as different. Yet only participants who learned to associate the faces with words in the first phase of the experiment demonstrated such categorical perception. Participants who did not learn to associate faces with a label did not perceive a categorical distinction between the faces.
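The within- versus cross-boundary logic just described can be sketched in a few lines of Python. This is a minimal illustration, not the study’s analysis: the 35% boundary value is a hypothetical choice made so that the example pairs from the text behave as reported (the boundary participants actually learned is not given here).

```python
# Minimal sketch of the categorical-perception measure described above.
# Each face is summarized by its percentage of the "bared teeth" morph.
# NOTE: the 35% boundary is a hypothetical, illustrative value.
BOUNDARY = 35  # percent "bared teeth"

def category(bared_teeth_pct):
    """Assign a face to a learned category based on the boundary."""
    return "bared teeth" if bared_teeth_pct > BOUNDARY else "scream"

def judged(face_a, face_b):
    """A categorical perceiver calls two faces 'similar' when they fall on
    the same side of the learned boundary and 'different' when they cross
    it, even when the physical distances between pairs are comparable."""
    return "similar" if category(face_a) == category(face_b) else "different"

# Within-category pair from the text: 86% vs. 71% bared teeth (15 points apart)
print(judged(86, 71))   # similar
# Cross-boundary pair from the text: 43% vs. 29% bared teeth (14 points apart)
print(judged(43, 29))   # different
```

The point of the sketch is that both pairs are separated by roughly the same physical distance along the morph continuum, yet only the pair straddling the learned boundary is perceived as “different.”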
Building on these findings, a recent study from our laboratory suggests that language can even help adults acquire and assimilate new perceptual experiences into existing category knowledge about emotional facial expressions (Doyle and Lindquist, in preparation). During a learning phase, participants saw a series of non-stereotypical posed facial expressions of anger (e.g., a scowl and squinted eyes with raised eyebrows) and fear (e.g., an open mouth and wide eyes with furrowed eyebrows). In one between-subjects condition, participants learned to associate these facial expressions with emotion words (“anger” vs. “fear”).
In another, participants studied the faces and performed perceptual judgments (whether the eyes were close together vs. far apart). In a target phase, participants next studied target individuals who were depicting stereotypical facial actions for either anger or fear and were asked to categorize the facial expression as “anger” or “fear.” During a final test phase, participants were asked to identify which face the target individual had been making during the target phase (i.e., either the learned face, the target face, or a morphed combination of the two).
Consistent with the idea that language helps adults acquire and assimilate new perceptual instances into existing category knowledge, participants who had paired faces with words in the learning phase were more likely to remember seeing a target face that was similar to the learned category information. These findings suggest that language helps acquire novel category knowledge that biases memory of later novel faces.
Together, these early findings point to the idea that language continues to help adults acquire novel category knowledge across the lifespan and to update existing category knowledge. This may be how adults continue to augment their existing category knowledge about emotion, and it suggests that at any point in time, adults’ category knowledge about emotion may reflect the regularities present in the local environment (e.g., one’s cultural, social, or familial context). For example, if concept knowledge is always being updated and changed, then an adult’s knowledge about, say, anger may be shaped by the last time the person experienced an instance of anger (e.g., at a spouse).
This concept knowledge may thus feed forward to shape situated conceptualizations of future instances of body states when with a spouse, potentiating the situated conceptualization of anger over, say, anxiety or even other body states such as hunger (e.g., a person might conceptualize her unpleasant feelings around dinner time as anger toward her spouse as opposed to hunger for the impending meal).
Thus, the CAT predicts that language does more than just help acquire concept knowledge. It further predicts that language supports the accessibility and use of existing concept knowledge as humans make meaning of sensations in the body or world during the construction of emotions.
This prediction is consistent with growing evidence from cognitive science that language, once connected to certain perceptual representations that become stored as conceptual knowledge, alters on-going adult perception by selecting certain sensations for conscious awareness while suppressing other sensations from conscious awareness.
Reference link: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4396134/
Source: The Conversation