Adults are able to identify anger, grief, jealousy, fear, joy, and other emotions when they witness them. Their knowledge about each emotion they identify then guides their thoughts and actions. This research project explores how children develop the ability to identify emotions and obtain knowledge about them. The project's central hypothesis is that concepts for anger, grief, jealousy, fear, etc. can be thought of as scripts. For example, the (adult) script for fear can be considered a sequence of elements: a danger is recognized (cause), the face and voice broadcast fear (expression), the body reacts (physiology), the person feels bad (subjective feeling) and flees (behavioral consequence). The child must build this script element by element and learn a label for it. Thus, at different ages, children have different scripts for the same emotion. For many years, the common assumption has been that children are born with, or quickly develop, concepts for at least the basic emotions tied to facial expressions. Recent research, however, has challenged this assumption. For example, preschool children interpret the facial expression of disgust as anger rather than as disgust -- even though they know the word disgust. Asked to find the angry person in a crowd, toddlers select all persons showing any negative facial expression. It is not known how emotion scripts begin, when elements enter the script, or what causes each element to enter the script. This project will study the earliest emotion categories, investigating the idea that infants and toddlers may simply divide all emotions into positive and negative. The project will then study what drives the child to divide "negative emotion" into more specific ones. A likely candidate is the difference between angry and sad facial expressions, but subsequent divisions may be based on differences in cause or behavioral consequence.
For example, the distinction between anger and fear may be based on whether the person acts in a hostile way or flees. The project will also ask how children deal with a common situation: different cues to the other's emotion conflict (e.g., someone cries, but says, "I'm fine").

Emotional, social, and academic problems throughout childhood are known to be associated with deficits in understanding emotions (in "emotional intelligence"). Children, it seems, need to understand the emotions they witness and experience. Yet, the early development of this understanding is a curiously neglected area of study, making prevention and intervention difficult. Parents and teachers need basic information on what children mean by the emotion words they use and how they interpret the signs of emotion they witness. Through the Emotion Development Lab at Boston College, diverse undergraduate, graduate, and post-doctoral students will work with Dr. James Russell to investigate these questions of emotion understanding in young children. Empirically supported information about what children know and when they know it will inform parents and teachers how to guide children as they come to understand the emotions of those around them.

Project Report

The research supported by this grant provided a data-driven description of the growth of children’s system of mental categories for emotion and the events that access those categories. Understanding of emotion has been implicated in preschoolers’ cognitive and linguistic development, their health, and their later school readiness. Acquisition of emotion understanding is part of the development of emotional intelligence. The model that we have been developing, and that was tested and extended in the current research, proposes that children initially understand emotion in terms of broad valence-based categories (feels good vs. feels bad). With age and experience, children gradually differentiate these broad categories into narrower categories that more closely resemble adult categories. Our findings indicate that children’s emotion categories are shaped more by the social environment than by innate categories based on facial expressions. Although not directly tested by the supported studies, the results supported the idea that adult emotion categories are not universal; rather, they vary by individual, group, and cultural differences. This research complemented our larger project on emotion concepts in children and adults and in people from different cultures. It also supported the theory that the experience of emotion is psychologically constructed rather than fixed by nature.

The goal of the studies in the four series in this grant was to identify the "seeds" of children’s emotion concepts and then to investigate how their emotion scripts develop. Each series investigated children’s emotion concepts from a different angle. In the first series of studies, we investigated children’s initial, broad, dimensional emotion concepts. The goal was to identify the scales for pleasure and arousal that young children could use most easily.
Two-year-olds can use dichotomous scales of valence (feels good, feels bad) more reliably than a 5-point scale to respond to emotion labels and facial expressions. Three-year-olds, but not 2-year-olds, can also use a dichotomous arousal scale (active, relaxed) to respond to emotion labels and facial expressions. By 4 years of age, children can use the 5-point scale to respond to simple drawings of facial expressions, photographs of facial expressions, and emotion labels.

We then moved from children’s use of broad valence-based categories to an investigation of the "seeds" of discrete emotion categories. The goal was to identify the cue that enables children to differentiate specific categories from the initial broad negative category (feels bad). Children have different seeds for sadness than for anger: for sadness, the facial expression is the first cue that helps children differentiate sadness from other negative emotions; for anger, it is less clear but may be the behavioral consequence.

In a third set of studies, we investigated which cues to emotion (e.g., face vs. story) children attended to when the cues were mismatched (e.g., scared face, sad story). We predicted that, when presented with mismatched pairs, children (3-7 years) would label the stories when both cues were negative but the face when one cue was positive and the other negative. For mismatched faces and stories, the prediction was supported for same-valence pairs (e.g., for a scared face and sad story, children labeled the pair sad). This finding supports prior evidence that the interpretation of the face is malleable when other emotion cues are of the same valence. When the pair was opposite-valence (e.g., scared face and happy story), the prediction was not upheld. Instead, regardless of whether the face or story was happy, children labeled the pair as happy. This finding suggests a happiness dominance effect: children focus on the happy cue.
In the final set of studies, we tested the common assumption that children would be more likely to understand dynamic emotional expressions (e.g., videos of actors expressing an emotion in real time) compared to traditional still photographs of facial expressions. On the one hand, dynamic expressions may be more similar to children’s everyday experience, increasing the likelihood that they will recognize them. On the other hand, still photographs of emotional expressions capture the peak moment of the expression and give children ample opportunity to look at and interpret the expression. Counter to both of these views, we found no significant differences overall in children’s use of the expected labels for still photographs vs. dynamic videos. In addition, when dynamic cues were compared to brief stories describing the causes and consequences of the corresponding emotions, children were more likely to use the expected label for the story for fear and disgust (supporting prior research comparing still photographs and stories; Widen & Russell, 2010a, 2010b).

Additional publications: Widen, S. C., & Russell, J. A. (in press). Do dynamic facial expressions convey emotions to children better than do static ones? Journal of Cognition and Development.

Agency: National Science Foundation (NSF)
Institute: Division of Behavioral and Cognitive Sciences (BCS)
Application #: 1025563
Program Officer: Laura Namy
Project Start:
Project End:
Budget Start: 2010-09-15
Budget End: 2014-08-31
Support Year:
Fiscal Year: 2010
Total Cost: $300,000
Indirect Cost:
Name: Boston College
Department:
Type:
DUNS #:
City: Chestnut Hill
State: MA
Country: United States
Zip Code: 02467