Everyday social activities, such as toy play with parents, provide the context in which learning unfolds in real time. A well-coordinated child-caregiver interaction is likely to support learning, whereas a decoupled or poorly coordinated interaction may disrupt learning and development. Both parent and child play an active role in early communication and word learning: children signal their communicative choices and determine which environmental information is most relevant to their own developmental needs, while parents respond sensitively to those signals and provide relevant information that eases the challenge of matching linguistic symbols to their referents. The goal of the proposed research is to achieve a deeper understanding of the sensorimotor basis of early social coordination and its potentially critical role in later language learning and other developmental milestones. Toward this goal, the proposed research has three key components: 1) a set of longitudinal and cross-sectional experiments will collect multiple streams of sensorimotor data from child-parent toy play to discover fine-grained patterns characteristic of early developmental changes in child-parent social interaction, providing new evidence on the developmental origins of these skills; 2) we will link sensorimotor dynamics in child-parent interaction with standardized, highly reliable behavioral measures that have been widely used, with the goal of understanding how children's moment-to-moment interactions with social partners may build generalizable word-learning skills; 3) we will link social coordination in toy play with parental responsiveness and individual differences in developmental milestones, providing deeper insight into the consequential, longer-term role of early parent-child interactions in the developmental process.

Public Health Relevance

Early social coordination in child-parent interaction plays an important role in child development by establishing a social-interactive foundation for smooth child-parent interaction through which children learn about the world. The proposed research therefore has significant relevance for the early detection of social communication problems based on quantitative behavioral patterns, for understanding individual differences in social interaction and word learning, for studying their cascading consequences in social, cognitive, and linguistic development, and for developing new therapies and interventions that generalize to the complex natural environment.

Agency
National Institutes of Health (NIH)
Institute
Eunice Kennedy Shriver National Institute of Child Health & Human Development (NICHD)
Type
Research Project (R01)
Project #
5R01HD074601-02
Application #
8696876
Study Section
Language and Communication Study Section (LCOM)
Program Officer
Griffin, James
Project Start
2013-07-10
Project End
2018-05-31
Budget Start
2014-06-01
Budget End
2015-05-31
Support Year
2
Fiscal Year
2014
Total Cost
Indirect Cost
Name
Indiana University Bloomington
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
City
Bloomington
State
IN
Country
United States
Zip Code
47401
Clerkin, Elizabeth M; Hart, Elizabeth; Rehg, James M et al. (2017) Real-world visual statistics and infants' first-learned object names. Philos Trans R Soc Lond B Biol Sci 372:
Chen, Chi-Hsin; Zhang, Yayun; Yu, Chen (2017) Learning Object Names at Different Hierarchical Levels Using Cross-Situational Statistics. Cogn Sci :
Yu, Chen; Smith, Linda B (2017) Multiple Sensory-Motor Pathways Lead to Coordinated Visual Attention. Cogn Sci 41 Suppl 1:5-31
Chen, Chi-Hsin; Yu, Chen (2017) Grounding statistical learning in context: The effects of learning and retrieval contexts on cross-situational word learning. Psychon Bull Rev 24:920-926
Yu, Chen; Smith, Linda B (2017) Hand-Eye Coordination Predicts Joint Attention. Child Dev 88:2060-2078
Yu, Chen; Smith, Linda B (2016) The Social Origins of Sustained Attention in One-Year-Old Human Infants. Curr Biol 26:1235-40
Xu, Tian Linger; Zhang, Hui; Yu, Chen (2016) See You See Me: the Role of Eye Contact in Multimodal Human-Robot Interaction. ACM Trans Interact Intell Syst 6:
Bambach, Sven; Lee, Stefan; Crandall, David J et al. (2015) Lending A Hand: Detecting Hands and Recognizing Activities in Complex Egocentric Interactions. Proc IEEE Int Conf Comput Vis 2015:1949-1957
Chen, Chi-Hsin; Yu, Chen (2015) The Effects of Learning and Retrieval Contexts on Cross-situational Word Learning. IEEE Int Conf Dev Learn Epigenetic Robot 2015:202-207
Zhang, Yayun; Yurovsky, Daniel; Yu, Chen (2015) Statistical Word Learning is a Continuous Process: Evidence from the Human Simulation Paradigm. Cogsci 2015:2793-2798

Showing the most recent 10 out of 17 publications