This collaborative project will study how signs convey meaning in American Sign Language (ASL) by analyzing the semantic organization of the ASL lexicon. Understanding the structure and organization of the human lexicon is critical to both linguistic and psycholinguistic theories of language. However, current theories are predominantly built upon evidence from spoken languages and may underrepresent characteristics that are particularly prominent in sign languages. For example, a core assumption regarding the organization of the lexicon is that there is a sharp separation between semantic structure and phonological (form) structure: the way words are pronounced is generally thought to be unrelated to what they mean. However, mounting evidence suggests that iconicity, the resemblance between a word's or sign's form and its meaning, is pervasive in both signed and spoken languages. Examples of iconicity in English are words like "ping" and "sizzle" that sound like what they mean; examples in ASL are signs like DRINK and HAMMER that look like what they mean. Although this evidence suggests that semantic and phonological structure may not be fully independent of each other in ASL, we know relatively little about how they relate to one another and whether or how iconicity shapes the lexicon.

This project represents the first comprehensive quantitative analysis of the semantic organization of the ASL lexicon. It will collect valuable information about the semantic similarity of ASL signs and the size of semantic neighborhoods, which will be key to uncovering how knowledge about sign meaning is stored and organized, as well as how this structure is acquired. Specifically, the project aims to (1) conduct a lexicon-wide evaluation of the semantic associations between signs, (2) characterize iconic and non-iconic systematic relationships between form and meaning using visualization techniques inspired by network science, and (3) implement a novel approach to quantifying iconicity in a subset of the lexicon, in an effort to understand which semantic features participate in iconic mappings and how iconicity might shape semantic processing. The data collected under this project will be integrated into a large, publicly available interactive lexical database of semantic, phonological, and iconic structure (ASL-LEX: http://asl-lex.org/). These materials constitute essential tools that will allow scientists and educators to create well-controlled stimuli for use in research and the classroom. Finally, it is important to recognize that deaf people often have difficulty pursuing research careers because of communication roadblocks that hamper interaction with hearing scientists. The researchers on this project run "deaf-friendly" labs (e.g., project staff are fluent in ASL) and provide training that facilitates the entry of deaf students into scientific and academic fields. Thus, a parallel aim of the project is to increase the representation of deaf people in science by including deaf researchers on the project and by providing an accessible environment in which deaf students can gain training and research experience.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Behavioral and Cognitive Sciences (BCS)
Type: Standard Grant
Application #: 1918252
Program Officer: Tyler Kendall
Budget Start: 2019-09-01
Budget End: 2023-08-31
Fiscal Year: 2019
Total Cost: $317,838
Name: Boston University
City: Boston
State: MA
Country: United States
Zip Code: 02215