3D image segmentation is an important and ubiquitous task in image-oriented scientific disciplines, particularly biomedicine, where images provide the basis for biological discovery. While imaging techniques reveal spatial content and activities within an entire subject, ultimately biologists are interested in specific anatomical structures (e.g., organs, tissues, and cells). Delineation of the structures of interest within a given set of images is therefore a typical first step in the data-to-knowledge pipeline, with both the efficiency and accuracy of segmentation critically affecting how the data is utilized in research and clinical practice. Creating accurate segmentations, particularly for 3D biomedical images, is a non-trivial task that calls for cooperation between humans and computers. While human experts, with their superior visual perception skills and the vast knowledge and experience acquired from years of training, ultimately decide what constitutes an accurate segmentation, they lack the objectivity and efficiency of computational algorithms. On the other hand, without expert guidance, segmentation algorithms easily fail in the presence of the noise and ambiguity that are inevitable in biomedical images.

In this research the PIs will investigate 3D image segmentation as a human-computer interaction paradigm to better understand the human factors involved in the current segmentation process, with the goal of making the process more efficient, accurate, and repeatable. The team's hypothesis is that the segmentation process could be significantly improved through a deeper understanding of how people perform low-level perception and cognition tasks in the context of 3D segmentation (e.g., visual cues, delineation of structures by marks, and local accuracy or quality criteria), and how domain experts wish to specify high-level segmentation constraints (e.g., connectivity, topology, and shape).
To test this hypothesis the PIs will analyze the segmentation processes of domain experts who span a reasonable subspace of the actual segmenters and segmentation tasks in biology and clinical practice, in order to define a conceptual framework that captures the low-level perceptual and cognitive elements of segmentation as well as the higher-level information related to navigation, marking, and inspection. Building upon and instantiating the framework, the team will work with experts to develop a prototype segmentation tool that explores novel interaction and visualization paradigms as well as their supporting algorithms. The prototype tool will be used both to verify the conceptual framework and to create a more effective practical solution to segmentation.

Broader Impacts: By formulating and studying segmentation as a human perceptual and cognitive task, this work represents a major departure from existing research on either segmentation algorithms or tools. The resulting conceptual framework will serve as a bridge between the two communities, leading both to better designs for current and future segmentation tools and to the framing of new problems for segmentation algorithms. For end users, the working prototype will support a more effective segmentation experience that is powered by the underlying conceptual framework. Furthermore, formalizing the kinds of perceptual cues and conceptual models users have when approaching the segmentation problem will serve as a useful test case for understanding the more general question of how perception and cognition interact when they are re-mapped to solve a problem they were never designed for. To disseminate the findings of this research, the PIs will release their working prototype as an open-source project, which can then serve as a shared communication platform between algorithm developers, tool developers, and end users.

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Type: Standard Grant (Standard)
Application #: 1302248
Program Officer: Ephraim Glinert
Project Start:
Project End:
Budget Start: 2013-06-15
Budget End: 2018-05-31
Support Year:
Fiscal Year: 2013
Total Cost: $296,270
Indirect Cost:
Name: University of North Texas
Department:
Type:
DUNS #:
City: Denton
State: TX
Country: United States
Zip Code: 76203