This supplement will address the unintended consequences and collateral damage that can arise when facial recognition software is used for medical purposes, such as the syndrome diagnosis performed in our Facebase-funded project.
In Aim 1, we will determine whether the accuracy of this technology varies with self-reported race, sex, and age. We will examine our existing database for evidence of bias along these variables and quantify the extent to which they influence classification performance. To the extent sample sizes allow, we will carry this analysis down to the level of specific syndromes. Finally, we will use anonymized reference datasets of non-syndromic faces to compare false-positive rates across NIH race definitions, sex, and age. The outcome of this aim is an objective assessment of bias and an estimate of the effects of under-representation across race, age, and sex categories within our data.
In Aim 2, we will determine how reports of race-, sex-, and age-based bias in facial recognition technology may influence researchers' and clinicians' views of the technology and its application.
This aim will establish the extent to which storing large databases of facial images, and applying machine learning to them for diagnostic purposes, may raise privacy concerns. The concerns investigated will include potential breaches of protected health information; concern about bias in some facial recognition software (and, potentially, in the Facebase database); and fear of discriminatory application of the technology, for example by insurers. The outcome will be a white paper, targeted at a high-profile journal, that summarizes the findings and defines the crucial issues that should guide the development of facial imaging for disease diagnosis and clinical use.

Public Health Relevance

Our supplement application will address the important question of unintended consequences and collateral damage when facial recognition software is used for medical purposes, such as the syndrome diagnosis performed in our Facebase-funded project. The use of large facial recognition databases in medicine represents a frontier that arrives with tremendous potential but undeniable risks. Our central aims are: (1) to determine whether the accuracy of this technology varies with self-reported race, sex, and age; and (2) to determine how reports of race-, sex-, and age-based bias in facial recognition technology may influence researchers' and clinicians' views of the technology and its application.

Agency: National Institutes of Health (NIH)
Institute: National Institute of Dental & Craniofacial Research (NIDCR)
Type: Research Project--Cooperative Agreements (U01)
Project #: 3U01DE028729-02S1
Application #: 10132648
Study Section: Special Emphasis Panel (ZDE1)
Program Officer: Wang, Lu
Project Start: 2019-08-01
Project End: 2024-07-31
Budget Start: 2020-08-03
Budget End: 2021-07-31
Support Year: 2
Fiscal Year: 2020
Total Cost:
Indirect Cost:
Name: University of Southern California
Department: Biostatistics & Other Math Sci
Type: Biomed Engr/Col Engr/Engr Sta
DUNS #: 072933393
City: Los Angeles
State: CA
Country: United States
Zip Code: 90089