Differential privacy is an elegant formulation of conditions an algorithm must meet in order to avoid leaking personal information contained in its inputs. It is becoming mainstream in many research communities and has been deployed in practice in the private sector and some government agencies. Accompanying this growth is an unfortunate but expected difficulty: many algorithms that are claimed to be differentially private are in fact incorrect. This phenomenon affects both newcomers and seasoned veterans of the differential privacy field because of the difficulty and subtlety of developing new differentially private algorithms.
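As a concrete illustration (not part of the proposal text), the canonical way to satisfy these conditions for a counting query is the Laplace mechanism: add noise drawn from a Laplace distribution whose scale is the query's sensitivity divided by the privacy parameter epsilon. The sketch below is a minimal, self-contained version; the function names `laplace_noise` and `private_count` are illustrative, not from the proposal.

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) by inverting the CDF of a uniform draw.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one person's
    # record changes the true count by at most 1. Laplace noise with scale
    # 1/epsilon therefore yields epsilon-differential privacy for the count.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

Smaller epsilon means more noise and stronger privacy; the subtlety the proposal alludes to arises when such mechanisms are composed, iterated, or combined with data-dependent control flow, where informal privacy arguments often go wrong.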

This proposal outlines a research plan for the development of a system called DevDP, whose purpose is to enable novice and expert users to develop prototypes and explore differentially private algorithms. In particular, the project will develop program analysis tools and theory that leverage both programming language and machine learning technology to aid the development of correct differentially private programs by automating much of the verification and reasoning about errors. As part of broader impacts, DevDP also has the potential to help educate students and less-technical members of the scientific community by providing interactive software tools. A solid understanding of differential privacy will become crucial as it makes its way into public policy.

Agency
National Science Foundation (NSF)
Institute
Division of Computer and Network Systems (CNS)
Type
Standard Grant (Standard)
Application #
1702760
Program Officer
Sol Greenspan
Budget Start
2017-07-01
Budget End
2021-06-30
Fiscal Year
2017
Total Cost
$1,200,000
Name
Pennsylvania State University
City
University Park
State
PA
Country
United States
Zip Code
16802