The World Wide Web and other networked information systems provide enormous benefits by enabling access to unprecedented amounts of information. These systems, however, also create significant problems. Sensitive personal data are disclosed, confidential corporate data are stolen, copyrights are infringed, and databases owned by one government organization are accessed by members of another in violation of government policy. Such incidents continue to grow more frequent, and an incident must now be truly outrageous to be considered newsworthy. This project takes the view that, when security violations occur, it should be possible to punish the violators in some fashion.

Although "accountability" is widely agreed to be important and desirable, there has been little theoretical work on the subject; indeed, there does not even seem to be a standard definition of "accountability," and researchers in different areas use it to mean different things. This project addresses these issues, the relationship between accountability and other goals (such as user privacy), and the requirements (such as identifiability of violators and violations) for accountability in real-world systems. This clarification of the important notion of accountability will help propel a next generation of network-mediated interaction and services that users understand and trust.

The project's technical approach to accountability as an essential component of trustworthiness involves two intertwined research thrusts. The first thrust focuses on definitions and foundational theory. Intuitively, accountability is present in any system in which actions are governed by well-defined rules, and violations of those rules are punished. Project goals are to identify ambiguities and gaps in this intuitive notion, provide formal definitions that capture important accountability desiderata, and explicate relationships of accountability to well-studied notions such as identifiability, authentication, authorization, privacy, and anonymity. The second thrust focuses on analysis, design, and abstraction. The project studies fundamental accountability and identifiability requirements in real-world systems, both technological and social. One project goal is to use the resulting understanding of the extent to which accountability truly conflicts with privacy and other desirable system properties to design new protocols with provable accountability guarantees. Building on that understanding and on insights gained in designing protocols, the project also addresses fundamental trade-offs and impossibility results about accountability and identifiability in various settings. The broader impacts of the work include not only engagement with students but also a new perspective on real-world accountability in trustworthy systems.

Project Report

This project took the view that, when security violations occur, the essential quality of "accountability" is that the violators are punished. Although accountability is widely agreed to be important and desirable, there has been little theoretical work on the subject; indeed, there does not even seem to be a standard definition of accountability, and researchers in different disciplines use it to mean different things. The project addressed the relationship between accountability and other goals (such as user privacy), the requirements (such as identifiability of violators and violations) for accountability in real-world systems, and the multidisciplinary connections needed to develop and apply these concepts in the real world. This clarification of the important notion of accountability can help propel a next generation of network-mediated interaction and services that users understand and trust.

The most significant project outcome is a definitional framework for accountability, together with its application to the study of accountability in "open" and "closed" systems. This work shifts the focus from evidence and verdicts (which typically involve identification and identity) to punishment.
The definitional framework includes a deterrence-based definition of accountability, formalized in utility-theoretic terms, that allows for exploration of the relationship between punishment and the strength of the connection between system users and the "nyms" by which they may be known in the system. We demonstrate the usefulness of the definitional framework by applying it to the study of "open" and "closed" systems for accountability -- that is, systems that differ in the extent to which they require a user to be bound to a nym in order to participate in the system. This represents a step toward the important goal of formally reasoning about the relationship between accountability and identity.

Additional project results included a study of approximate privacy, an exploration of so-called side channels in Facebook, and a game-theoretic consideration of when it can be useful and appropriate to collaborate with opponents while still working to defeat them in the long run.

Because accountability is a multidisciplinary area with both technical and policy aspects, a major part of the project involved organizing, participating in, and giving presentations at events that bring together specialists from a variety of relevant disciplines. For example, the project team gave a number of presentations to audiences including government organizations, representatives of the financial services industry, computer scientists and social scientists, and high school students. We also co-organized the DIMACS/BIC/A4Cloud/CSA International Workshop on Trustworthiness, Accountability and Forensics in the Cloud (TAFC) and the Second International Workshop on Accountability: Science, Technology, and Policy (ASTP).
TAFC brought together experts from computer science and other disciplines to discuss how the public and private sectors, as well as the research community, can increase confidence in the use of cloud computing by citizens and businesses to deploy and use innovative services. ASTP brought together experts from multiple academic disciplines, government, and industry to discuss accountability models and analysis; law, public policy, government, and industry perspectives; approaches to conceptualizing accountability; systems aspects of accountability; and aspects of health-care policy and systems.
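To make the deterrence-based, utility-theoretic definition described in the report more concrete, the following is a purely illustrative sketch, not the project's actual formalism. It assumes a toy model in which a rational violator weighs the gain from a violation against the expected punishment, where punishment requires both detecting the violation and linking the responsible nym to a punishable user; all names and parameters (`deters`, `p_link`, `p_detect`, and the numeric values) are hypothetical.

```python
def deters(gain: float, punishment: float, p_link: float, p_detect: float) -> bool:
    """Illustrative deterrence condition (not the project's formalism).

    A violation is deterred when the violator's expected utility is
    negative: the gain from violating is outweighed by the expected
    punishment, which is discounted by the probability of detecting
    the violation and the probability of linking the nym that acted
    to a user who can actually be punished.
    """
    expected_punishment = p_detect * p_link * punishment
    return gain - expected_punishment < 0

# In a hypothetical "closed" system, nyms are strongly bound to users
# (high p_link), so a given punishment deters:
assert deters(gain=10, punishment=100, p_link=0.9, p_detect=0.5)

# In a hypothetical "open" system with weak nym-user binding (low p_link),
# the same punishment fails to deter:
assert not deters(gain=10, punishment=100, p_link=0.05, p_detect=0.5)
```

The sketch is meant only to show why, in such a model, accountability can trade off against anonymity: weakening the user-to-nym binding lowers the expected punishment, so a system must either punish more severely or tolerate more violations.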

Agency: National Science Foundation (NSF)
Institute: Division of Computer and Network Systems (CNS)
Type: Standard Grant (Standard)
Application #: 1018557
Program Officer: Sylvia Spengler
Project Start:
Project End:
Budget Start: 2010-08-01
Budget End: 2014-07-31
Support Year:
Fiscal Year: 2010
Total Cost: $249,991
Indirect Cost:
Name: Rutgers University
Department:
Type:
DUNS #:
City: Piscataway
State: NJ
Country: United States
Zip Code: 08854