All sectors of the national critical infrastructure are expected to apply secure artificial intelligence (AI) in their daily operations and to benefit from AI-based decision-making tools and AI-human systems. These outcomes require a thorough understanding of human behavior in different environments. However, for mathematical convenience, existing cybersecurity education approaches assume that humans always make rational decisions, and important factors such as confounding variables are often ignored. In addition, existing approaches emphasize learning to conduct correlation and association analyses, while insufficient attention is paid to causation analysis. This project will develop and evaluate educational modules that prepare a new generation of engineering and computer science (CS) students to develop realistic computational models of decision-making. The proposed activities will advance cybersecurity education from association to causation analysis and contribute to the goal of achieving human-level AI.

This project will address two fundamental challenges in cybersecurity, privacy, and AI education. First, the project will investigate how engineering and CS students can be prepared to model cybersecurity and privacy behaviors computationally. Students will learn to apply advanced AI methods to develop realistic computational models of decision-making that address both affective and cognitive processes. Second, the project will seek to understand how causal, rather than merely correlational, models in cybersecurity and privacy can be developed. The project team will provide opportunities for students to build causal networks rather than traditional correlation networks. Course modules based on this research will be implemented and evaluated in existing advanced undergraduate and graduate courses at the Georgia Institute of Technology. The project will assess the impact of these modules on students' understanding of the role of AI in addressing cybersecurity and privacy issues.
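As a purely illustrative sketch of the correlation-versus-causation distinction described above (not a method prescribed by the project), the short Python example below simulates a hypothetical confounder and shows how a naive correlation can suggest an association between two variables even when no causal effect exists. The variable names, simulated data, and regression-based adjustment are assumptions introduced here for illustration only.

    # Illustrative sketch only: a toy example of how a confounding variable can make
    # a correlational estimate differ from the causal effect. All names and data are
    # hypothetical and are not drawn from the project itself.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Hypothetical confounder Z influences both the exposure X and the outcome Y;
    # X itself has no causal effect on Y in this simulation.
    z = rng.normal(size=n)
    x = 0.8 * z + rng.normal(size=n)
    y = 0.8 * z + rng.normal(size=n)

    # A naive correlation suggests an association between X and Y...
    print("correlation(X, Y):", np.corrcoef(x, y)[0, 1])

    # ...but adjusting for the confounder Z (here, via multiple regression)
    # yields an X coefficient near zero, matching the true causal effect.
    design = np.column_stack([np.ones(n), x, z])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    print("X coefficient after adjusting for Z:", coef[1])

Running this sketch prints a sizable X-Y correlation (roughly 0.4) alongside an adjusted X coefficient near zero, which is the kind of contrast between association and causation that the proposed course modules aim to teach.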

This project is supported by a special initiative of the Secure and Trustworthy Cyberspace (SaTC) program to foster new, previously unexplored, collaborations between the fields of cybersecurity, artificial intelligence, and education. The SaTC program aligns with the Federal Cybersecurity Research and Development Strategic Plan and the National Privacy Research Strategy to protect and preserve the growing social and economic benefits of cyber systems while ensuring security and privacy.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Graduate Education (DGE)
Type: Standard Grant
Application #: 2041788
Program Officer: Nigamanth Sridhar
Budget Start: 2020-09-01
Budget End: 2022-08-31
Fiscal Year: 2020
Total Cost: $299,777
Institution: Georgia Tech Research Corporation
City: Atlanta
State: GA
Country: United States
Zip Code: 30332