With the exponential increase in cybercrimes in recent years, the need for Computer Forensics and Digital Evidence (CFDE) expertise is rapidly growing. A qualified CFDE professional needs to have deep knowledge of digital forensic evidence identification, acquisition, and examination, as well as the ability to present and explain digital forensic evidence in courtrooms. However, there are major barriers to instilling the core knowledge of CFDE and the practice of cyber investigation techniques in a diverse body of interested students. For example, a systematic approach for collecting, organizing, and analyzing digital forensic evidence is lacking. This project will engage novel interdisciplinary perspectives, including artificial intelligence (AI), cybersecurity, criminal justice, and computer science, to re-examine the emerging CFDE field with a formal approach. This project will then explore visualized and explainable AI to improve students’ learning experience in digital forensics education at Minority-Serving Institutions (MSIs), including Historically Black Colleges and Universities (HBCUs).

The project brings together faculty from the University of Baltimore, an MSI; Bowie State University, one of the oldest HBCUs in Maryland; and the University of Missouri-Kansas City, who have synergistic expertise in digital forensics, cybersecurity, AI, law, and computer science. The project will leverage graph-based AI models to provide students with visualized depictions of forensic evidence, evidence patterns, and the connections among evidence items. It will also explore explainable AI to support the development of forensic evidence that is accountable and presentable to courts, and it will develop AI-aided CFDE instructional materials. The project will address research questions at the intersection of AI, CFDE, and education, including the following: (a) How do graph-based models store, retrieve, and present digital forensic evidence? (b) How do graph-based AI models discover new evidence, and to what extent should we trust AI-discovered evidence/patterns? (c) How can knowledge and techniques of AI-assisted investigation be infused into CFDE instructional materials, and to what extent do the materials improve students’ learning experiences? Learning materials will be made available to both the CFDE and data science communities.
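
To make the first research question concrete, the sketch below shows one way forensic evidence can be stored and retrieved as a graph: evidence items become nodes and their relationships become typed edges. This is a minimal, hypothetical illustration assuming the Python networkx library; the node names, attributes, and relations are invented for the example and do not reflect the project's actual models or data.

```python
import networkx as nx

# Each node is an evidence item; attributes hold provenance metadata.
# (All identifiers and values below are illustrative placeholders.)
G = nx.MultiDiGraph()
G.add_node("laptop_image_01", kind="disk_image", acquired="2021-04-02")
G.add_node("email_1443", kind="email", sender="suspect@example.com")
G.add_node("usb_7F2A", kind="device", vendor="GenericCorp")

# Edges encode relationships identified during examination.
G.add_edge("email_1443", "laptop_image_01", relation="recovered_from")
G.add_edge("usb_7F2A", "laptop_image_01", relation="connected_to",
           timestamp="2021-03-28T14:05")

# Retrieval: list every item linked to the disk image, the kind of
# connection a visualized evidence graph would surface for students.
linked = [(u, d["relation"])
          for u, _, d in G.in_edges("laptop_image_01", data=True)]
print(linked)
```

In such a representation, presenting evidence amounts to rendering the graph, and pattern discovery amounts to querying or mining its structure, which is one way the visualized and explainable AI components described above could connect.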

This project is supported by a special initiative of the Secure and Trustworthy Cyberspace (SaTC) program to foster new, previously unexplored, collaborations between the fields of cybersecurity, artificial intelligence, and education. The SaTC program aligns with the Federal Cybersecurity Research and Development Strategic Plan and the National Privacy Research Strategy to protect and preserve the growing social and economic benefits of cyber systems while ensuring security and privacy.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Graduate Education (DGE)
Type: Standard Grant (Standard)
Application #: 2039289
Program Officer: Li Yang
Budget Start: 2021-04-01
Budget End: 2023-03-31
Fiscal Year: 2020
Total Cost: $145,000
Name: University of Baltimore
City: Baltimore
State: MD
Country: United States
Zip Code: 21201