In recent years, machine learning (ML) and artificial intelligence (AI) applications have rapidly found their way into everyday life. These applications generate and inject massive volumes of data into the network for a wide range of complex ML/AI data analytics tasks, including but not limited to training and inference in computer vision, natural language processing, and recommendation systems. However, most existing wireless network control and optimization algorithms rarely take the new characteristics of ML/AI data analytics traffic into consideration. Likewise, most ML/AI data analytics algorithms oversimplify the underlying wireless networks as "bit pipes" and ignore their complex networking and physical-layer constraints, leading to poor overall data analytics efficiency. The overarching theme of this CAREER research program is to bridge the gap between rapidly growing ML/AI data analytics demands and existing networking and communication technologies. The principal investigator (PI) will explore a cross-disciplinary understanding between wireless networking and data analytics through a unified research program consisting of the development of tractable theoretical models, the exploration of theoretical performance bounds and limits, and the development of low-complexity distributed algorithms and protocols that are easy to implement in practice.

In this CAREER program, the PI will develop networking-computing co-designs to facilitate ML/AI data analytics with data and model parallelism in wireless edge networks. The PI will focus on three complementary research thrusts, each of which addresses a key aspect of supporting distributed data analytics at a different protocol layer: (i) communication-efficient distributed optimization at the physical layer; (ii) joint queueing-computing scheduling at the medium access control layer; and (iii) admission control and resource virtualization at the transport layer. Collectively, the results of this research contribute to a new direction in wireless network control and optimization theory and systems design. The proposed research will serve as a foundation for next-generation wireless networking that supports a plethora of data analytics and ML/AI applications. Because of its unique scientific and engineering challenges, this research program requires strong and holistic expertise in mathematical modeling, optimization, control, queueing theory, and stochastic analysis, as well as deep knowledge of practical ML/AI system operations. The proposed research will support not only the networking, communications, control, and machine learning research communities but also the general public, by developing new optimization technologies for substantially improved network and data analytics performance.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Agency: National Science Foundation (NSF)
Institute: Division of Computer and Network Systems (CNS)
Application #: 2110259
Program Officer: Alhussein Abouzeid
Budget Start: 2020-10-01
Budget End: 2025-09-30
Fiscal Year: 2021
Total Cost: $109,918
Name: Ohio State University
City: Columbus
State: OH
Country: United States
Zip Code: 43210