An accurate map of wireless spectrum usage can inform spectrum policy, reveal usage patterns, provide situational awareness, and support evaluation of the feasibility of opportunistic spectrum access methods indoors and outdoors, especially in the crowded sub-6 GHz bands. However, the success of such a map depends on the quality and quantity of the sensors measuring spectrum activity and on how their measurements are processed by a centralized server. Because the sensors operate in an unknown or time-varying environment, many distributed sensors are needed to compensate for this uncertainty. As a result, the cost, power consumption, and size of the sensors must be very low to allow wide-scale deployment. This project examines the sensor quality-versus-quantity tradeoff for a large-scale distributed measurement system that infers wireless activity throughout a region. Knowledge of spectrum activity is currently limited by the lack of accurate, real-time measurements on a truly large scale, while the deployment of large numbers of sensors is hampered by cost, power, and complexity. The research addresses how to commoditize the sensor hardware and drive cost and power down by an order of magnitude, potentially paving the way to large spectrum data sets. The work will seed projects at an existing Research Experience for Undergraduates (REU) site at the University of Notre Dame. The results will be disseminated to the research community to encourage discussion of how spectral occupancy can be measured reliably in a low-cost and scalable manner.

A spectrum sensor is, at its heart, a power detector for frequency bands of interest. Traditionally, such a sensor is evaluated as a radio, since it generally tunes and down-converts its input to an intermediate-frequency or baseband signal, digitizes, filters, and then analyzes the power spectrum. The standard performance specifications of a radio include sensitivity, linearity, bandwidth, image rejection, frequency stability, phase noise, and dynamic range. Meeting stringent requirements on these specifications contributes directly to cost and power consumption. However, the performance of a spectrum mapping system is also determined by the ability of the centralized server to infer activity over a whole region from a widely distributed set of sensors. This research focuses on relaxing many of these sensor requirements and analyzing how a real-time, machine-learning-based spectrum mapping system performs when the number of sensors is increased to compensate for the reduced capability of each sensor. The effort seeks to show that there is an advantageous tradeoff of quantity over quality: the information learned from additional sensors in a large-scale deployment readily compensates for their reduced individual capabilities, especially when accounting for cost and power consumption. Implications for computational and network load requirements and for sensor size are also considered. This research effort takes an innovative whole-system approach to determining the quality-quantity contour for distributed spectrum measurements. The effort will: (i) establish models, metrics, and fidelity criteria for the performance of a machine-learning radio-frequency spectrum sensing system with distributed sensors; (ii) examine opportunities to dramatically reduce sensor cost and power consumption; and (iii) build an extensive test and measurement platform of sensors to validate the assumptions, analyses, methods, and conclusions.
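To make the "power detector" core of a spectrum sensor concrete, the following is a minimal illustrative sketch, not the project's actual pipeline: it estimates the power spectral density of a simulated baseband capture with Welch's method, integrates over a band of interest, and thresholds against an estimated noise floor. The sample rate, band edges, tone frequency, and 3x detection margin are all hypothetical choices for the example.

```python
# Illustrative band-power occupancy detector (hypothetical parameters).
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(seed=0)

fs = 1.0e6          # hypothetical complex baseband sample rate (Hz)
n = 2 ** 16         # number of samples in one capture
nperseg = 1024      # Welch segment length

# Simulated capture: a narrowband occupant at +150 kHz plus receiver noise.
t = np.arange(n) / fs
occupant = 1.0 * np.exp(2j * np.pi * 150e3 * t)
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
x = occupant + noise

# Welch estimate of the two-sided power spectral density.
f, psd = welch(x, fs=fs, nperseg=nperseg, return_onesided=False)

# Integrate the PSD over the band of interest to obtain band power.
band = (f >= 100e3) & (f <= 200e3)
df = fs / nperseg
band_power = np.sum(psd[band]) * df

# Declare the band occupied if its power exceeds a noise-floor-based threshold;
# the median PSD is a crude floor estimate that ignores the few occupied bins.
noise_floor = np.median(psd) * df * np.count_nonzero(band)
occupied = band_power > 3.0 * noise_floor
print(f"band power = {band_power:.3e} (nominal), occupied = {occupied}")
```

On the system side, a simple idealized calculation suggests why quantity can substitute for quality: if N independent sensors each detect a given transmission with probability p > 1/2, majority fusion at the server detects it with probability sum_{k >= ceil(N/2)} C(N,k) p^k (1-p)^(N-k), which tends to 1 as N grows. For example, p = 0.6 per sensor gives roughly 0.68 with N = 5 but about 0.98 with N = 101. The independence assumption is of course optimistic for co-located sensors in a shared propagation environment, which is part of what the proposed measurement platform would test.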

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Project Start:
Project End:
Budget Start: 2020-07-01
Budget End: 2023-06-30
Support Year:
Fiscal Year: 2020
Total Cost: $443,334
Indirect Cost:
Name: University of Notre Dame
Department:
Type:
DUNS #:
City: Notre Dame
State: IN
Country: United States
Zip Code: 46556