Temporal dependence within the workload of any computing or networking system has been widely recognized as a significant factor affecting performance. In particular, burstiness, a form of temporal dependence, can be catastrophic for performance: experiments have shown that burstiness in arrival intensities or service demands in a single-server system may slow user response times by several orders of magnitude. To this day, no analytic queueing models exist that explicitly capture burstiness.
The proposed research will formalize burstiness using autocorrelation, which characterizes the temporal dependence structure of request flows. New analytic models that capture the performance effects of autocorrelation in queueing systems will be devised and, based on these, new resource allocation and scheduling policies will be developed.
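As a minimal illustration of why autocorrelation is a natural formalization (a sketch, not part of the proposal itself): the lag-k sample autocorrelation of a trace of service demands separates an independent stream from a bursty one, even when both have exactly the same marginal distribution. The sketch below reorders a set of i.i.d. exponential demands within blocks to cluster large values together, which induces burstiness without changing the distribution of individual demands; the block size and seed are arbitrary choices for illustration.

```python
import random

def acf(x, lag):
    """Sample autocorrelation of sequence x at the given lag."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x) / n
    cov = sum((x[i] - m) * (x[i + lag] - m) for i in range(n - lag)) / n
    return cov / var

random.seed(1)
# i.i.d. exponential service demands: no temporal dependence
iid = [random.expovariate(1.0) for _ in range(10000)]

# A bursty reordering of the *same* values: sorting within large
# blocks clusters big demands together, inducing positive
# autocorrelation while leaving the marginal distribution unchanged.
block = 100
bursty = []
for i in range(0, len(iid), block):
    bursty.extend(sorted(iid[i:i + block]))

print(f"lag-1 ACF, i.i.d. : {acf(iid, 1):+.3f}")     # near zero
print(f"lag-1 ACF, bursty : {acf(bursty, 1):+.3f}")  # clearly positive
```

The point of the comparison is that burstiness lives entirely in the ordering of the demands: a queueing model parameterized only by the marginal distribution cannot distinguish the two traces, whereas the autocorrelation function does.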