This project develops a technological infrastructure for deploying and using ad hoc airborne video sensor (AVS) networks. In particular, autonomous blimps are used as the airborne platform and are equipped with a compact processing unit, a sophisticated camera assembly, and various sensing devices. The AVS network can be envisioned for use in detecting unusual activities in emergency and disaster control situations, monitoring unplanned events, etc.

The first main research component involves the development of efficient algorithms for various versions of the video coverage problem in the context of the AVS network. The second involves declarative representation and automated evaluation of high-level video activities. In addition to the above, the project entails research issues that arise in the areas of robotics and computer vision. An important component of the work is the development of two testbeds: one with a large number of autonomous indoor blimps with a modest camera set-up, and the other with a smaller number of larger autonomous outdoor blimps with a sophisticated camera set-up.

The broader impact of this project is the new applications that the AVS network facilitates, especially those requiring response to unplanned emergencies and threats. The project also contributes to education and to the involvement of minority students in advances in science and engineering. The results of the project are disseminated over the web at www.cs.sunysb.edu/~hgupta/airborne.

Project Report

In this project, we addressed the research and development of a technological infrastructure for deploying and using an ad hoc airborne video sensor (AVS) network. In particular, we considered the use of autonomous blimps as our airborne platform, equipping each blimp with a compact processing unit, wireless interface, sophisticated camera, GPS, sensing devices, and batteries. We envisioned the network being used for detection of unusual activities in emergency and disaster control situations, monitoring of unplanned events, etc. We emphasize scenarios where there is no pre-existing static camera infrastructure, requiring an immediate ad hoc deployment. In the context of the above vision, the research activities in this project focused on the following broad directions: (i) Development of a distributed blimp controller and intelligent supervisory controllers, and their implementation and testing on mobile robots, (ii) Video coverage algorithms (selection and positioning of cameras to cover a given area), (iii) High-level specification of video activities (i.e., developing a logic programming framework to specify video activities at a high level), (iv) Computer vision techniques (in particular, techniques for robust tracking and detection of objects using cameras), and (v) Applications of our techniques in other domains (such as capacity optimization in cellular networks, pricing schemes in smart grids, and data preservation in sensor networks).
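To give a sense of direction (ii), the camera selection aspect of video coverage can be cast as a set-cover-style optimization. The sketch below is purely illustrative and not the project's actual algorithm: it shows the standard greedy heuristic, which repeatedly selects the candidate camera whose footprint covers the most still-uncovered area cells. All names (candidate footprints, grid cells, the budget parameter) are hypothetical.

```python
# Illustrative greedy heuristic for camera selection over a discretized area.
# This is a generic set-cover sketch, not the algorithm developed in the project.

def greedy_camera_selection(footprints, area_cells, budget):
    """footprints: dict camera_id -> set of area cells that camera covers;
    area_cells: set of cells to be covered;
    budget: maximum number of cameras to deploy."""
    uncovered = set(area_cells)
    chosen = []
    while uncovered and len(chosen) < budget:
        # Pick the candidate covering the most uncovered cells.
        best = max(footprints, key=lambda c: len(footprints[c] & uncovered))
        gain = footprints[best] & uncovered
        if not gain:  # no remaining candidate adds coverage; stop early
            break
        chosen.append(best)
        uncovered -= gain
    return chosen, uncovered  # selected cameras and any cells left uncovered

# Toy example: three candidate positions over a strip of six cells.
cams = {"A": {0, 1, 2}, "B": {2, 3, 4}, "C": {4, 5}}
selected, left = greedy_camera_selection(cams, set(range(6)), budget=3)
```

In this toy run the greedy rule picks "A" first (three new cells), then "B", then "C", covering the whole strip. The greedy heuristic is attractive here because it is simple to run distributedly and carries the classical logarithmic approximation guarantee for set cover.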

Agency
National Science Foundation (NSF)
Institute
Division of Computer and Network Systems (CNS)
Application #
0721701
Program Officer
Joseph Lyles
Project Start
Project End
Budget Start
2007-09-01
Budget End
2012-08-31
Support Year
Fiscal Year
2007
Total Cost
$528,000
Indirect Cost
Name
State University of New York Stony Brook
Department
Type
DUNS #
City
Stony Brook
State
NY
Country
United States
Zip Code
11794