This proposal identifies a pathway to distributed pattern recognition through parallelization of a particular contour identification algorithm and decentralized data collection using formations of robots. Such a system autonomously recognizes patterns in the data it collects. Anticipated applications range from homeland security and emergency response to scientific exploration and environmental monitoring.
Traditional mobile sensor networks are based on an architecture in which some minimal signal processing is performed on the sensing nodes, while the bulk of the information is directed to a network sink for processing and interpretation. The hypothesis here is that the same communication infrastructure that enables motion coordination in formations of robots can be exploited for distributed processing of sensor data and autonomous pattern recognition without human intervention. Thus, information is interpreted in a distributed fashion and without dependence on the capabilities of specialized individual nodes. This method yields a robust, autonomous system that inherently tolerates node and network failures and exhibits collective intelligence in the form of group associative memory.
Technical challenges to be overcome are the development of decentralized and provably convergent cooperative motion control designs that enable targeted data collection, and the scalable implementation of a pattern recognition algorithm based on Dirichlet Laplacians along with its integration with spatially distributed Hopfield neural networks. The complete system will be demonstrated on an experimental test-bed with mobile robots capable of recognizing noisy, variable shapes on the laboratory floor. Outreach activities will include undergraduate research and summer programs for secondary school teachers.
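The two ingredients named above can be illustrated with a minimal, self-contained sketch; this is not the project's implementation, and the function names, the 5-point-stencil discretization, and the toy patterns are all illustrative assumptions. The idea is that the smallest eigenvalues of the Dirichlet Laplacian restricted to a binarized shape form a translation-invariant signature of that shape, while a Hebbian Hopfield network recalls a stored pattern from a corrupted cue.

```python
import numpy as np

def dirichlet_laplacian_spectrum(mask, k=5):
    """Smallest k eigenvalues of the graph Laplacian restricted to the
    pixels of a binary shape mask (Dirichlet boundary condition: pixels
    outside the shape are clamped to zero)."""
    idx = {p: i for i, p in enumerate(zip(*np.nonzero(mask)))}
    L = np.zeros((len(idx), len(idx)))
    for (r, c), i in idx.items():
        L[i, i] = 4.0  # 5-point stencil; absent neighbors act as the boundary
        for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            j = idx.get(nb)
            if j is not None:
                L[i, j] = -1.0
    return np.sort(np.linalg.eigvalsh(L))[:k]

def hopfield_store(patterns):
    """Hebbian weight matrix for bipolar (+/-1) patterns."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)  # no self-coupling
    return W / len(patterns)

def hopfield_recall(W, x, steps=20):
    """Synchronous sign updates until a fixed point (or step limit)."""
    x = x.copy()
    for _ in range(steps):
        nxt = np.sign(W @ x)
        nxt[nxt == 0] = 1
        if np.array_equal(nxt, x):
            break
        x = nxt
    return x
```

In this toy form, two shapes that are translates of one another produce identical spectra, so the spectrum can act as the feature vector fed to the associative memory; the actual system must of course cope with noise, partial views, and discretization effects.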
This project aimed at developing the algorithms and technology for a mobile sensor network tasked with autonomously identifying patterns in the data it collects. Potential applications of such a function can be identified in surveillance and reconnaissance, search and rescue, underwater archeology, and planetary exploration. To realize this goal, we needed to understand better how robotic sensor platform motion, information flow, data management and processing, and pattern recognition interplay with each other in a distributed actuation and computation environment. The envisioned sensor network system consisted of a number of small sensor-bearing mobile robots that were able to exchange information with each other through local wireless network connections. These sensor nodes could move and autonomously fall into specific vehicle formations that enable them to fully cover a given area of interest in a coordinated fashion. In doing so, each one of these nodes would end up collecting a body of sensor data that offers a partial view of the data landscape that the whole sensor ensemble was supposed to cover. The idea was that instead of the individual sensors simply dumping their data into a central network node for off-line processing, they process information in a distributed way, based only on their own data and those of their closest network neighbors. Without any single network node having to have the whole data picture, they would collectively conclude whether there is a pattern in the data they have collected, even if individual sensors could not have by themselves "seen" the complete pattern, but rather bits and pieces of it. It is information exchange between them that essentially enables them all to "see" through each other's "eyes." Implementing both information collection and processing in a distributed manner promised robustness with respect to sensor noise as well as platform failures.
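The neighbor-only information exchange described above is the essence of average consensus. The following sketch is a hypothetical illustration, not the project's code: the node graph, the measurement values, and the step size are invented for the example. Each node repeatedly nudges its own value toward its neighbors' values, and every node converges to the network-wide average without any node ever seeing the full data set.

```python
import numpy as np

def consensus_step(values, neighbors, eps=0.2):
    """One round of local averaging: each node moves toward its
    neighbors' values. No node accesses non-neighbor data."""
    new = values.copy()
    for i, nbrs in neighbors.items():
        new[i] = values[i] + eps * sum(values[j] - values[i] for j in nbrs)
    return new

# Hypothetical 4-node ring; each node holds one partial measurement.
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
values = {0: 1.0, 1: 5.0, 2: 3.0, 3: 7.0}
for _ in range(200):
    values = consensus_step(values, neighbors)
# After enough rounds every node holds (approximately) the global mean, 4.0.
```

The same local-exchange pattern underlies richer distributed computations than averaging, but it also exposes the cost discussed later in this report: each additional round of exchange adds latency.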
With the resources provided through this award, a proof-of-concept small-scale physical testbed was developed. The testbed consisted of wheeled mobile robots carrying down-facing cameras, which when moved in a tight formation could take successive pictures of the floor on which they were driving, and collectively identify large-scale shapes that were drawn on the floor. The system was shown to be reasonably effective in identifying such shapes in a distributed manner. In an effort to assess the potential of this technology for underwater surveys and archeology, and in collaboration with a colleague specializing in robotic oceanography, we tested the identification and pattern recognition algorithms against very challenging underwater sonar and visual imagery. Transitioning from relatively high-resolution land-based imagery to low-resolution underwater data revealed fragility in the machine learning components that had originally been proposed and integrated, and together with other insights outlined in the following paragraph, motivated a new approach that eventually bore fruit. The new approach led to a considerably more robust identification and recognition system, which performed admirably against a large body of grainy, noisy, low-resolution underwater imagery recording distributions of North Atlantic scallop, collected by field-deployed autonomous underwater vehicles operated by our oceanographer collaborator. One of the unanticipated insights we obtained was that, despite all the hype and emphasis on distributed sensing and control, decentralization brings benefits only under the right conditions. What is sometimes overlooked in the context of distributed control and sensor networks is the cost of passing information around.
This cost is not so much in terms of power and bandwidth, which have been at the center of a significant body of recent relevant research; rather, it is understood here in terms of time lags and delays, which are attributable not only to bandwidth but also to computation. While we have demonstrated that integrated distributed sensory data processing and decentralized, spatially distributed, collective pattern recognition is feasible, depending on the particular computation/communication budget and risk tolerance, it may not always be preferable to centralized system architectures in terms of performance. The very nature of research is such that it is expected to give answers leading to new questions: what exactly are the tradeoffs between decentralization, risk, and performance, under a given set of computational, sensing, and networking parameters? What aspects, features, and attributes should we identify in decision-making algorithms that are more likely to perform robustly over a wide range of application domains and on data of variable fidelity and resolution? In addition to showing us that coordinated swarms of robotic sensor nodes can autonomously identify patterns in their collected data, either by trying to "hear the shape of their drum" or otherwise, this research activity positions us better to go after some of the fundamental questions identified above.