This collaborative project between academics and industry provides a configurable and portable vision-based infrastructure for decentralized coordination of industrial trucks, also known as Automated Guided Vehicles (AGVs), in indoor environments. AGVs have the potential to revolutionize operations in areas such as manufacturing, distribution, health care, and the military by efficiently accomplishing the mundane and often repetitive task of transporting materials. Commercial robot systems have recently been introduced successfully in distribution centers and manufacturing plants to transport items, but typically only in specific and restrictive environments; this research addresses those limitations and extends the technology to environments that are quickly reconfigured by moving stations and objects around. Generic models, tools, and technologies are developed to actively capture the world with semantically labeled objects, actions, and events, and to generate goals, priorities, and plans.

The solutions proposed in this research address the requirements of decentralized coordination and adaptation to real-time environmental changes. The approach uses a set of distributed ceiling-mounted smart cameras with overlapping fields of view for a global view and coordination at the facility level, and cameras mounted on the AGVs for short-range truck navigation. Multi-truck coordination is then framed as the problem of routing packets in a dynamic and hierarchical network, where cameras represent routers and trucks represent packets. To address the complexity of image processing tasks, the hardware/software implementation uses an FPGA-based target platform from a previous NSF-funded project, in which low-level, inherently parallel image processing tasks are mapped onto hardware while high-level reasoning is kept in software. The project will also provide an interface formalism to specify component integration and system composition, along with methods to optimize run-time and resource usage.
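To make the routing analogy concrete, the following is a minimal illustrative sketch, not the project's actual implementation: it assumes the ceiling cameras form a graph whose nodes are camera cells and whose edges connect cameras with overlapping fields of view, and it routes an AGV (the "packet") hop by hop along the lowest-cost camera-to-camera path. The names (CameraNode graph type, route_agv, cam_A through cam_D) and the edge-cost model are hypothetical choices for illustration only.

```python
import heapq
from typing import Dict, List, Tuple

# Hypothetical model: each ceiling camera acts as a "router" node; edges link
# cameras whose fields of view overlap. Edge weights could encode congestion
# or travel time observed by the cameras. An AGV is treated as a packet that
# is forwarded from camera cell to camera cell toward its destination.
Graph = Dict[str, List[Tuple[str, float]]]  # camera -> [(neighbor camera, cost)]

def route_agv(graph: Graph, src: str, dst: str) -> List[str]:
    """Return the sequence of camera cells an AGV should traverse (Dijkstra)."""
    dist = {src: 0.0}
    prev: Dict[str, str] = {}
    frontier: List[Tuple[float, str]] = [(0.0, src)]
    visited = set()
    while frontier:
        d, cam = heapq.heappop(frontier)
        if cam in visited:
            continue
        visited.add(cam)
        if cam == dst:
            break
        for nbr, cost in graph.get(cam, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = cam
                heapq.heappush(frontier, (nd, nbr))
    if dst not in dist:
        return []  # no camera-to-camera path found
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return list(reversed(path))

# Example: four ceiling cameras with overlapping coverage; route an AGV from
# the cell under cam_A to the cell under cam_D.
facility: Graph = {
    "cam_A": [("cam_B", 1.0), ("cam_C", 2.5)],
    "cam_B": [("cam_D", 1.5)],
    "cam_C": [("cam_D", 1.0)],
    "cam_D": [],
}
print(route_agv(facility, "cam_A", "cam_D"))  # ['cam_A', 'cam_B', 'cam_D']
```

In the decentralized setting described above, each camera would compute only its own forwarding decision from locally exchanged costs rather than running a global shortest-path search; the centralized version here is kept only for brevity.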

This research is expected to produce an infrastructure and methods for decentralized vision-based coordination of autonomous vehicle activities in a dynamically changing indoor environment. The developed infrastructure will enable manufacturing and distribution companies, including lean and agile entities, to optimize indoor transportation activities within their existing arrangements, without modifying available infrastructure, and to reduce labor and operating costs by redeploying employees to value-added roles. In addition, AGVs have the potential to enable autonomous mobile robot applications in numerous other unstructured environments, including hospitals, malls, retail stores, critical infrastructure, airports, schools, and sports venues. The project is conducted as a joint effort between the University of Arkansas in Fayetteville and R-Dex in Atlanta and will provide undergraduate and graduate students opportunities to perform their work in academic and industrial environments. The involvement of under-represented groups will be increased with the support of the University of Arkansas Engineering Career Awareness Program and the University of Arkansas chapters of the National Society of Black Engineers (NSBE) and the Society of Hispanic Professional Engineers (SHPE).

Agency: National Science Foundation (NSF)
Institute: Division of Computer and Network Systems (CNS)
Type: Standard Grant (Standard)
Application #: 1547934
Program Officer: Marilyn McClure
Project Start:
Project End:
Budget Start: 2015-09-01
Budget End: 2018-12-31
Support Year:
Fiscal Year: 2015
Total Cost: $305,557
Indirect Cost:
Name: University of Arkansas at Fayetteville
Department:
Type:
DUNS #:
City: Fayetteville
State: AR
Country: United States
Zip Code: 72702