The objective of this research is to explore computer vision-based monitoring methods that enable the automatic, continuous observation of construction workers for in-depth safety analysis. Given that the number of fatalities in construction remains the highest among all industries, and that approximately 80 to 90 percent of accidents are strongly associated with workers' unsafe behaviors and acts, the automatic capture and systematic understanding of unsafe behavior has great potential to reduce and prevent injuries and fatalities in construction. Specifically, using video and image sequences, the proposed system estimates the 2D locations of human skeleton keypoints, computes the 3D locations of body joints, and identifies a worker's unsafe motions using machine learning techniques. To investigate the feasibility and potential of the proposed methods for behavior monitoring, several motions representative of traumatic (e.g., falls) and ergonomic (e.g., overexertion and repetitive motion) injuries are tested as a case study.
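The three-stage pipeline described above (2D skeleton estimation, 3D joint lifting, and motion classification) can be sketched as follows. This is a minimal illustration, not the proposed implementation: the joint names, the depth-based lifting step, and the threshold rule for a fall-like motion are all hypothetical stand-ins for a trained pose estimator and a learned classifier over joint trajectories.

```python
from typing import List, Tuple

# Hypothetical joint set; a real pose estimator outputs many more keypoints.
JOINTS = ["head", "hip", "knee", "ankle"]

Point2D = Tuple[float, float]
Point3D = Tuple[float, float, float]

def lift_to_3d(joints_2d: List[Point2D],
               depths: List[float]) -> List[Point3D]:
    """Combine per-joint 2D coordinates with estimated depths into 3D
    positions -- a stand-in for the actual 2D-to-3D lifting step."""
    return [(x, y, z) for (x, y), z in zip(joints_2d, depths)]

def classify_motion(frames_3d: List[List[Point3D]],
                    drop_threshold: float = 0.5) -> str:
    """Toy rule standing in for a machine learning classifier: flag a
    fall-like motion if the head joint's height (y) drops by more than
    drop_threshold between the first and last frame of the sequence."""
    head_first = frames_3d[0][0][1]   # head is joint index 0
    head_last = frames_3d[-1][0][1]
    return "unsafe" if head_first - head_last > drop_threshold else "safe"

# Example: two frames in which the head height falls from 1.8 to 0.3.
frame_a = lift_to_3d([(0.5, 1.8), (0.5, 1.0), (0.5, 0.5), (0.5, 0.0)],
                     [2.0, 2.0, 2.0, 2.0])
frame_b = lift_to_3d([(0.6, 0.3), (0.6, 0.2), (0.6, 0.1), (0.6, 0.0)],
                     [2.0, 2.0, 2.0, 2.0])
print(classify_motion([frame_a, frame_b]))  # prints "unsafe"
```

In a deployed system, the heuristic in `classify_motion` would be replaced by a model trained on labeled joint-trajectory data covering both traumatic motions (falls) and ergonomic risk factors (overexertion, repetitive motion).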
If successful, the findings of this research could help prevent injuries and fatalities in the construction industry by providing an in-depth understanding of human behaviors and actions in terms of safety. Further, the motion analysis techniques developed in this project can be applied in other labor-intensive industries (e.g., manufacturing and shipbuilding), providing a means to automatically collect data on human behavior and thus enabling its effective understanding. In addition, the education plan (e.g., a course on understanding human behavior in safety, and a workshop planned for industry professionals and students) will effectively educate future managers and engineers, both of whom will improve safety in the US workplace. Further, female and underrepresented students will be recruited and integrated into the planned research and education activities (e.g., safety seminars, workshops, and interdisciplinary research participation opportunities).