This research investigates whether computer vision can evaluate worker exposure and assess the associated risk of work-related injuries more effectively than conventional methods. Current methods involve either observation or measurements using instruments attached to a worker's hands or arms. Observation is often considered too subjective or inaccurate, and instruments too invasive or time-consuming, for routine use in industry. Automated job analysis potentially offers a more objective, accurate, repeatable, and efficient exposure assessment tool than observational analysis. Computer vision uses fewer resources than instruments attached to workers and does not interfere with production; it can quantify more exposure variables and their interactions; it is suitable for long-term, direct-reading exposure assessment; and it offers animated data visualizations synchronized with video for identifying aspects of jobs that need intervention. This research leverages coordinated multi-institutional prospective studies of upper-limb work-related musculoskeletal disorders (MSDs), conducted between 2001 and 2010, that followed production and service workers from a variety of US industries and used rigorous case criteria and prospective individual-level exposure assessments, including detailed video recordings of the work. Our study partners from the National Institute for Occupational Safety and Health, the Washington State Department of Labor & Industries Safety & Health Assessment & Research for Prevention (SHARP) program, and the University of California-San Francisco will provide task-level videos, associated exposure variable data, and prospective health outcomes for 1,649 workers. Exposure properties measured directly from videos of jobs, together with the corresponding health outcomes from the prospective study database, will be used to establish dose-response relationships that can be translated into a prototype automated job analysis instrument.
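As a minimal illustration of the kind of dose-response relationship described above, a logistic model can map a video-derived exposure measure (here, duty cycle) to the probability of an incident MSD. The function name and coefficient values below are hypothetical placeholders for illustration only; the actual models will be fit to the prospective study database.

```python
import math

def msd_risk(duty_cycle, beta0=-3.0, beta1=0.04):
    """Hypothetical logistic dose-response sketch: probability of an
    incident MSD as a function of duty cycle (%). The coefficients
    are illustrative placeholders, not fitted study values."""
    z = beta0 + beta1 * duty_cycle
    return 1.0 / (1.0 + math.exp(-z))

# Predicted risk rises monotonically with exposure
low_risk = msd_risk(10)   # low duty cycle
high_risk = msd_risk(80)  # high duty cycle
```

A fitted model of this form supports a direct-reading instrument: each new video-measured exposure value translates immediately into an estimated risk.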
We build on our previous success in developing marker-less video algorithms for estimating hand motion and the ACGIH hand activity level, and reliable video processing methods for hand tracking under challenging viewing conditions. This proposal will refine these and develop additional video algorithms, and will analyze the videos to extract exposure measures for repetition, posture, exertion, and their interactions. The video-extracted exposure measures will be compared against conventional observational exposure measures made by our collaborators. Video and corresponding observational data will be merged with the prospective health outcomes data to evaluate dose-response relationships and to develop and validate parsimonious exposure risk models for an automated, direct-reading repetitive motion instrument. We will test whether automation has better predictive capability than observation, and will also assess the accuracy and utility of computer vision analysis against conventional job analysis for selected industrial jobs. This proposal addresses the NIOSH cross-sector programs in Musculoskeletal Disorders and in Exposure Assessment. This translational research is consistent with the Research to Practice (r2p) initiative, developing technology to disseminate knowledge from recent NIOSH-sponsored prospective studies of MSDs.
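To make the video-extracted exposure measures concrete, the sketch below summarizes exertion time and duty cycle from a per-frame exertion classification, assuming the hand-tracking stage labels each frame as exerting or not. The function name, input format, and example values are assumptions for illustration, not the project's actual pipeline.

```python
def exposure_summary(exertion_mask, fps):
    """Summarize frame-level exertion detections from a video clip.

    exertion_mask: one boolean per frame, True when the (hypothetical)
                   hand-tracking stage classifies the hand as exerting.
    fps: video frame rate in frames per second.

    Returns (exertion_time_s, total_time_s, duty_cycle_pct).
    """
    total_frames = len(exertion_mask)
    exert_frames = sum(exertion_mask)
    exertion_time = exert_frames / fps
    total_time = total_frames / fps
    duty_cycle = 100.0 * exert_frames / total_frames if total_frames else 0.0
    return exertion_time, total_time, duty_cycle

# Example: a 3-second clip at 30 fps, exerting in the first half
mask = [True] * 45 + [False] * 45
et, tt, dc = exposure_summary(mask, fps=30)  # 1.5 s, 3.0 s, 50.0 %
```

Duty cycle (percent of cycle time spent exerting) is one of the inputs to the ACGIH hand activity level, which is why a reliable frame-level exertion classifier is central to the proposed instrument.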
Upper-extremity musculoskeletal injuries are common in hand-intensive work involving highly repetitive motions and exertions, and impose a significant socioeconomic burden and a substantial personal toll on health, prosperity, and wellbeing. This proposal investigates a practical approach for exposure assessment of jobs that can be implemented non-invasively in the workplace and provide direct-reading quantitative measures for evaluation, prevention, and control.
Akkas, Oguz; Lee, Cheng-Hsien; Hu, Yu Hen et al. (2017) Measuring exertion time, duty cycle and hand activity level for industrial tasks using computer vision. Ergonomics 60:1730-1738
Greene, Runyu L; Azari, David P; Hu, Yu Hen et al. (2017) Visualizing stressful aspects of repetitive motion tasks and opportunities for ergonomic improvements using computer vision. Appl Ergon 65:461-472
Akkas, Oguz; Lee, Cheng-Hsien; Hu, Yu Hen et al. (2016) Measuring elemental time and duty cycle using automated video processing. Ergonomics 59:1514-1525