In today's competitive production era, the ability to identify and track important objects in near real time is greatly desired among manufacturers moving towards streamlined production. Manually keeping track of every object in a complex manufacturing plant is infeasible; therefore, an automated system with this functionality is greatly needed. This study was motivated to develop a Mask Region-based Convolutional Neural Network (Mask RCNN) model to semantically segment objects and important zones in manufacturing plants. The Mask RCNN was trained through transfer learning, which used a neural network (NN) pre-trained on the MS-COCO dataset as the starting point and further fine-tuned that NN using a limited number of annotated images. The Mask RCNN model was then modified to produce consistent detection results on videos, which was realized through the use of a two-staged detection threshold and the analysis of the temporal coherence information of detected objects. An object-tracking function was added to the system to identify the misplacement of objects. The effectiveness and efficiency of the proposed system were demonstrated by analyzing a sample of video footage.
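The abstract's combination of a two-staged detection threshold with temporal coherence can be illustrated with a minimal sketch. This is not the authors' implementation; the threshold values, the center-distance matching rule, and all function names below are illustrative assumptions. The idea sketched here: a brand-new object must clear a high score threshold, while an object already seen at a nearby location in recent frames passes at a lower threshold, which stabilizes detections across video frames.

```python
# Hypothetical sketch of a two-staged detection threshold combined with
# temporal coherence. All names, thresholds, and the matching rule are
# illustrative assumptions, not the paper's actual code.

HIGH_THRESH = 0.9   # stage 1: score needed to confirm a brand-new object
LOW_THRESH = 0.5    # stage 2: score needed to keep an already-tracked object
MAX_DIST = 30.0     # max center distance (pixels) to match across frames

def _close(c1, c2, max_dist=MAX_DIST):
    """Treat two detections as the same object if their centers are close."""
    return ((c1[0] - c2[0]) ** 2 + (c1[1] - c2[1]) ** 2) ** 0.5 <= max_dist

def filter_frame(detections, tracked_centers):
    """Keep detections that pass the two-staged threshold.

    detections:      list of (score, (cx, cy)) for the current frame
    tracked_centers: centers of objects confirmed in recent frames
    Returns the accepted centers, which become next frame's tracked set.
    """
    accepted = []
    for score, center in detections:
        seen_before = any(_close(center, t) for t in tracked_centers)
        # Stage 1: unseen objects need a high score. Stage 2: temporally
        # coherent objects (seen in recent frames) pass at a lower score.
        threshold = LOW_THRESH if seen_before else HIGH_THRESH
        if score >= threshold:
            accepted.append(center)
    return accepted
```

Under these assumptions, an object first detected with score 0.95 is confirmed, and a weaker 0.6 detection of the same object in the next frame is retained because of temporal coherence, while an isolated 0.6 detection with no history is rejected.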

Meeting Name

25th International Conference on Production Research Manufacturing Innovation: Cyber Physical Manufacturing, ICPR 2019 (2019: Aug. 9-14, Chicago, IL)


Department(s)

Computer Science

Research Center/Lab(s)

Center for Research in Energy and Environment (CREE)


This work was supported by the National Science Foundation (NSF) grant CMMI-1646162 on cyber-physical systems.

Keywords and Phrases

Mask RCNN; Object Detection; Temporal Coherence; Transfer Learning; Two-Staged Detection Threshold

Document Type

Article - Conference proceedings

Document Version

Final Version


© 2019 The Authors, All rights reserved.

Creative Commons Licensing

This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.

Publication Date

01 Aug 2019