A Computer Vision System for Terrain Recognition and Object Detection Tasks in Mining and Construction Environments


Recent studies on dragline excavation efficiency have focused on incrementally automating the entire excavation cycle. Initial efforts resulted in the development of an automated dragline swing system, which optimizes the duration of the swing phase. However, the system still requires a human operator for collision avoidance. Full dragline autonomy requires a machine vision system for collision prevention and oversized rock handling during the 'swinging' and 'digging' phases of the excavation operation. Previous attempts in this area focused on collision avoidance vision models that estimated the bucket's location in space in real time. However, these models use image segmentation methods that are neither scalable nor multi-purpose. In this study, a scalable, multi-purpose vision model has been developed for draglines using Convolutional Neural Networks. The vision system averages 82.6% classification accuracy and a 91% detection rate in collision avoidance tasks. It also achieves an 87.32% detection rate in bucket pose estimation. In addition, it averages 80.9% precision and 91.3% recall across terrain recognition and oversized rock detection tasks. With minimal modification, the proposed vision system can be adapted to other automated excavators.

Meeting Name

2019 SME Annual Conference and Expo and CMA 121st National Western Mining Conference (2019: Feb. 24-27, Denver, CO)


Mining Engineering

Keywords and Phrases

Convolutional Neural Network; Deep Learning; Dragline Bucket; Earthmoving Equipment; Machine Learning; Machine Vision; Object Detection; Oversized Rock; Surface Mining

Document Type

Article - Conference proceedings

© 2019 Society for Mining, Metallurgy and Exploration (SME), All rights reserved.

Publication Date

01 Feb 2019
