A Table Method for Coded Target Decoding with Application to 3-D Reconstruction of Soil Specimens during Triaxial Testing


Photogrammetry-based methods are gaining popularity in many fields. One of the main tasks of photogrammetry is to identify homologous points across multiple images, commonly referred to as the correspondence problem. Coded targets placed on the surfaces of the objects of interest have been widely used as a reliable means of solving the correspondence problem in photogrammetry for high-accuracy three-dimensional measurements. Automated recognition and identification of coded targets are therefore of great importance in coded target-based photogrammetry. However, false identifications are inevitable due to large perspective distortion, unfavorable lighting conditions, and low-resolution, low-quality images. As a result, manual corrections are often required, which are tedious, error-prone, and inefficient. In this paper, a Faster R-CNN-based method is proposed to recognize coded targets. A table method is then developed to automatically identify and reject falsely identified coded targets by taking advantage of prior knowledge of the targets' geometric arrangement. On this basis, missing coded targets can be recovered by either interpolation or extrapolation. The effectiveness and accuracy of the proposed method are validated by applying it to the three-dimensional reconstruction of soil specimens during triaxial testing in geotechnical engineering. Experimental results indicate that the proposed method achieves accurate and efficient coded target recognition and identification.
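The recovery step described in the abstract can be illustrated with a minimal sketch. The function below is a hypothetical simplification, assuming targets are indexed along a one-dimensional sequence with roughly uniform spacing (the paper's actual geometric arrangement and table method are more general): a missing target is placed at the midpoint of its two immediate neighbors when both were identified (interpolation), or by continuing the step between the two preceding targets (extrapolation).

```python
def recover_missing_target(coords, missing_idx):
    """Estimate a missing coded target's image coordinates.

    coords: dict mapping target index -> (x, y) image coordinates of
            successfully identified targets (hypothetical data layout).
    missing_idx: index of the target that was not identified.
    Returns an (x, y) estimate, or None if too few neighbors are known.
    """
    left = coords.get(missing_idx - 1)
    right = coords.get(missing_idx + 1)
    if left is not None and right is not None:
        # Interpolation: midpoint of the two immediate neighbors.
        return ((left[0] + right[0]) / 2.0, (left[1] + right[1]) / 2.0)

    prev2 = coords.get(missing_idx - 2)
    prev1 = coords.get(missing_idx - 1)
    if prev2 is not None and prev1 is not None:
        # Extrapolation: continue the step between the two preceding targets.
        return (2.0 * prev1[0] - prev2[0], 2.0 * prev1[1] - prev2[1])

    return None
```

For example, with neighbors at (10, 0) and (30, 2), a target missing between them is estimated at (20, 1); with only the two preceding targets known, the same step is extended past the last one.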


Civil, Architectural and Environmental Engineering

Keywords and Phrases

Coded Target Identification; Coded Target Recognition; Photogrammetry; Table Method; Three-Dimensional Reconstruction; Triaxial Test

International Standard Serial Number (ISSN)

1861-1133; 1861-1125

Document Type

Article - Journal

Document Version


File Type





© 2021 Springer. All rights reserved.

Publication Date

07 Oct 2021