Abstract

This project developed two wall-climbing robots and two omni-directional ground robots for automated data collection on both horizontal and vertical surfaces of concrete structures. The robots are equipped with non-destructive evaluation (NDE) sensors, including ground penetrating radar (GPR), impact sounding (IS), and impact echo (IE) devices. Using a vision-based positioning system powered by visual simultaneous localization and mapping (V-SLAM), the robots tag NDE data with real-time pose information, significantly accelerating data collection and improving the accuracy of defect mapping. While effective at detecting shallow defects, impact sounding is highly sensitive to ambient noise and unsuitable for identifying deeper flaws; IENet, a machine learning model, was therefore developed to detect subsurface defects from IE data, delivering superior classification accuracy and strong generalization. Additionally, a dual-chamber General Wall-Climbing Robot (GWCR) was engineered to traverse gaps and ditches on vertical surfaces. It features a universal sliding rail that allows NDE sensors to be interchanged easily, enabling flexibility in inspection tasks. Extensive field tests validated the GWCR's capabilities.
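
The abstract states that NDE readings are tagged with real-time pose information from the V-SLAM positioning system. The Python sketch below illustrates one way such tagging could work, pairing each GPR/IS/IE record with the pose nearest in time; all names, data structures, and the nearest-timestamp matching strategy are illustrative assumptions and are not taken from the project report.

    # Hypothetical pose-tagging sketch: stamp each NDE reading with the V-SLAM
    # pose closest in time. Data structures and field names are assumptions.
    from bisect import bisect_left
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Pose:
        t: float       # timestamp (s)
        x: float       # position on the inspected surface (m)
        y: float
        yaw: float     # heading (rad)

    @dataclass
    class NdeSample:
        t: float       # acquisition timestamp (s)
        sensor: str    # "GPR", "IS", or "IE"
        payload: bytes # raw sensor record

    def tag_with_pose(samples: List[NdeSample],
                      poses: List[Pose]) -> List[Tuple[NdeSample, Pose]]:
        """Pair every NDE sample with the V-SLAM pose nearest in time.
        Poses are assumed non-empty and sorted by timestamp."""
        if not poses:
            raise ValueError("at least one pose is required")
        times = [p.t for p in poses]
        tagged = []
        for s in samples:
            i = bisect_left(times, s.t)
            # choose whichever neighbour (i-1 or i) is closer to the sample time
            candidates = [j for j in (i - 1, i) if 0 <= j < len(poses)]
            best = min(candidates, key=lambda j: abs(poses[j].t - s.t))
            tagged.append((s, poses[best]))
        return tagged

In practice, a matching step of this kind would run online during a scan so that each NDE record is stored together with the surface coordinates at which it was acquired, which is what enables the defect maps described above.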

Department(s)

Civil, Architectural and Environmental Engineering

Research Center/Lab(s)

INSPIRE - University Transportation Center

Sponsor(s)

Office of the Assistant Secretary for Research and Technology, U.S. Department of Transportation, 1200 New Jersey Avenue, SE, Washington, DC 20590

Comments

Principal Investigator: Jizhong Xiao, Ph.D.

Grant #: USDOT # 69A3551747126

Grant Period: 10/01/2022 - 09/30/2024

Project Period: 10/01/2022 - 06/30/2024

The investigation was conducted under the auspices of the INSPIRE University Transportation Center.

Keywords and Phrases

Wall-climbing robot, impact echo, neural network

Report Number

INSPIRE-018

Document Type

Technical Report

Document Version

Final Version

File Type

text

Language(s)

English

Rights

© 2025 Missouri University of Science and Technology. All rights reserved.

Publication Date

May 30, 2024
