Deep Semantic 3D Visual Metric Reconstruction Using Wall-Climbing Robot
Description
This project introduces an inspection method that uses a deep neural network to detect crack and spalling defects on concrete structures, carried out by a wall-climbing robot. First, we create a pixel-level semantic dataset that includes 820 labeled images. Second, we propose an inspection method that obtains 3D metric measurements by using RGB-D camera-based visual simultaneous localization and mapping (SLAM), which generates pose-coupled key-frames with depth information. The semantic inspection results can therefore be registered into the 3D model of the concrete structure for condition assessment and monitoring. Third, we present our new-generation wall-climbing robot, which performs the inspection task on both horizontal and vertical surfaces.
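As a rough illustration of the registration step (a sketch, not the project's actual implementation), the Python snippet below back-projects segmented defect pixels into the model frame using a key-frame's depth image, camera intrinsics, and SLAM pose. The function name, argument layout, and the assumption of a 4x4 camera-to-world pose T_wc are our own illustrative choices.

# Sketch: register per-pixel defect labels from a segmented key-frame into the
# 3D model frame, using the key-frame depth and pose from an RGB-D SLAM system.
import numpy as np

def defect_pixels_to_world(mask, depth, K, T_wc):
    """Back-project defect pixels (mask == 1) into the world/model frame.

    mask  : (H, W) uint8 semantic mask (1 = crack/spalling, 0 = background)
    depth : (H, W) float32 depth in meters, aligned with the mask
    K     : (3, 3) camera intrinsic matrix
    T_wc  : (4, 4) camera-to-world pose of the key-frame
    returns an (N, 3) array of 3D points in the world frame
    """
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]

    v, u = np.nonzero(mask)            # pixel coordinates of defect pixels
    z = depth[v, u]
    valid = z > 0                      # discard pixels with no depth reading
    u, v, z = u[valid], v[valid], z[valid]

    # Back-project to the camera frame with the pinhole model
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=1)  # homogeneous

    # Transform into the world (model) frame using the key-frame pose
    pts_world = (T_wc @ pts_cam.T).T[:, :3]
    return pts_world

The resulting points can then be fused into the reconstructed concrete-structure model, so that defect extents are measured in metric units rather than pixels.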
Presentation Date
14 Aug 2018, 10:00 am - 5:00 pm
Meeting Name
INSPIRE-UTC 2018 Annual Meeting
Department(s)
Civil, Architectural and Environmental Engineering
Document Type
Poster
Document Version
Final Version
File Type
text
Language(s)
English
Comments
This work is partially supported by the University Transportation Center on INSpecting and Preserving Infrastructure through Robotic Exploration (INSPIRE Center) under U.S. Federal Highway Administration (FHWA) grant FAIN 69A3551747126.