Forest fires are a serious ecological hazard, and smoke is an early indicator of forest fires. Smoke occupies only a tiny region of an image when it is captured in the early stages of a fire or when the smoke is far from the camera. Furthermore, smoke disperses unevenly, and the background environment is complex and changeable, leading to inconspicuous pixel-level features that complicate smoke detection. In this paper, we propose a detection method called multioriented detection based on a value conversion-attention mechanism module and mixed non-maximum suppression (MVMNet). First, a multioriented detection method is proposed. In contrast to traditional detection techniques, this method introduces an angle parameter into the data-loading process and computes the target's rotation angle with a classification-based prediction method, which provides a useful reference for determining the direction of the fire source. Then, to address the issue of inconsistent image input sizes while preserving more feature information, SoftPool spatial pyramid pooling (Soft-SPP) is proposed. Next, we construct a value conversion-attention mechanism module (VAM) based on a joint weighting strategy in the horizontal and vertical directions, which specifically extracts the colour and texture of smoke. Finally, a hybrid non-maximum suppression method combining DIoU-NMS and Skew-NMS is employed to address false and missed smoke detections. Experiments conducted on a self-built multioriented forest fire detection dataset show that, compared with traditional detection methods, our model reaches an mAP of 78.92%, an mAP50 of 88.05%, and 122 FPS.
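The hybrid suppression step combines DIoU-NMS (which penalises overlapping boxes whose centres are also close) with Skew-NMS for rotated boxes. As an illustration of the DIoU-NMS half only, here is a minimal pure-Python sketch; the function names `diou`/`diou_nms`, the axis-aligned `(x1, y1, x2, y2)` box format, and the 0.5 threshold are our assumptions for illustration, not details taken from the paper:

```python
import math

def diou(box_a, box_b):
    """Distance-IoU between two axis-aligned boxes (x1, y1, x2, y2)."""
    # Intersection area
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    iou = inter / (area_a + area_b - inter)
    # Squared distance between box centres
    cax, cay = (box_a[0] + box_a[2]) / 2, (box_a[1] + box_a[3]) / 2
    cbx, cby = (box_b[0] + box_b[2]) / 2, (box_b[1] + box_b[3]) / 2
    centre_dist2 = (cax - cbx) ** 2 + (cay - cby) ** 2
    # Squared diagonal of the smallest enclosing box
    ex1, ey1 = min(box_a[0], box_b[0]), min(box_a[1], box_b[1])
    ex2, ey2 = max(box_a[2], box_b[2]), max(box_a[3], box_b[3])
    diag2 = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2
    # DIoU subtracts the normalised centre distance from plain IoU,
    # so distant boxes are penalised less and survive suppression
    return iou - centre_dist2 / diag2

def diou_nms(boxes, scores, thresh=0.5):
    """Greedy NMS that suppresses a box when its DIoU with a kept box exceeds thresh."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        i = order.pop(0)          # highest-scoring remaining box
        keep.append(i)
        order = [j for j in order if diou(boxes[i], boxes[j]) <= thresh]
    return keep
```

For example, two heavily overlapping smoke boxes collapse to one detection, while a distant third box is retained: `diou_nms([(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)], [0.9, 0.8, 0.7])` keeps indices `[0, 2]`.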


Civil, Architectural and Environmental Engineering


National Natural Science Foundation of China, Grant kq2014160

Keywords and Phrases

Forest fire smoke detection; Mixed-NMS; Multioriented; Softpool-spatial pyramid pooling; Value conversion-attention mechanism module

International Standard Serial Number (ISSN)


Document Type

Article - Journal

Document Version


File Type





© 2023 Elsevier, All rights reserved.

Publication Date

06 Apr 2022