Image texture analysis enhances classification of fire extent and severity using Sentinel 1 and 2 satellite imagery

Rebecca Kate Gibson*, Anthea Mitchell, Hsing-Chung Chang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)
137 Downloads (Pure)

Abstract

Accurate and reliable mapping of fire extent and severity is critical for assessing the impact of fire on vegetation and informing post-fire recovery trajectories. Classification approaches that combine pixel-wise and neighbourhood statistics, including image texture derived from high-resolution satellite data, may improve on current methods of fire severity mapping. Texture is an innate property of all land cover surfaces that is known to vary between fire severity classes, becoming increasingly homogeneous as fire severity increases. In this study, we compared candidate backscatter and reflectance indices derived from Sentinel 1 and Sentinel 2, respectively, together with grey-level co-occurrence matrix (GLCM)-derived texture indices using a random forest supervised classification framework. Cross-validation models (for which the target fire was excluded from training) and target-trained models (for which the target fire was included in training) were compared to evaluate performance with and without texture indices. The results indicated that the addition of texture indices increased the classification accuracies of severity for both sensor types, with the greatest improvements in the high severity class (23.3%) for the Sentinel 1 target-trained model and the moderate severity class (17.4%) for the Sentinel 2 target-trained model. The target-trained models consistently outperformed the cross-validation models, especially for Sentinel 1, emphasising the importance of local training data in capturing post-fire variation in different forest types and severity classes. The Sentinel 2 models more accurately estimated fire extent and were improved with the addition of texture indices (3.2%). Optical sensor data yielded better results than C-band synthetic aperture radar (SAR) data with respect to distinguishing fire severity and extent. Successful detection using C-band data was linked to significant structural change in the canopy (i.e., partial to complete canopy consumption) and was more reliable over sparse, low-biomass forest. Future research will investigate the sensitivity of longer-wavelength (L-band) SAR to fire severity and the potential for an integrated fire-mapping system that incorporates both active and passive remote sensing to detect and monitor changes in vegetation cover and structure.
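The workflow outlined in the abstract, pixel-wise indices combined with GLCM texture features and fed to a random forest classifier, can be illustrated with a minimal sketch. This is not the authors' implementation: the window size (7 x 7), the 32 grey levels, the texture properties chosen, and the random placeholder index and severity labels are illustrative assumptions, and the sketch uses scikit-image's graycomatrix/graycoprops and scikit-learn's RandomForestClassifier in place of whatever software the study actually employed.

import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier

def glcm_texture(window, levels=32):
    # Quantise the window to a small number of grey levels before building the GLCM.
    lo, hi = window.min(), window.max()
    if hi == lo:
        q = np.zeros_like(window, dtype=np.uint8)
    else:
        q = ((window - lo) / (hi - lo) * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    # Average each texture property over the two offsets.
    return [graycoprops(glcm, p).mean()
            for p in ("homogeneity", "contrast", "correlation")]

def texture_layers(band, win=7):
    # Slide a win x win window over a single band; edge pixels keep zero texture.
    rows, cols = band.shape
    feats = np.zeros((rows, cols, 3), dtype=np.float32)
    half = win // 2
    for r in range(half, rows - half):
        for c in range(half, cols - half):
            feats[r, c] = glcm_texture(band[r - half:r + half + 1,
                                            c - half:c + half + 1])
    return feats

# Hypothetical inputs: one pre/post-fire change index (e.g. a dNBR-like optical
# index or a backscatter change ratio) and dummy severity labels for training.
index = np.random.rand(120, 120).astype(np.float32)
labels = np.random.randint(0, 4, (120, 120))          # 0 = unburnt ... 3 = high severity

predictors = np.dstack([index[..., None], texture_layers(index)])
X = predictors.reshape(-1, predictors.shape[-1])      # one row per pixel
y = labels.ravel()

clf = RandomForestClassifier(n_estimators=500, n_jobs=-1).fit(X, y)
severity_map = clf.predict(X).reshape(labels.shape)   # per-pixel severity classes

In the abstract's terms, a cross-validation model would withhold every pixel belonging to the target fire when fitting clf, whereas a target-trained model would include them; the comparison in the paper is between those two training setups, with and without the texture layers as predictors.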

Original language: English
Article number: 3512
Pages (from-to): 1-21
Number of pages: 21
Journal: Remote Sensing
Volume: 15
Issue number: 14
DOIs
Publication status: Published - 12 Jul 2023

Bibliographical note

Copyright the Author(s) 2023. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.

Keywords

  • Sentinel 1
  • Sentinel 2
  • fire extent
  • fire severity
  • grey-level co-occurrence matrix
  • optical
  • synthetic aperture radar
  • texture
