Remote Wildfire Detection using Multispectral Satellite Imagery and Vision Transformers
Proceedings of the 15th Asian Conference on Machine Learning, PMLR 222:1135-1150, 2024.
Abstract
Wildfires pose a significant and recurring challenge in North America, impacting both human and natural environments. The size and severity of wildfires in the region have increased in recent years, making them a pressing concern for communities, ecosystems, and the economy. Accurate and timely detection of active wildfires in remote areas is crucial for effective wildfire management and mitigation. In this paper, we propose a robust approach for detecting active wildfires in multispectral satellite imagery by leveraging vision transformers and a vast repository of Landsat-$8$ satellite data over North America at $30$\,m spatial resolution. Our methodology involves experimenting with vision transformers and deep convolutional neural networks for wildfire detection in multispectral satellite images, comparing the capabilities of these two architecture families. Furthermore, we propose a novel U-shaped vision transformer that effectively captures spatial dependencies and learns meaningful representations from multispectral images, enabling precise discrimination between wildfire and non-wildfire regions. To evaluate the performance of our approach, we conducted experiments on a comprehensive dataset of wildfire incidents. The results demonstrate the effectiveness of the proposed method in accurately detecting active wildfires, with a \textit{Dice score (F$1$)} of $90.05\%$ and a \textit{Recall} of $89.61\%$. Overall, our research presents a promising approach for leveraging vision transformers on multispectral satellite imagery to detect remote wildfires.
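The abstract reports a Dice score, which for binary segmentation masks is identical to the F$1$ score. A minimal sketch of that metric (not the authors' evaluation code; array shapes and the `eps` smoothing term are illustrative assumptions):

```python
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice coefficient between two binary masks.

    Dice = 2|P ∩ T| / (|P| + |T|), which equals F1 = 2PR / (P + R)
    when pred and target are binary. `eps` avoids division by zero
    for empty masks (an illustrative choice, not from the paper).
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return float((2.0 * intersection + eps) / (pred.sum() + target.sum() + eps))

# Toy 2x3 "fire" masks: 2 true positives, 1 false positive, 1 false negative.
pred = np.array([[1, 1, 0], [0, 1, 0]])
target = np.array([[1, 0, 0], [0, 1, 1]])
print(dice_score(pred, target))  # 2*2 / (3 + 3) ≈ 0.667
```

Here precision and recall are both $2/3$, so Dice/F$1$ is also $2/3$, illustrating the equivalence the paper relies on when reporting "Dice score or F$1$".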