Transmission-Guided Bayesian Generative Model for Smoke Segmentation
Keywords: Computer Vision (CV), Machine Learning (ML)
Abstract
Smoke segmentation is essential for precisely localizing wildfire so that it can be extinguished at an early stage. Although deep neural networks have achieved promising results on image segmentation tasks, they are prone to overconfidence in smoke segmentation due to smoke's non-rigid shape and transparent appearance. This is caused by both knowledge-level uncertainty, due to limited training data for accurate smoke segmentation, and labeling-level uncertainty, representing the difficulty of labeling ground truth. To effectively model the two types of uncertainty, we introduce a Bayesian generative model to simultaneously estimate the posterior distribution of model parameters and its predictions. Further, because smoke images suffer from low contrast and ambiguity, and inspired by physics-based image dehazing methods, we design a transmission-guided local coherence loss that guides the network to learn pair-wise relationships based on pixel distance and the transmission feature. To promote the development of this field, we also contribute a high-quality smoke segmentation dataset, SMOKE5K, consisting of 1,400 real and 4,000 synthetic images with pixel-wise annotations. Experimental results on benchmark testing datasets show that our model achieves both accurate predictions and reliable uncertainty maps representing the model's ignorance about its predictions. Our code and dataset are publicly available at: https://github.com/redlessme/Transmission-BVM.
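As a rough illustration of the kind of loss the abstract describes, the sketch below implements a pairwise coherence penalty that couples nearby pixels more strongly when their transmission values are similar, encouraging the segmentation to be smooth where the transmission map is smooth. This is a hypothetical reconstruction, not the authors' implementation: the function name, the Gaussian affinity kernels, and the parameters `radius`, `sigma_d`, and `sigma_t` are all assumptions; the exact form used in the paper may differ.

```python
import numpy as np

def local_coherence_loss(pred, transmission, radius=2, sigma_d=3.0, sigma_t=0.1):
    """Hypothetical sketch of a transmission-guided local coherence loss.

    Penalizes differences between the predictions of nearby pixel pairs,
    weighted by (a) spatial distance and (b) similarity of their
    transmission values. `pred` and `transmission` are HxW arrays in [0, 1].
    """
    H, W = pred.shape
    loss, count = 0.0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            # spatial affinity from pixel distance (assumed Gaussian kernel)
            w_d = np.exp(-(dy * dy + dx * dx) / (2.0 * sigma_d ** 2))
            # aligned views of each pixel and its (dy, dx) neighbour
            ys = slice(max(dy, 0), H + min(dy, 0))
            xs = slice(max(dx, 0), W + min(dx, 0))
            ys2 = slice(max(-dy, 0), H + min(-dy, 0))
            xs2 = slice(max(-dx, 0), W + min(-dx, 0))
            dt = transmission[ys, xs] - transmission[ys2, xs2]
            # transmission affinity: similar transmission -> stronger coupling
            w = w_d * np.exp(-dt ** 2 / (2.0 * sigma_t ** 2))
            dp = np.abs(pred[ys, xs] - pred[ys2, xs2])
            loss += float((w * dp).sum())
            count += w.size
    return loss / max(count, 1)
```

A uniform prediction incurs zero loss regardless of the transmission map, while prediction boundaries that cut across regions of similar transmission are penalized, which matches the stated goal of sharpening predictions in low-contrast, ambiguous smoke regions.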
How to Cite
Yan, S., Zhang, J., & Barnes, N. (2022). Transmission-Guided Bayesian Generative Model for Smoke Segmentation. Proceedings of the AAAI Conference on Artificial Intelligence, 36(3), 3009-3017. https://doi.org/10.1609/aaai.v36i3.20207
AAAI Technical Track on Computer Vision III