AI for Disaster Rapid Damage Assessment from Microblogs


  • Muhammad Imran Qatar Computing Research Institute, Doha, Qatar
  • Umair Qazi Qatar Computing Research Institute, Doha, Qatar
  • Ferda Ofli Qatar Computing Research Institute, Doha, Qatar
  • Steve Peterson Montgomery County Community Emergency Response Team, Maryland, USA
  • Firoj Alam Qatar Computing Research Institute, Doha, Qatar



Keywords

Social Media, Image Processing, Damage Assessment, Artificial Intelligence


Abstract

Formal response organizations perform rapid damage assessments after natural and human-induced disasters to measure the extent of damage to infrastructure such as roads, bridges, and buildings. This time-critical task, when performed using traditional approaches such as experts surveying the disaster areas, poses serious challenges and delays the response. This paper presents an AI-based system that leverages citizen science to collect damage images reported on social media and perform rapid damage assessment in real time. Several image processing models in the system tackle non-trivial challenges posed by social media as a data source, such as a high volume of redundant and irrelevant content. The system determines the severity of damage using a state-of-the-art computer vision model. Together with a response organization in the US, we deployed the system to identify damage reports during a major real-world disaster. We observe that almost 42% of the images are unique, 28% are relevant, and, more importantly, only 10% of them contain either mild or severe damage. Experts from our partner organization provided feedback on the system's mistakes, which we used to perform additional experiments to retrain the models. Consequently, the retrained models, based on expert feedback on the target-domain data, helped us achieve significant performance improvements.
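The abstract describes a staged pipeline: deduplicate incoming social media images, discard irrelevant ones, then keep only those showing mild or severe damage. The sketch below is a hypothetical illustration of that control flow, not the paper's implementation; the `Report` fields (`phash`, `is_relevant`, `severity`) stand in for outputs of the system's actual deep learning models.

```python
from dataclasses import dataclass

@dataclass
class Report:
    image_id: str
    phash: str         # perceptual hash of the image (stand-in for a dedup model)
    is_relevant: bool  # stand-in for a relevance classifier's decision
    severity: str      # "none" | "mild" | "severe", stand-in for the CV model

def assess(reports):
    """Run reports through dedup -> relevance -> severity filtering stages."""
    seen_hashes = set()
    damage_reports = []
    for r in reports:
        if r.phash in seen_hashes:                # stage 1: drop duplicates
            continue
        seen_hashes.add(r.phash)
        if not r.is_relevant:                     # stage 2: drop irrelevant images
            continue
        if r.severity in ("mild", "severe"):      # stage 3: keep damage reports
            damage_reports.append(r)
    return damage_reports

reports = [
    Report("a", "h1", True, "severe"),
    Report("b", "h1", True, "severe"),  # duplicate of "a"
    Report("c", "h2", False, "none"),   # irrelevant
    Report("d", "h3", True, "mild"),
]
print([r.image_id for r in assess(reports)])  # -> ['a', 'd']
```

In the deployed system each stage is a trained model rather than a precomputed flag, but the funnel structure is the same, which is why only a small fraction of the raw stream (10% in the paper's deployment) survives to the final damage-severity output.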




How to Cite

Imran, M., Qazi, U., Ofli, F., Peterson, S., & Alam, F. (2022). AI for Disaster Rapid Damage Assessment from Microblogs. Proceedings of the AAAI Conference on Artificial Intelligence, 36(11), 12517-12523.