Disaster Impacts Surveillance from Social Media with Topic Modeling and Feature Extraction: Case of Hurricane Harvey


Abstract

Twitter can supply emergency managers with useful information on infrastructure impacts during major disasters, but filtering through the many irrelevant tweets is time consuming. Previous studies have identified the types of messages found on social media during disasters, but few solutions have been proposed for efficiently extracting the useful ones. We present a framework that can be applied in a timely manner to provide disaster impact information sourced from social media. The framework is tested on the well-studied and data-rich case of Hurricane Harvey. The procedure consists of filtering the raw Twitter data by keywords, location, and tweet attributes, and then applying latent Dirichlet allocation (LDA) to separate the tweets from the disaster-affected area into categories (topics) useful to emergency managers. The LDA revealed that, of the 24 topics found in the data, nine were directly related to disaster impacts, for example outages, closures, flooded roads, and damaged infrastructure. Features such as frequent hashtags, mentions, URLs, and useful images were then extracted and analyzed. The relevant tweets, along with useful images, were correlated at the county level with flood depth, distributed disaster aid (damage), and population density. Significant correlations were found between the nine relevant topics and population density, but not with flood depth or damage, suggesting that more research into the suitability of social media data for disaster impact modeling is needed. The results of this study provide baseline information for such efforts in the future.
