Event summarization based on crowdsourced microblog data is a promising research area, and several researchers have recently focused on this field. However, previous works fail to characterize the fine-grained evolution of an event and the rich correlations among posts. The semantic associations among the multi-modal data in posts have also not been investigated as a means to enhance summarization performance. To address these issues, this study presents CrowdStory, which aims to characterize an event as a fine-grained, evolutionary, and correlation-rich storyline. A crowd-powered event model and a generic event storyline generation framework are first proposed, based on which a multi-clue–based approach to fine-grained event summarization is presented. The implicit human intelligence (HI) extracted from visual contents and community interactions is then used to identify inter-clue associations. Finally, a cross-media mining approach to selective visual story presentation is proposed. The experimental results indicate that, compared with state-of-the-art methods, CrowdStory enables fine-grained event summarization (e.g., dynamic evolution) and correctly identifies up to 60% of the strong correlations (e.g., causality) among clues. The cross-media approach achieves both diversity and relevancy in visual data selection.
To appear in Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. (IMWUT), Issue 3 / UbiComp '17.