Learning with fuzzy hypergraphs: A topical approach to query-oriented text summarization

Cited: 21
|
Authors
Van Lierde, Hadrien [1 ]
Chow, Tommy W. S. [1 ]
Affiliation
[1] City Univ Hong Kong, Dept Elect Engn, Kowloon Tong, 83 Tat Chee Av, Hong Kong, Peoples R China
Keywords
Automatic text summarization; Fuzzy graphs; Probabilistic topic models; Hierarchical Dirichlet process; Personalized PageRank; Submodular set functions; RANKING;
DOI
10.1016/j.ins.2019.05.020
Chinese Library Classification
TP [Automation technology; computer technology];
Discipline code
0812;
Abstract
Existing graph-based methods for extractive document summarization represent the sentences of a corpus as the nodes of a graph in which edges depict relationships of lexical similarity between sentences. This approach fails to capture semantic similarities between sentences that express similar information but have few words in common and are thus lexically dissimilar. To overcome this issue, we propose to extract semantic similarities based on topical representations of sentences. Inspired by the Hierarchical Dirichlet Process, we propose a topic model to infer the topic distributions of sentences. As each topic defines a semantic connection among sentences with a certain degree of membership for each sentence, we propose a fuzzy hypergraph model in which nodes are sentences and fuzzy hyperedges are topics. To produce an informative summary, we extract a set of sentences from the corpus by simultaneously maximizing their relevance to a user-defined query, their centrality in the fuzzy hypergraph, and their coverage of the topics present in the corpus. We formulate an algorithm building on the theory of submodular functions to solve the associated optimization problem. A thorough comparative analysis with other graph-based summarizers demonstrates the superiority of our method in terms of the content coverage of the summaries. (C) 2019 Elsevier Inc. All rights reserved.
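The sentence-selection step described in the abstract — jointly maximizing query relevance and topic coverage — can be approximated with a standard greedy algorithm for monotone submodular functions. The sketch below is an illustration only, not the authors' actual model: the corpus, topic assignments, relevance scores, and the `alpha` trade-off weight are all invented assumptions, and the centrality term from the fuzzy hypergraph is omitted for brevity.

```python
# Hedged sketch: greedy maximization of a monotone submodular objective
# for extractive summarization. Topic coverage is submodular and query
# relevance is modular, so greedy selection enjoys the classic
# (1 - 1/e) approximation guarantee (Nemhauser et al.).

def coverage_gain(selected, candidate, sentence_topics):
    """Marginal number of new topics the candidate sentence would cover."""
    covered = set().union(*(sentence_topics[i] for i in selected)) if selected else set()
    return len(sentence_topics[candidate] - covered)

def greedy_summary(sentence_topics, relevance, k, alpha=0.5):
    """Greedily pick k sentence indices maximizing relevance + alpha * coverage gain."""
    selected = []
    remaining = set(range(len(sentence_topics)))
    while len(selected) < k and remaining:
        best = max(
            remaining,
            key=lambda i: relevance[i]
            + alpha * coverage_gain(selected, i, sentence_topics),
        )
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy corpus (illustrative): each sentence has a query-relevance score
# and a set of topic labels it covers.
sentence_topics = [{0, 1}, {1}, {2, 3}, {0}]
relevance = [0.9, 0.8, 0.3, 0.2]

summary = greedy_summary(sentence_topics, relevance, k=2)
# Sentence 0 wins on relevance; sentence 2 is then preferred over the
# more relevant sentence 1 because it covers two still-uncovered topics.
```

In the paper the objective additionally includes a centrality term computed on the fuzzy hypergraph (via a Personalized PageRank-style ranking); the greedy scheme itself is unchanged when such a term is added, as long as the overall objective stays monotone submodular.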
Pages: 212-224
Number of pages: 13
Related papers
50 records in total
  • [31] Reinforcement learning for query-oriented routing indices in unstructured peer-to-peer networks
    Shi, Cong
    Meng, Shicong
    Liu, Yuanjie
    Han, Dingyi
    Yu, Yong
    SIXTH IEEE INTERNATIONAL CONFERENCE ON PEER-TO-PEER COMPUTING, PROCEEDINGS, 2006, : 267 - +
  • [32] RETRACTION: Query-oriented topical influential users detection for top-k trending topics in twitter
    Gomasta, Sarmistha Sarna
    Dhali, Aditi
    Anwar, Md Musfique
    Sarker, Iqbal H.
    APPLIED INTELLIGENCE, 2025, 55 (02)
  • [33] Sentiment-oriented query-focused text summarization addressed with a multi-objective optimization approach
    Sanchez-Gomez, Jesus M.
    Vega-Rodriguez, Miguel A.
    Perez, Carlos J.
    APPLIED SOFT COMPUTING, 2021, 113
  • [34] ANALYSING FUZZY BASED APPROACH FOR EXTRACTIVE TEXT SUMMARIZATION
    Sharaff, Aakanksha
    Khaire, Amit Siddharth
    Sharma, Dimple
    PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTING AND CONTROL SYSTEMS (ICCS), 2019, : 906 - 910
  • [35] Query-focused multi-document text summarization using fuzzy inference
    Agarwal, Raksha
    Chatterjee, Niladri
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2022, 42 (05) : 4641 - 4652
  • [36] Fuzzy evolutionary cellular learning automata model for text summarization
    Abbasi-ghalehtaki, Razieh
    Khotanlou, Hassan
    Esmaeilpour, Mansour
    SWARM AND EVOLUTIONARY COMPUTATION, 2016, 30 : 11 - 26
  • [37] Optimizing Persian Text Summarization Based on Fuzzy Logic Approach
    Kiyoumarsi, Farshad
    Esfahani, Fariba Rahimi
    COMPUTER COMMUNICATION AND MANAGEMENT, 2011, 5 : 264 - 269
  • [38] Automatic text summarization using a machine learning approach
    Neto, JL
    Freitas, AA
    Kaestner, CAA
    ADVANCES IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2002, 2507 : 205 - 215
  • [39] Extractive text summarization using deep learning approach
    Yadav A.K.
    Singh A.
    Dhiman M.
    Vineet
    Kaundal R.
    Verma A.
    Yadav D.
    International Journal of Information Technology, 2022, 14 (5) : 2407 - 2415
  • [40] Arabic text summarization using deep learning approach
    Al-Maleh, Molham
    Desouki, Said
    JOURNAL OF BIG DATA, 2020, 7 (01)