Spatio-Temporal Frequent Itemset Mining on Web Data

Cited by: 6
Authors
Aggarwal, Apeksha [1 ]
Toshniwal, Durga [1 ]
Affiliation
[1] Indian Inst Technol Roorkee, Dept CSE, Roorkee, Uttar Pradesh, India
Source
2018 18TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW), 2018
Keywords
Spatio-temporal; frequent pattern; association rule; time; location; ASSOCIATION RULES;
DOI
10.1109/ICDMW.2018.00166
Chinese Library Classification: TP [Automation technology; computer technology]
Discipline code: 0812
Abstract
The Web generates enormous volumes of spatio-temporal data every second. Such data includes transactional data on which association rule mining can be performed, with applications in fraud detection, consumer purchase pattern identification, recommendation systems, and more. Spatio-temporal information matters alongside the transactional data because the association rules, or frequent patterns, in the transactions are strongly determined by the location and time at which each transaction occurs. For example, a customer's purchase of a product depends on the season and the location of buying that product. To extract frequent patterns from such large databases, most existing algorithms demand enormous amounts of resources. The present work proposes a spatio-temporal association rule mining algorithm that uses hashing to reduce memory access time and storage space. A hash-based search technique speeds up memory access by directly retrieving the required spatio-temporal information from the schema. Numerous hash-based search techniques could be used, but to reduce collisions this work focuses primarily on direct-address hashing; in future work we plan to extend our results to other search techniques. Our results are compared with the existing Spatio-Temporal Apriori algorithm, one of the established association rule mining algorithms. Furthermore, experiments are conducted on several synthetically generated and Web-based datasets, followed by a comparison across these datasets. Our algorithm shows improved results when evaluated on several metrics, such as the support of frequent itemsets and the percentage gain in reduced memory access time. In the future we plan to extend this work to various benchmark datasets.
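The idea the abstract describes can be illustrated with a minimal Python sketch, under stated assumptions: each transaction is hashed by its (location, time) key, with an ordinary dict standing in for the direct-address table, and Apriori-style frequent itemset counting then runs only within the relevant spatio-temporal bucket. All names and data shapes below are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict

def spatio_temporal_frequent_itemsets(transactions, min_support):
    """transactions: iterable of (location, time, items) tuples.

    Returns {(location, time): {frozenset(itemset): support_count}}
    for every itemset meeting min_support within its bucket.
    NOTE: a hypothetical sketch; the dict below stands in for the
    direct-address hash table described in the paper.
    """
    # Direct-address step: hash each transaction into its
    # (location, time) bucket so later counting passes touch
    # only one spatio-temporal partition at a time.
    buckets = defaultdict(list)
    for loc, t, items in transactions:
        buckets[(loc, t)].append(frozenset(items))

    results = {}
    for key, txns in buckets.items():
        # Apriori within the bucket: count 1-itemsets first.
        counts = defaultdict(int)
        for txn in txns:
            for item in txn:
                counts[frozenset([item])] += 1
        frequent = {s: c for s, c in counts.items() if c >= min_support}
        all_frequent = dict(frequent)
        k = 2
        while frequent:
            # Candidate generation: join frequent (k-1)-itemsets
            # whose union has exactly k items.
            prev = list(frequent)
            candidates = set()
            for i in range(len(prev)):
                for j in range(i + 1, len(prev)):
                    union = prev[i] | prev[j]
                    if len(union) == k:
                        candidates.add(union)
            # Support counting pass over this bucket only.
            counts = defaultdict(int)
            for txn in txns:
                for cand in candidates:
                    if cand <= txn:  # candidate contained in transaction
                        counts[cand] += 1
            frequent = {s: c for s, c in counts.items() if c >= min_support}
            all_frequent.update(frequent)
            k += 1
        results[key] = all_frequent
    return results
```

In this sketch the (location, time) key plays the role of the direct address: lookup is a single hash probe with no collisions across distinct spatio-temporal cells, which is the property the paper exploits to cut memory access time relative to scanning the full transaction table.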
Pages: 1160-1165
Page count: 6