A unified active learning framework for annotating graph data for regression task

Cited by: 1
Authors
Samoaa, Peter [1 ]
Aronsson, Linus [1 ]
Longa, Antonio [2 ]
Leitner, Philipp [3 ]
Chehreghani, Morteza Haghir [1 ]
Affiliations
[1] Chalmers Univ Technol, Data Sci & AI, Gothenburg, Sweden
[2] Univ Trento, Trento, Italy
[3] Chalmers Univ Technol, Interact Design & Software Engn, Gothenburg, Sweden
Funding
Swedish Research Council;
Keywords
Graph neural networks (GNNs); Active learning; Graph-level regression; NETWORKS;
D O I
10.1016/j.engappai.2024.109383
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Subject classification code
0812;
Abstract
In many domains, effectively applying machine learning models requires a large amount of annotated, labelled data, which might not be available in advance. Acquiring annotations often requires significant time, effort, and computational resources, which makes it challenging. Active learning strategies are pivotal in addressing these challenges, particularly for diverse data types such as graphs. Although active learning has been extensively studied for node-level classification, its application to graph-level learning, especially for regression tasks, remains under-explored. We develop a unified active learning framework specialized in graph annotation and graph-level learning for regression tasks, on both standard and expanded graphs, which are more detailed representations. We begin with graph collection and construction. Then, we embed the graphs into a latent space using various unsupervised and supervised embedding techniques. Given such an embedding, the framework becomes task-agnostic, and active learning can be performed using any regression method together with any query strategy suited for regression. Within this framework, we investigate the impact of using different levels of information for active and passive learning, e.g., partially available labels and unlabelled test data. Although the framework is domain-agnostic, we validate it on a real-world application of software performance prediction, where the execution time of source code is predicted; each graph is therefore an intermediate representation of the source code. We support our methodology with a real-world dataset to underscore the applicability of our approach. Our experiments reveal that satisfactory performance can be achieved by querying labels for only a small subset of all the data. A key finding is that Graph2Vec (an unsupervised embedding approach for graph data) performs best, but only when all train and test features are used. Graph Neural Networks (GNNs), however, are the most flexible embedding technique across different levels of information, with and without label access. In addition, we find that the benefit of active learning increases for larger datasets (more graphs) and for more complex graphs, which is arguably when active learning matters most.
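The abstract describes a two-stage pipeline: graphs are first embedded into a latent space (with Graph2Vec or a GNN, for example), after which any regression model and any regression-oriented query strategy can drive the annotation loop. The Python sketch below illustrates this idea only under simplifying assumptions and is not the authors' implementation: a degree-histogram embedding stands in for the Graph2Vec/GNN embeddings, a Gaussian-process regressor supplies the predictive uncertainty used by the query strategy, synthetic random graphs with an average-degree target stand in for source-code graphs annotated with execution times, and the helper embed_graphs is hypothetical.

import numpy as np
import networkx as nx
from sklearn.gaussian_process import GaussianProcessRegressor

def embed_graphs(graphs, dim=16):
    # Stand-in unsupervised embedding: normalised degree histograms.
    feats = []
    for g in graphs:
        hist = np.bincount([d for _, d in g.degree()], minlength=dim)[:dim]
        feats.append(hist / g.number_of_nodes())
    return np.asarray(feats, dtype=float)

# Synthetic pool of graphs with a numeric target (average degree here,
# standing in for the execution time measured in the real application).
rng = np.random.default_rng(0)
graphs = [nx.gnp_random_graph(int(rng.integers(10, 60)), 0.15, seed=i)
          for i in range(200)]
y = np.array([2.0 * g.number_of_edges() / g.number_of_nodes() for g in graphs])
X = embed_graphs(graphs)

# Active-learning loop: start from a few labelled graphs and repeatedly
# query the unlabelled graph whose prediction is most uncertain.
labelled = list(range(5))
pool = list(range(5, len(graphs)))
model = GaussianProcessRegressor()
for _ in range(20):                                   # annotation budget
    model.fit(X[labelled], y[labelled])
    _, std = model.predict(X[pool], return_std=True)  # uncertainty per candidate
    labelled.append(pool.pop(int(np.argmax(std))))    # "annotate" the chosen graph

model.fit(X[labelled], y[labelled])
mae = np.mean(np.abs(model.predict(X[pool]) - y[pool]))
print(f"Mean absolute error on the remaining pool: {mae:.3f}")

In the paper's setting, the same loop would run on embeddings computed from the training graphs alone or from the training and unlabelled test graphs together, which corresponds to the different levels of information the abstract compares.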
Pages: 25
Related papers
50 records in total
[31]   Annotating Panic in Social Media using Active Learning, Transformers and Domain Knowledge [J].
Mitrovic, Sandra ;
Frisone, Fabio ;
Gupta, Suryam ;
Lucifora, Chiara ;
Carapic, Dragana ;
Schillaci, Carlo ;
Di Giovanni, Samuele ;
Singh, Ayushi .
2023 23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW 2023, 2023, :1269-1278
[32]   Graph Active Learning at Subgraph Granularity [J].
Cao, Yunqi ;
Wang, Ziming ;
Chen, Haopeng .
2023 IEEE 35TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE, ICTAI, 2023, :578-585
[33]   A Unified Approach on Active Learning Dual Supervision [J].
Chriswanto, Adrian ;
Pao, Hsing-Kuo ;
Lee, Yuh-Jye .
2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
[34]   Affect Estimation in 3D Space Using Multi-Task Active Learning for Regression [J].
Wu, Dongrui ;
Huang, Jian .
IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2022, 13 (01) :16-27
[35]   A Novel Active Learning Regression Framework for Balancing the Exploration-Exploitation Trade-Off [J].
Elreedy, Dina ;
Atiya, Amir F. ;
Shaheen, Samir I. .
ENTROPY, 2019, 21 (07)
[36]   Active Learning for Imbalanced Ordinal Regression [J].
Ge, Jiaming ;
Chen, Haiyan ;
Zhang, Dongfang ;
Hou, Xiaye ;
Yuan, Ligang .
IEEE ACCESS, 2020, 8 :180608-180617
[38]   Active learning for logistic regression: an evaluation [J].
Schein, Andrew I. ;
Ungar, Lyle H. .
MACHINE LEARNING, 2007, 68 (03) :235-265
[39]   An Active Learning Approach to Task Adaptation [J].
Wu, Ji ;
He, Zhiyang ;
Lv, Ping .
12TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2011 (INTERSPEECH 2011), VOLS 1-5, 2011, :2608-2611
[40]   GANDALF: Graph-based transformer and Data Augmentation Active Learning Framework with interpretable features for multi-label chest Xray classification [J].
Mahapatra, Dwarikanath ;
Bozorgtabar, Behzad ;
Ge, Zongyuan ;
Reyes, Mauricio .
MEDICAL IMAGE ANALYSIS, 2024, 93