RelKD 2023: International Workshop on Resource-Efficient Learning for Knowledge Discovery

Cited by: 0
Authors
Zhang, Chuxu [1 ]
Xu, Dongkuan [2 ]
Javaheripi, Mojan [3 ]
Mukherjee, Subhabrata [3 ]
Wu, Lingfei [4 ]
Xia, Yinglong [5 ]
Li, Jundong [6 ]
Jiang, Meng
Wang, Yanzhi [7 ]
Affiliations
[1] Brandeis Univ, Waltham, MA 02254 USA
[2] North Carolina State Univ, Raleigh, NC USA
[3] Microsoft Res, Seattle, WA USA
[4] Pinterest, Seattle, WA USA
[5] Meta AI, Seattle, WA USA
[6] Univ Virginia, Charlottesville, VA USA
[7] Northeastern Univ, Boston, MA 02115 USA
Source
Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2023) | 2023
DOI: 10.1145/3580305.3599228
CLC Number: TP [Automation Technology, Computer Technology]
Discipline Code: 0812
Abstract
Modern machine learning techniques, especially deep neural networks, have demonstrated excellent performance in various knowledge discovery and data mining applications. However, the development of many of these techniques still faces resource-constraint challenges in many scenarios, such as limited labeled data (data level), small model size requirements on real-world computing platforms (model level), and efficient mapping of computations to heterogeneous target hardware (system level). Addressing all of these challenges is critical for the effective and efficient use of the developed models in a wide variety of real systems, such as large-scale social network analysis, large-scale recommendation systems, and real-time anomaly detection. It is therefore desirable to develop efficient learning techniques that tackle resource limitations from the data, model/algorithm, and/or system/hardware perspectives. The international workshop "Resource-Efficient Learning for Knowledge Discovery (RelKD 2023)" will provide a venue for academic researchers and industrial practitioners to share challenges, solutions, and future opportunities in resource-efficient learning.
Pages: 5901-5902
Page count: 2