Towards Nonparametric Topological Layers in Neural Networks

Citations: 0
Authors
Shen, Gefei [1 ]
Zhao, Dongfang [1 ]
Affiliations
[1] Univ Washington, Seattle, WA 98195 USA
Source
ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PT III, PAKDD 2024 | 2024, Vol. 14647
Funding
US National Science Foundation;
Keywords
Applied topology; Nonparametric learning; Neural networks;
DOI
10.1007/978-981-97-2259-4_7
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Various topological techniques and tools have been applied to neural networks in terms of network complexity, explainability, and performance. One fundamental assumption of this line of research is the existence of a global (Euclidean) coordinate system upon which the topological layer is constructed. Despite promising results, such a topologization method has yet to be widely adopted because the parametrization of a topologization layer takes a considerable amount of time and lacks a theoretical foundation, leading to suboptimal performance and limited explainability. This paper proposes a learnable topological layer for neural networks without requiring a Euclidean space. Instead, the proposed construction relies on a general metric space, specifically a Hilbert space that defines an inner product. As a result, the parametrization of the proposed topological layer is free of user-specified hyperparameters, eliminating the costly parametrization stage and the corresponding risk of suboptimal networks. Experimental results on three popular datasets demonstrate the effectiveness of the proposed approach.
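This record contains no implementation details, so the following is only a hedged illustration of the abstract's core idea: a topological summary computed purely from inner-product-induced distances, with no user-chosen coordinate system or hyperparameters. The function names and the MST-based 0-dimensional persistence computation below are illustrative assumptions, not the authors' actual layer.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def inner_product_distances(X):
    """Metric induced by the inner product: d(x, y)^2 = <x,x> - 2<x,y> + <y,y>.

    Only the Gram matrix of pairwise inner products is used, so any Hilbert
    space embedding works; no global Euclidean coordinates are assumed.
    """
    G = X @ X.T                              # Gram matrix of inner products
    sq = np.diag(G)
    D2 = sq[:, None] - 2.0 * G + sq[None, :]
    return np.sqrt(np.maximum(D2, 0.0))      # clamp tiny negatives from rounding

def h0_persistence_deaths(D):
    """0-dimensional persistence from a distance matrix.

    Every point is a component born at filtration value 0; each edge of the
    minimum spanning tree records the scale at which two components merge,
    i.e. a death time. This is a standard hyperparameter-free H0 computation,
    assumed here purely for illustration.
    """
    mst = minimum_spanning_tree(D)           # n-1 merge edges for n points
    return np.sort(mst.data)

# Toy example: two tight clusters should yield one large death (the bridge
# between clusters) and many small intra-cluster deaths.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (5, 3)),
               rng.normal(5.0, 0.1, (5, 3))])
deaths = h0_persistence_deaths(inner_product_distances(X))
```

In a differentiable layer one would feed such death values (or a vectorization of them) forward; the sketch above only shows how the summary can be driven by the inner product alone.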
Pages: 91-102
Page count: 12