On-manifold projected gradient descent

Cited by: 0
Authors
Mahler, Aaron [1 ]
Berry, Tyrus [2 ]
Stephens, Tom [1 ]
Antil, Harbir [2 ]
Merritt, Michael [1 ]
Schreiber, Jeanie [2 ]
Kevrekidis, Ioannis [3 ]
Affiliations
[1] Teledyne Sci & Imaging LLC, Durham, NC 27713 USA
[2] George Mason Univ, Ctr Math & Artificial Intelligence, Fairfax, VA 22030 USA
[3] Johns Hopkins Univ, Chem & Biomol Engn & Appl Math & Stat, Baltimore, MD USA
Source
FRONTIERS IN COMPUTER SCIENCE | 2024 / Vol. 6
Keywords
diffusion maps; kernel methods; manifold learning; Nystrom approximation; adversarial attack; image classification; projected gradient descent;
DOI
10.3389/fcomp.2024.1274181
CLC classification
TP39 [Computer applications];
Discipline codes
081203; 0835;
Abstract
This study provides a computable, direct, and mathematically rigorous approximation to the differential geometry of class manifolds for high-dimensional data, along with non-linear projections from input space onto these class manifolds. The tools are applied to the setting of neural network image classifiers, where we generate novel, on-manifold data samples and implement a projected gradient descent algorithm for on-manifold adversarial training. The susceptibility of neural networks (NNs) to adversarial attack highlights the brittle nature of NN decision boundaries in input space. Introducing adversarial examples during training has been shown to reduce the susceptibility of NNs to adversarial attack; however, it has also been shown to reduce the accuracy of the classifier if the examples are not valid examples for that class. Realistic "on-manifold" examples have previously been generated from class manifolds in the latent space of an autoencoder. Our study explores these phenomena in a geometric and computational setting that is much closer to the raw, high-dimensional input space than what a VAE or other black-box dimensionality reductions can provide. We employ conformally invariant diffusion maps (CIDM) to approximate class manifolds in diffusion coordinates and develop the Nystrom projection to project novel points onto class manifolds in this setting. On top of the manifold approximation, we leverage the spectral exterior calculus (SEC) to determine geometric quantities such as tangent vectors of the manifold. We use these tools to obtain adversarial examples that reside on a class manifold yet fool a classifier. These misclassifications then become explainable in terms of human-understandable manipulations within the data, by expressing the on-manifold adversary in the semantic basis on the manifold.
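For intuition, the sketch below illustrates the kind of loop the abstract describes: embed class samples with a kernel-based diffusion map, use a Nystrom-style out-of-sample extension to place a perturbed point in diffusion coordinates, pull it back toward the class manifold with a simple kernel-regression preimage, and alternate that projection with gradient steps on a classifier loss. This is a minimal, hypothetical NumPy sketch using plain Gaussian-kernel diffusion maps, not the authors' CIDM/SEC construction; all names and parameters (grad_loss, eps, n_coords, step, n_steps) are illustrative assumptions.

import numpy as np

def diffusion_coords(X, eps, n_coords):
    # Diffusion-map coordinates of class data X (n_samples x dim), using a
    # plain Gaussian kernel with bandwidth eps (alpha = 1 normalization).
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-D2 / eps)
    q = K.sum(axis=1)                         # kernel density estimate
    A = K / np.outer(q, q)                    # remove sampling-density effects
    P = A / A.sum(axis=1, keepdims=True)      # row-stochastic diffusion operator
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)[1:n_coords + 1]   # skip the trivial eigenvector
    return vecs[:, order].real, vals[order].real, q

def nystrom_extend(x, X, eps, vecs, vals, q):
    # Nystrom-style out-of-sample extension of a new point x into diffusion coordinates.
    k = np.exp(-np.sum((X - x) ** 2, axis=1) / eps)
    a = k / (k.sum() * q)                     # mirror the training normalization
    p = a / a.sum()
    return (p @ vecs) / vals

def project_to_manifold(x, X, eps, vecs, vals, q):
    # Crude preimage: kernel regression of input coordinates over diffusion coordinates.
    phi_x = nystrom_extend(x, X, eps, vecs, vals, q)
    d2 = np.sum((vecs - phi_x) ** 2, axis=1)
    w = np.exp(-d2 / (np.median(d2) + 1e-12))
    return (w @ X) / w.sum()

def on_manifold_pgd(x0, grad_loss, X, eps=1.0, n_coords=10, step=0.1, n_steps=20):
    # Alternate a gradient ascent step on the classifier loss with a pull-back
    # toward the class manifold, so the adversary stays near on-manifold data.
    vecs, vals, q = diffusion_coords(X, eps, n_coords)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x + step * grad_loss(x)                       # adversarial ascent step
        x = project_to_manifold(x, X, eps, vecs, vals, q) # on-manifold projection
    return x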
Pages: 17