Neural networks based on power method and inverse power method for solving linear eigenvalue problems

Cited: 6
Authors
Yang, Qihong [1 ]
Deng, Yangtao [1 ]
Yang, Yu [1 ]
He, Qiaolin [1 ]
Zhang, Shiquan [1 ]
Affiliations
[1] Sichuan Univ, Sch Math, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Power method; Inverse power method; Loss function; Neural network; Linear eigenvalue problem; Partial differential equation; ALGORITHM;
DOI
10.1016/j.camwa.2023.07.013
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
In this article, we propose two kinds of neural networks, inspired by the power method and the inverse power method, for solving linear eigenvalue problems. These networks share the core ideas of the traditional methods, with the differential operator realized by automatic differentiation. The eigenfunction of the eigenvalue problem is represented by a neural network, and the iterative algorithms are implemented by optimizing a specially defined loss function. The largest positive eigenvalue, the smallest eigenvalue, and, given suitable prior knowledge, interior eigenvalues can all be computed efficiently. We examine the applicability and accuracy of our methods in numerical experiments in one, two, and higher dimensions. Numerical results show that our methods yield accurate approximations of both eigenvalues and eigenfunctions.
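To make the underlying iteration concrete, the sketch below runs classical inverse power iteration on a finite-difference discretization of the 1-D Dirichlet Laplacian -u'' = λu on (0, 1). The paper's approach replaces the grid vector below with a neural-network ansatz and the linear solve with a loss-minimization step; this plain NumPy version (all names illustrative, not from the paper) only demonstrates the traditional method the networks are modeled on.

```python
import numpy as np

# Discretize -u'' = lambda * u on (0, 1) with homogeneous Dirichlet
# boundary conditions using centered second-order finite differences.
n = 200                      # number of interior grid points
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

# Inverse power iteration: repeatedly apply A^{-1} and normalize,
# which converges to the eigenvector of the smallest eigenvalue.
rng = np.random.default_rng(0)
u = rng.random(n)            # random (positive) initial guess
u /= np.linalg.norm(u)
for _ in range(50):
    u = np.linalg.solve(A, u)
    u /= np.linalg.norm(u)

lam = u @ (A @ u)            # Rayleigh quotient estimate of lambda_min
# The smallest eigenvalue of the continuous problem is pi^2 ~ 9.8696.
```

Swapping `np.linalg.solve(A, u)` for `A @ u` gives the power-method variant, which instead converges to the largest eigenvalue of the discrete operator.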
Pages: 14-24
Page count: 11