Sparse Harmonic Transforms: A New Class of Sublinear-Time Algorithms for Learning Functions of Many Variables

Cited by: 12
Authors
Choi, Bosu [1]
Iwen, Mark A. [2,3]
Krahmer, Felix [4]
Affiliations
[1] UT Austin, Oden Inst Computat Engn & Sci, Austin, TX 78712 USA
[2] Michigan State Univ, Dept Math, E Lansing, MI 48824 USA
[3] Michigan State Univ, Dept Computat Math Sci & Engn CMSE, E Lansing, MI 48824 USA
[4] Tech Univ Munich, Dept Math, Munich, Germany
Keywords
High-dimensional function approximation; Sublinear-time algorithms; Function learning; Sparse approximation; Compressive sensing; SIGNAL RECOVERY; APPROXIMATION; PURSUIT;
DOI
10.1007/s10208-020-09462-z
Chinese Library Classification (CLC)
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
In this paper we develop fast and memory-efficient numerical methods for learning functions of many variables that admit sparse representations in terms of general bounded orthonormal tensor product bases. Such functions appear in many applications including, e.g., various Uncertainty Quantification (UQ) problems involving the solution of parametric PDEs that are approximately sparse in Chebyshev or Legendre product bases (Chkifa et al., Polynomial approximation via compressed sensing of high-dimensional functions on lower sets, 2016; Rauhut and Schwab, Math Comput 86(304):661-700, 2017). We expect that our results provide a starting point for a new line of research on sublinear-time solution techniques for UQ applications of the type above, which will eventually be able to scale to significantly higher-dimensional problems than are currently computationally feasible. More concretely, let $\mathcal{B}$ be a finite Bounded Orthonormal Product Basis (BOPB) of cardinality $|\mathcal{B}| = N$. Herein we develop methods that rapidly approximate any function $f$ that is sparse in the BOPB, that is, $f: \mathcal{D} \subset \mathbb{R}^D \to \mathbb{C}$ of the form $f(\mathbf{x}) = \sum_{b \in \mathcal{S}} c_b \cdot b(\mathbf{x})$ with $\mathcal{S} \subset \mathcal{B}$ of cardinality $|\mathcal{S}| = s \ll N$. Our method adapts the CoSaMP algorithm (Needell and Tropp, Appl Comput Harmon Anal 26(3):301-321, 2009) to use additional function samples from $f$ along a randomly constructed grid $\mathcal{G} \subset \mathbb{R}^D$ with universal approximation properties in order to rapidly identify the multi-indices of the most dominant basis functions in $\mathcal{S}$ component by component during each CoSaMP iteration. It has a runtime of just $(s \log N)^{O(1)}$, uses only $(s \log N)^{O(1)}$ function evaluations on the fixed and nonadaptive grid $\mathcal{G}$, and requires no more than $(s \log N)^{O(1)}$ bits of memory. We emphasize that nothing about $\mathcal{S}$ or any of the coefficients $c_b \in \mathbb{C}$ is assumed in advance other than that $\mathcal{S} \subset \mathcal{B}$ has $|\mathcal{S}| \le s$. Both $\mathcal{S}$ and its related coefficients $c_b$ will be learned from the given function evaluations by the developed method. For $s \ll N$, the runtime $(s \log N)^{O(1)}$ will be less than what is required to simply enumerate the elements of the basis $\mathcal{B}$; thus our method is the first approach applicable in a general BOPB framework that falls into the class referred to as sublinear-time. This, together with the similarly reduced sample and memory requirements, sets our algorithm apart from previous works based on standard compressive sensing algorithms such as basis pursuit, which typically store and utilize full intermediate basis representations of size $\Omega(N)$ during the solution process.
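The abstract builds on the CoSaMP algorithm of Needell and Tropp, so a minimal point of reference may help. The sketch below is not the paper's sublinear-time method: it is plain CoSaMP applied to a toy function that is $s$-sparse in a one-dimensional Chebyshev basis (the simplest instance of a BOPB, with $D = 1$), and it explicitly forms the full $m \times N$ sampling matrix, which is exactly the $\Omega(N)$-sized intermediate object the paper's component-by-component identification scheme is designed to avoid. The names and toy sizes (`cosamp`, `N`, `s`, `m`, the Chebyshev-measure sampling) are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def cosamp(A, y, s, max_iter=20, tol=1e-8):
    """Plain CoSaMP (Needell & Tropp, 2009): recover an s-sparse x with y ~= A @ x."""
    m, N = A.shape
    x = np.zeros(N)
    residual = y.copy()
    for _ in range(max_iter):
        proxy = A.T @ residual                      # correlate residual with all columns
        omega = np.argsort(np.abs(proxy))[-2 * s:]  # indices of the 2s largest proxy entries
        support = np.union1d(omega, np.flatnonzero(x)).astype(int)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        b = np.zeros(N)
        b[support] = coef
        keep = np.argsort(np.abs(b))[-s:]           # prune back to the s largest terms
        x = np.zeros(N)
        x[keep] = b[keep]
        residual = y - A @ x
        if np.linalg.norm(residual) <= tol * np.linalg.norm(y):
            break
    return x

# Toy instance: f is s-sparse in a 1-D Chebyshev basis of size N (D = 1).
rng = np.random.default_rng(0)
N, s, m = 256, 5, 128
true_support = rng.choice(N, size=s, replace=False)
c_true = np.zeros(N)
c_true[true_support] = rng.standard_normal(s)

def f(t):
    # f(t) = sum_{k in S} c_k T_k(t), i.e., sparse in the Chebyshev basis.
    return C.chebval(t, c_true)

# Sample points drawn from the Chebyshev (arcsine) measure on [-1, 1], and the
# dense m x N matrix of basis evaluations T_k(t_j) -- the O(N)-sized object
# a sublinear-time method never builds.
t = np.cos(np.pi * rng.uniform(size=m))
A = C.chebvander(t, N - 1)

x_hat = cosamp(A, f(t), s)
print("true support:     ", np.sort(true_support))
print("recovered support:", np.sort(np.flatnonzero(x_hat)))
print("max coefficient error:", np.max(np.abs(x_hat - c_true)))
```

In the paper's setting, the identification step is instead carried out component by component over the random grid $\mathcal{G}$ described in the abstract, so that no length-$N$ intermediate vector needs to be formed or stored.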
Pages: 275-329
Page count: 55
Related Papers
3 records in total
  • [1] Sparse Harmonic Transforms: A New Class of Sublinear-Time Algorithms for Learning Functions of Many Variables
    Choi, Bosu
    Iwen, Mark A.
    Krahmer, Felix
    Foundations of Computational Mathematics, 2021, 21: 275-329
  • [2] Sparse harmonic transforms II: best s-term approximation guarantees for bounded orthonormal product bases in sublinear-time
    Choi, Bosu
    Iwen, Mark
    Volkmer, Toni
    Numerische Mathematik, 2021, 148 (02): 293-362
  • [3] Sparse Fourier transforms on rank-1 lattices for the rapid and low-memory approximation of functions of many variables
    Gross, Craig
    Iwen, Mark
    Kammerer, Lutz
    Volkmer, Toni
    Sampling Theory, Signal Processing, and Data Analysis, 2022, 20 (01)