Towards Continuous Benchmarking: An Automated Performance Evaluation Framework for High Performance Software

Cited by: 5
Authors
Anzt, Hartwig [1 ,2 ]
Chen, Yen-Chen [3 ]
Cojean, Terry [1 ]
Dongarra, Jack [2 ,4 ,5 ]
Flegar, Goran [6 ]
Nayak, Pratik [1 ]
Quintana-Orti, Enrique S. [7 ]
Tsai, Yuhsiang M. [3 ]
Wang, Weichung [3 ]
Affiliations
[1] Karlsruhe Inst Technol, Karlsruhe, Germany
[2] Univ Tennessee, Knoxville, TN 37996 USA
[3] Taiwan Natl Univ, Taipei, Taiwan
[4] Oak Ridge Natl Lab, Oak Ridge, TN USA
[5] Univ Manchester, Manchester, Lancs, England
[6] Univ Jaime I, Castellon de La Plana, Castello, Spain
[7] Univ Politecn Valencia, Valencia, Spain
Source
PROCEEDINGS OF THE PLATFORM FOR ADVANCED SCIENTIFIC COMPUTING CONFERENCE (PASC '19) | 2019
Keywords
interactive performance visualization; automated performance benchmarking; continuous integration; healthy software lifecycle;
DOI
10.1145/3324989.3325719
Chinese Library Classification
TP39 [Computer applications];
Discipline classification codes
081203 ; 0835 ;
Abstract
We present an automated performance evaluation framework that enables an automated workflow for testing and performance evaluation of software libraries. Integrating this component into an ecosystem enables sustainable software development, as a community effort, via a web application for interactively evaluating the performance of individual software components. The performance evaluation tool is based exclusively on web technologies, which removes the burden of downloading performance data or installing additional software. We employ this framework for the GINKGO software ecosystem, but the framework can be used with essentially any software project, including the comparison between different software libraries. The Continuous Integration (CI) framework of GINKGO is also extended to automatically run a benchmark suite on predetermined HPC systems, store the state of the machine and the environment along with the compiled binaries, and collect results in a publicly accessible performance data repository based on Git. The GINKGO performance explorer (GPE) can be used to retrieve the performance data from the repository and visualize it in a web browser. GPE also implements an interface that allows users to write scripts, archived in a Git repository, to extract particular data, compute particular metrics, and visualize them in many different formats (as specified by the script). The combination of these approaches creates a workflow which enables performance reproducibility and software sustainability of scientific software. In this paper, we present example scripts that extract and visualize performance data for GINKGO's SpMV kernels, allowing users to identify the optimal kernel for specific problem characteristics.
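The kind of user script the abstract describes — reading benchmark results from a performance data repository and identifying the optimal SpMV kernel per matrix — can be sketched as follows. This is an illustrative sketch only: the JSON layout and the field names `matrix`, `kernel`, and `time_s` are assumptions for this example, not GINKGO's actual benchmark schema.

```python
import json

# Hypothetical benchmark records, mimicking JSON data one might fetch
# from a Git-based performance data repository (field names assumed).
records_json = """
[
  {"matrix": "HB/bcsstk01", "kernel": "csr", "time_s": 1.2e-5},
  {"matrix": "HB/bcsstk01", "kernel": "coo", "time_s": 1.8e-5},
  {"matrix": "HB/bcsstk01", "kernel": "ell", "time_s": 0.9e-5},
  {"matrix": "HB/bcsstk02", "kernel": "csr", "time_s": 2.1e-5},
  {"matrix": "HB/bcsstk02", "kernel": "coo", "time_s": 1.7e-5}
]
"""

def fastest_kernel_per_matrix(records):
    """Return {matrix: (kernel, time_s)} keeping the lowest runtime per matrix."""
    best = {}
    for r in records:
        key = r["matrix"]
        if key not in best or r["time_s"] < best[key][1]:
            best[key] = (r["kernel"], r["time_s"])
    return best

best = fastest_kernel_per_matrix(json.loads(records_json))
# For these sample records, ELL wins on bcsstk01 and COO on bcsstk02.
```

A real GPE script would additionally plot the per-kernel timings, but the selection logic — group records by matrix, minimize over kernels — is the core of identifying the optimal kernel for given problem characteristics.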
Pages: 11