Candidate point selection using a self-attention mechanism for generating a smooth volatility surface under the SABR model

Cited by: 2
Authors
Kim, Hyeonuk [1 ]
Park, Kyunghyun [1 ]
Jeon, Junkee [2 ,3 ]
Song, Changhoon [1 ]
Bae, Jungwoo [1 ]
Kim, Yongsik [4 ]
Kang, Myungjoo [1 ]
Affiliations
[1] Seoul Natl Univ, Dept Math Sci, Seoul, South Korea
[2] Kyung Hee Univ, Dept Appl Math, Yongin, South Korea
[3] Kyung Hee Univ, Inst Nat Sci, Yongin, South Korea
[4] Korea Asset Pricing Corp, Financial Engn Ctr, Seoul, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Candidate point selection; Self-attention mechanism; Transformer network; SABR model; Smooth implied volatility surface; ALGORITHM;
DOI
10.1016/j.eswa.2021.114640
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In real markets, generating a smooth implied volatility surface requires interpolating the calibrated parameters with smooth parametric functions. For this interpolation, practitioners do not use all the discrete parameter points; instead, they manually select candidate parameter points through time-consuming adjustments (e.g., removing outliers, comparing with the previous day's surface, and considering daily market indexes) to obtain a smooth and robust surface. In this paper, we propose neural network models that assist practitioners in generating a smooth implied volatility surface under the SABR (Hagan et al., 2002) model. Using the self-attention mechanism of a transformer network (Vaswani et al., 2017) as a backbone, we design two models: one that ranks the parameter points by their likelihood of being selected as candidate points, and one that determines the candidate point set from among combinations of the high-priority points. Experimental results over a 3-year period of real-market S&P500 and KOSPI200 data show that the combination of the two models can assist practitioners in the point-selection task.
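For context on the surface the abstract refers to: under SABR, the implied volatility at each strike is typically computed with the closed-form lognormal approximation of Hagan et al. (2002). The sketch below is not the paper's code, only a minimal self-contained implementation of that standard approximation (function name and parameter names are our own); the smooth surface then comes from interpolating the calibrated (alpha, beta, rho, nu) points that the proposed models help select.

```python
# Minimal sketch (not the paper's code): the lognormal SABR implied-volatility
# approximation of Hagan et al. (2002), which underlies the surface discussed above.
import math

def sabr_implied_vol(F, K, T, alpha, beta, rho, nu):
    """Approximate Black implied volatility under the SABR model."""
    log_fk = math.log(F / K)
    fk_pow = (F * K) ** ((1.0 - beta) / 2.0)
    # Series expansion of the denominator in log(F/K)
    denom = fk_pow * (1.0
                      + (1.0 - beta) ** 2 / 24.0 * log_fk ** 2
                      + (1.0 - beta) ** 4 / 1920.0 * log_fk ** 4)
    z = (nu / alpha) * fk_pow * log_fk
    if abs(z) < 1e-12:
        zx = 1.0  # ATM / deterministic-vol limit of z / x(z)
    else:
        x = math.log((math.sqrt(1.0 - 2.0 * rho * z + z * z) + z - rho)
                     / (1.0 - rho))
        zx = z / x
    # First-order maturity correction
    corr = 1.0 + ((1.0 - beta) ** 2 / 24.0 * alpha ** 2 / (F * K) ** (1.0 - beta)
                  + rho * beta * nu * alpha / (4.0 * fk_pow)
                  + (2.0 - 3.0 * rho ** 2) / 24.0 * nu ** 2) * T
    return alpha / denom * zx * corr

# Sanity check: with beta = 1 and nu = 0, SABR reduces to the Black model,
# so the implied vol is flat at alpha for every strike.
print(sabr_implied_vol(100.0, 110.0, 1.0, 0.2, 1.0, 0.0, 0.0))  # 0.2
```

In practice, these parameters are calibrated per maturity, and the candidate-point selection the paper studies decides which calibrated points feed the parametric interpolation across maturities.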
Pages: 21
References
29 records in total
  • [1] Ackerer D., 2020, Proceedings of the 34th Conference on Neural Information Processing Systems, V33, P11552
  • [2] Audrino F., Colangelo D. Semi-parametric forecasts of the implied volatility surface using regression trees. Statistics and Computing, 2010, 20(4): 421-434
  • [3] Bloch D.A., 2019, NEURAL NETWORKS BASE
  • [4] Chang C.-C., 2011, ACM Transactions on Intelligent Systems and Technology, V2, P1, DOI 10.1145/1961189.1961199
  • [5] Chen P.-H., Fan R.-E., Lin C.-J. A study on SMO-type decomposition methods for support vector machines. IEEE Transactions on Neural Networks, 2006, 17(4): 893-908
  • [6] Derman E., 1996, Financial Analysts Journal, V52, P25
  • [7] Derman E., 1998, International Journal of Theoretical and Applied Finance, V1, P61, DOI 10.1142/S0219024998000059
  • [8] Devlin J., 2019, Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019), Vol. 1, P4171
  • [9] Dupire B., 1994, Risk, V7, P271
  • [10] Gatheral J., 2011, The Volatility Surface: A Practitioner's Guide