Constrained Training of Recurrent Neural Networks for Automata Learning

Cited by: 1
Authors
Aichernig, Bernhard K. [1]
Koenig, Sandra [2]
Mateis, Cristinel [2]
Pferscher, Andrea [1]
Schmidt, Dominik [2]
Tappler, Martin [1,3]
Affiliations
[1] Graz Univ Technol, Inst Software Technol, Graz, Austria
[2] AIT Austrian Inst Technol, Vienna, Austria
[3] Graz Univ Technol, Silicon Austria Labs, SAL DES Lab, Graz, Austria
Source
SOFTWARE ENGINEERING AND FORMAL METHODS, SEFM 2022 | 2022, Vol. 13550
Keywords
Automata learning; Machine learning; Recurrent neural networks; Bluetooth Low Energy; Model inference
DOI
10.1007/978-3-031-17108-6_10
CLC Classification Number
TP31 [Computer Software]
Discipline Codes
081202; 0835
Abstract
In this paper, we present a novel approach to learning finite automata with the help of recurrent neural networks. Our goal is not only to train a neural network that predicts the observable behavior of an automaton but also to learn its structure, including the set of states and transitions. In contrast to previous work, we constrain the training with a specific regularization term. We evaluate our approach with standard examples from the automata learning literature, but also include a case study of learning the finite-state models of real Bluetooth Low Energy protocol implementations. The results show that we can find an appropriate architecture to learn the correct automata in all considered cases.
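The record above does not reproduce the paper's actual regularization term, so the following PyTorch sketch is only an illustration of the general idea; every name in it (StateRegRNN, NUM_STATES, the 0.1 penalty weight, the alphabet and hidden sizes) is hypothetical. It adds a clustering-style penalty that pulls each hidden state toward the nearest of a small set of learnable centroids, so that the trained hidden states collapse into a finite set of discrete values that can be read off as automaton states.

# Hypothetical sketch of constrained RNN training (PyTorch). The paper's
# exact regularization term is not shown in this record; here, a
# clustering-style penalty pulls each hidden state toward the nearest of
# NUM_STATES learnable centroids so that trained hidden states form
# discrete clusters usable as automaton states.
import torch
import torch.nn as nn

VOCAB_IN, VOCAB_OUT = 5, 3   # input/output alphabet sizes (assumed)
HIDDEN = 16                  # hidden dimension (assumed)
NUM_STATES = 8               # assumed upper bound on automaton states

class StateRegRNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_IN, HIDDEN)
        self.rnn = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, VOCAB_OUT)
        # learnable centroids acting as candidate automaton states
        self.centroids = nn.Parameter(torch.randn(NUM_STATES, HIDDEN))

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))            # (B, T, HIDDEN)
        logits = self.out(h)                      # (B, T, VOCAB_OUT)
        # squared distance of each hidden state to its nearest centroid
        d = torch.cdist(h.reshape(-1, HIDDEN), self.centroids)
        reg = d.min(dim=1).values.pow(2).mean()
        return logits, reg

model = StateRegRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randint(0, VOCAB_IN, (32, 10))          # dummy input sequences
y = torch.randint(0, VOCAB_OUT, (32, 10))         # dummy observed outputs
for _ in range(100):
    logits, reg = model(x)
    loss = nn.functional.cross_entropy(
        logits.reshape(-1, VOCAB_OUT), y.reshape(-1)) + 0.1 * reg
    opt.zero_grad()
    loss.backward()
    opt.step()

Under these assumptions, assigning every hidden state to its nearest centroid after training yields a candidate state set, and transitions between those states could then be extracted by replaying the training sequences through the network.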
Pages: 155-172
Page count: 18
Related Papers
50 records in total
  • [1] Learning minimal automata with recurrent neural networks
    Aichernig, Bernhard K.
    Koenig, Sandra
    Mateis, Cristinel
    Pferscher, Andrea
    Tappler, Martin
    SOFTWARE AND SYSTEMS MODELING, 2024, 23 (03) : 625 - 655
  • [2] Evaluating the Learning of Automata through the Use of Recurrent Neural Networks
    Lima, L.
    Sampaio, A.
    IEEE LATIN AMERICA TRANSACTIONS, 2018, 16 (10) : 2609 - 2616
  • [3] Training and extraction of fuzzy finite state automata in recurrent neural networks
    Chandra, Rohitash
    Omlin, Christian W.
    PROCEEDINGS OF THE SECOND IASTED INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE, 2006, : 271 - 275
  • [4] Training Deep Neural Networks with Constrained Learning Parameters
    Date, Prasanna
    Carothers, Christopher D.
    Mitchell, John E.
    Hendler, James A.
    Magdon-Ismail, Malik
    2020 INTERNATIONAL CONFERENCE ON REBOOTING COMPUTING (ICRC 2020), 2020, : 107 - 115
  • [5] Recurrent neural networks and finite automata
    Siegelmann, H. T.
    COMPUTATIONAL INTELLIGENCE, 1996, 12 (04) : 567 - 574
  • [6] Connecting Weighted Automata and Recurrent Neural Networks through Spectral Learning
    Rabusseau, Guillaume
    Li, Tianyu
    Precup, Doina
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019
  • [7] AdaAX: Explaining Recurrent Neural Networks by Learning Automata with Adaptive States
    Hong, Dat
    Segre, Alberto Maria
    Wang, Tong
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 574 - 584
  • [8] Local Structure Helps Learning Optimized Automata in Recurrent Neural Networks
    Binas, Jonathan
    Indiveri, Giacomo
    Pfeiffer, Michael
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015
  • [9] Toward Training Recurrent Neural Networks for Lifelong Learning
    Sodhani, Shagun
    Chandar, Sarath
    Bengio, Yoshua
    NEURAL COMPUTATION, 2020, 32 (01) : 1 - 35
  • [10] Connecting weighted automata, tensor networks and recurrent neural networks through spectral learning
    Li, Tianyu
    Precup, Doina
    Rabusseau, Guillaume
    MACHINE LEARNING, 2024, 113 : 2619 - 2653