Combining Variational Autoencoders and Transformer Language Models for Improved Password Generation

Cited by: 2
Authors
Biesner, David [1 ]
Cvejoski, Kostadin [1 ]
Sifa, Rafet [2 ]
Affiliations
[1] Univ Bonn, Fraunhofer IAIS, Bonn, Germany
[2] Fraunhofer IAIS, St Augustin, Germany
Source
PROCEEDINGS OF THE 17TH INTERNATIONAL CONFERENCE ON AVAILABILITY, RELIABILITY AND SECURITY, ARES 2022 | 2022
Keywords
passwords; neural networks; transformers; language models; latent variable models; text generation;
DOI
10.1145/3538969.3539000
Chinese Library Classification (CLC) number
TP [Automation Technology, Computer Technology]
Subject classification code
0812
Abstract
Password generation techniques have recently been explored by leveraging deep-learning natural language processing (NLP) algorithms. Previous work has significantly raised the state of the art for password-guessing algorithms by approaching the problem with either variational autoencoders built on CNN-based encoder and decoder architectures, or transformer-based architectures (namely GPT-2) for text generation. In this work we aim to combine both paradigms, introducing a novel architecture that couples the expressive power of transformers with the natural latent-space sampling of variational autoencoders. We show that our architecture achieves state-of-the-art password-matching performance across multiple benchmark datasets.
Pages: 6
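The architecture the abstract outlines, a transformer decoder conditioned on a latent code sampled the VAE way, can be illustrated with a minimal PyTorch sketch. Everything below (the TransformerVAE name, layer sizes, mean-pooling of encoder states, and conditioning the decoder through a single memory token) is an illustrative assumption, not the authors' published design:

import torch
import torch.nn as nn

class TransformerVAE(nn.Module):
    """Sketch: transformer encoder -> Gaussian latent -> transformer decoder."""

    def __init__(self, vocab_size, d_model=256, latent_dim=64, max_len=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))
        enc = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc, num_layers=2)
        self.to_mu = nn.Linear(d_model, latent_dim)      # posterior mean
        self.to_logvar = nn.Linear(d_model, latent_dim)  # posterior log-variance
        self.from_z = nn.Linear(latent_dim, d_model)     # latent -> decoder memory
        dec = nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec, num_layers=2)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer character ids of a password
        x = self.embed(tokens) + self.pos[:, : tokens.size(1)]
        h = self.encoder(x).mean(dim=1)                  # pool encoder states
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        memory = self.from_z(z).unsqueeze(1)             # one conditioning token
        n = tokens.size(1)
        causal = torch.triu(
            torch.full((n, n), float("-inf"), device=tokens.device), diagonal=1
        )
        dec_out = self.decoder(x, memory, tgt_mask=causal)
        # logits[:, i] predict tokens[:, i + 1] (next-character objective)
        return self.out(dec_out), mu, logvar

Training such a model would minimize character-level cross-entropy (with teacher forcing on shifted inputs) plus the KL divergence between the posterior N(mu, sigma^2) and the standard normal prior; generation then reduces to sampling z from the prior and decoding characters autoregressively, which is the natural sampling approach the abstract refers to.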