Bringing order into the realm of Transformer-based language models for artificial intelligence and law

Cited: 9
Authors
Greco, Candida M. [1 ]
Tagarelli, Andrea [1 ]
Affiliations
[1] Univ Calabria, Dept Comp Engn Modeling Elect & Syst Engn DIMES, I-87036 Arcavacata Di Rende, CS, Italy
Keywords
Language models; BERT; GPT; Legal search; Legal document review; Legal outcome prediction; Retrieval; Entailment; Inference; Caselaw data; Statutory law data; Benchmarks; AI for law
DOI
10.1007/s10506-023-09374-7
Chinese Library Classification (CLC) number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Transformer-based language models (TLMs) are widely recognized as a cutting-edge technology for building successful deep-learning solutions to problems and applications that require natural language processing and understanding. As in other textual domains, TLMs have pushed the state of the art of AI approaches for many tasks of interest in the legal domain. Although the first Transformer model was proposed only about six years ago, the technology has progressed at an unprecedented rate, with BERT and related models serving as a major reference, in the legal domain as well. This article provides the first systematic overview of TLM-based methods for AI-driven problems and tasks in the legal sphere. A major goal is to highlight research advances in this field so as to understand, on the one hand, how Transformers have contributed to the success of AI in supporting legal processes and, on the other hand, what the current limitations and opportunities for further research development are.
Pages: 863-1010
Page count: 148
Related papers
50 items in total
  • [21] Tweets Topic Classification and Sentiment Analysis Based on Transformer-Based Language Models
    Mandal, Ranju
    Chen, Jinyan
    Becken, Susanne
    Stantic, Bela
    VIETNAM JOURNAL OF COMPUTER SCIENCE, 2023, 10 (02) : 117 - 134
  • [22] Transformer-based Language Models for Semantic Search and Mobile Applications Retrieval
    Coelho, Joao
    Neto, Antonio
    Tavares, Miguel
    Coutinho, Carlos
    Oliveira, Joao
    Ribeiro, Ricardo
    Batista, Fernando
    PROCEEDINGS OF THE 13TH INTERNATIONAL JOINT CONFERENCE ON KNOWLEDGE DISCOVERY, KNOWLEDGE ENGINEERING AND KNOWLEDGE MANAGEMENT (KDIR), VOL 1, 2021, : 225 - 232
  • [23] Dynamic Low-rank Estimation for Transformer-based Language Models
    Huai, Ting
    Lie, Xiao
    Gao, Shangqian
    Hsu, Yenchang
    Shen, Yilin
    Jin, Hongxia
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EMNLP 2023), 2023, : 9275 - 9287
  • [24] Roles and Utilization of Attention Heads in Transformer-based Neural Language Models
    Jo, Jae-young
    Myaeng, Sung-hyon
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 3404 - 3417
  • [25] Pre-training and Evaluating Transformer-based Language Models for Icelandic
    Daðason, Jón Friðrik
    Loftsson, Hrafn
    LREC 2022: THIRTEENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 7386 - 7391
  • [26] Shared functional specialization in transformer-based language models and the human brain
    Kumar, Sreejan
    Sumers, Theodore R.
    Yamakoshi, Takateru
    Goldstein, Ariel
    Hasson, Uri
    Norman, Kenneth A.
    Griffiths, Thomas L.
    Hawkins, Robert D.
    Nastase, Samuel A.
    NATURE COMMUNICATIONS, 2024, 15 (01)
  • [27] Localizing in-domain adaptation of transformer-based biomedical language models
    Buonocore, Tommaso Mario
    Crema, Claudio
    Redolfi, Alberto
    Bellazzi, Riccardo
    Parimbelli, Enea
    JOURNAL OF BIOMEDICAL INFORMATICS, 2023, 144
  • [28] Accelerating Training of Transformer-Based Language Models with Progressive Layer Dropping
    Zhang, Minjia
    He, Yuxiong
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS (NEURIPS 2020), 2020, 33
  • [29] Arlo: Serving Transformer-based Language Models with Dynamic Input Lengths
    Tan, Xin
    Li, Jiamin
    Yang, Yitao
    Li, Jingzong
    Xu, Hong
    53RD INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING, ICPP 2024, 2024, : 367 - 376
  • [30] Enhancing Address Data Integrity using Transformer-Based Language Models
    Kurklu, Omer Faruk
    Akagiunduz, Erdem
    32ND IEEE SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE, SIU 2024, 2024,