Augmenting human innovation teams with artificial intelligence: Exploring transformer-based language models

Cited by: 115
Authors
Bouschery, Sebastian G. [1 ,3 ]
Blazevic, Vera [1 ,2 ]
Piller, Frank T. [1 ,3 ]
Affiliations
[1] Rhein Westfal TH Aachen, Sch Business & Econ, Aachen, Germany
[2] Radboud Univ Nijmegen, Dept Mkt, Nijmegen, Netherlands
[3] Rhein Westfal TH Aachen, Sch Business & Econ, Templergraben 55, D-52056 Aachen, Germany
Keywords
artificial intelligence; GPT-3; hybrid intelligence; innovation teams; prompt engineering; transformer-based language models; PERFORMANCE; CREATIVITY; KNOWLEDGE; SEARCH; IDEA
DOI
10.1111/jpim.12656
Chinese Library Classification (CLC) code
F [Economics]
Subject classification code
02
Abstract
The use of transformer-based language models in artificial intelligence (AI) has increased adoption in various industries and led to significant productivity advancements in business operations. This article explores how these models can be used to augment human innovation teams in the new product development process, allowing for larger problem and solution spaces to be explored and ultimately leading to higher innovation performance. The article proposes the use of the AI-augmented double diamond framework to structure the exploration of how these models can assist in new product development (NPD) tasks, such as text summarization, sentiment analysis, and idea generation. It also discusses the limitations of the technology and the potential impact of AI on established practices in NPD. The article establishes a research agenda for exploring the use of language models in this area and the role of humans in hybrid innovation teams. (Note: Following the idea of this article, GPT-3 alone generated this abstract. Only minor formatting edits were performed by humans.)
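As an illustrative aside on the prompt-engineering theme of the article, the snippet below sketches how an innovation team might query a GPT-3-family model for idea generation. It is a minimal sketch, assuming the legacy openai Python package (pre-1.0) and a completion-style model; the model name, prompt wording, and sampling parameters are illustrative assumptions rather than the authors' protocol.

import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; the reader supplies their own key

# Hypothetical NPD prompt: the task framing is an assumption for illustration.
prompt = (
    "You support a new product development (NPD) team working on reusable packaging.\n"
    "List five distinct product ideas that reduce single-use plastic in grocery delivery.\n"
    "For each idea, name the target customer and the key assumption to test first."
)

response = openai.Completion.create(
    model="text-davinci-003",  # GPT-3-family completion model (illustrative choice)
    prompt=prompt,
    max_tokens=400,            # room for five short, structured ideas
    temperature=0.9,           # higher temperature widens the explored solution space
    n=3,                       # several samples give the team alternative idea sets to compare
)

for i, choice in enumerate(response["choices"], start=1):
    print(f"--- Candidate idea set {i} ---")
    print(choice["text"].strip())

Sampling several completions at a relatively high temperature reflects the article's argument that language models let teams explore a broader problem and solution space before converging on candidate ideas.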
Pages: 139-153
Number of pages: 15
Related papers
50 records in total
  • [41] Kabir, Anowarul; Moldwin, Asher; Shehu, Amarda. A Comparative Analysis of Transformer-based Protein Language Models for Remote Homology Prediction. 14th ACM Conference on Bioinformatics, Computational Biology, and Health Informatics (BCB 2023), 2023.
  • [42] Rovida, Lorenzo; Leporati, Alberto. Transformer-based Language Models and Homomorphic Encryption: An Intersection with BERT-tiny. Proceedings of the 10th ACM International Workshop on Security and Privacy Analytics (IWSPA 2024), 2024: 3-13.
  • [43] Yu, Chong; Chen, Tao; Gan, Zhongxue. Boost Transformer-based Language Models with GPU-Friendly Sparsity and Quantization. Findings of the Association for Computational Linguistics (ACL 2023), 2023: 218-235.
  • [44] Mandal, Ranju; Chen, Jinyan; Becken, Susanne; Stantic, Bela. Empirical Study of Tweets Topic Classification Using Transformer-Based Language Models. Intelligent Information and Database Systems (ACIIDS 2021), 2021, 12672: 340-350.
  • [45] Ganiev, Amir; Chapin, Colt; de Andrade, Anderson; Liu, Chen. An Architecture for Accelerated Large-Scale Inference of Transformer-Based Language Models. 2021 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL-HLT 2021), 2021: 163-169.
  • [46] Sazzed, Salim. Influence of Language Proficiency on the Readability of Review Text and Transformer-based Models for Determining Language Proficiency. Companion Proceedings of the Web Conference 2022 (WWW 2022 Companion), 2022: 881-886.
  • [47] Aspillaga, Carlos; Carvallo, Andres; Araujo, Vladimir. Stress Test Evaluation of Transformer-based Models in Natural Language Understanding Tasks. Proceedings of the 12th International Conference on Language Resources and Evaluation (LREC 2020), 2020: 1882-1894.
  • [48] Shiju, Akhil; He, Zhe. Classifying Drug Ratings Using User Reviews with Transformer-Based Language Models. 2022 IEEE 10th International Conference on Healthcare Informatics (ICHI 2022), 2022: 163-169.
  • [49] Yang, Feihong; Wang, Xuwen; Ma, Hetong; Li, Jiao. Transformers-sklearn: a toolkit for medical language understanding with transformer-based models. BMC Medical Informatics and Decision Making, 21.
  • [50] Just, Julian; Hutter, Katja; Fueller, Johann. Catching but a glimpse? Navigating crowdsourced solution spaces with transformer-based language models. Creativity and Innovation Management, 2024, 33(04): 718-741.