Using deep learning to study emotional behavior in rodent models

Cited by: 8
Authors
Kuo, Jessica Y. [1 ]
Denman, Alexander J. [1 ]
Beacher, Nicholas J. [1 ]
Glanzberg, Joseph T. [1 ]
Zhang, Yan [1 ]
Li, Yun [2 ]
Lin, Da-Ting [1 ]
Affiliations
[1] Natl Inst Drug Abuse, Intramural Res Program, NIH, Baltimore, MD 21224 USA
[2] Univ Wyoming, Dept Zool & Physiol, Laramie, WY USA
Source
FRONTIERS IN BEHAVIORAL NEUROSCIENCE | 2022, Vol. 16
Funding
National Institutes of Health (NIH);
Keywords
deep learning; emotion; supervised learning; unsupervised learning; self-supervised learning; neural recording; pose estimation; COCAINE; ANXIETY; STRESS; REWARD; MOTOR; EXPOSURE; NEURONS; SYSTEM;
DOI
10.3389/fnbeh.2022.1044492
Chinese Library Classification (CLC)
B84 [Psychology]; C [Social Sciences, General]; Q98 [Anthropology];
Subject classification codes
03 ; 0303 ; 030303 ; 04 ; 0402 ;
Abstract
Quantifying emotional aspects of animal behavior (e.g., anxiety, social interactions, reward, and stress responses) is a major focus of neuroscience research. Because manual scoring of emotion-related behaviors is time-consuming and subjective, classical methods rely on easily quantified measures such as lever pressing or time spent in different zones of an apparatus (e.g., open vs. closed arms of an elevated plus maze). Recent advancements have made it easier to extract pose information from videos, and multiple approaches for extracting nuanced information about behavioral states from pose estimation data have been proposed. These include supervised, unsupervised, and self-supervised approaches, employing a variety of different model types. Representations of behavioral states derived from these methods can be correlated with recordings of neural activity to increase the scope of connections that can be drawn between the brain and behavior. In this mini review, we will discuss how deep learning techniques can be used in behavioral experiments and how different model architectures and training paradigms influence the type of representation that can be obtained.
Pages: 9
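As a rough illustration of the supervised branch of the workflow described in the abstract (pose estimation, then behavioral-state classification, then correlation with neural recordings), the sketch below uses synthetic data. The array shapes, state labels, choice of RandomForestClassifier, and all variable names are illustrative assumptions for exposition, not the authors' pipeline or any specific pose-estimation library's output format.

```python
# Hypothetical sketch: classify behavioral states from pose-estimation keypoints
# and relate the resulting state occupancy to simultaneously recorded neural
# activity. Synthetic data stands in for real recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_frames, n_keypoints = 5000, 8                       # e.g., nose, ears, paws, tail base
pose = rng.normal(size=(n_frames, n_keypoints, 2))    # (x, y) per keypoint per video frame
labels = rng.integers(0, 3, size=n_frames)            # toy labels: 0=rest, 1=locomotion, 2=rearing
neural = rng.normal(size=(n_frames, 50))              # e.g., deltaF/F traces for 50 neurons

# Flatten keypoints into per-frame feature vectors; real pipelines often add
# velocities, inter-keypoint distances, or short temporal windows.
features = pose.reshape(n_frames, -1)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# Correlate each neuron's activity with occupancy of one predicted behavioral
# state (here, locomotion) to screen for behavior-modulated neurons.
state_occupancy = (clf.predict(features) == 1).astype(float)
corrs = np.array([np.corrcoef(neural[:, i], state_occupancy)[0, 1]
                  for i in range(neural.shape[1])])
print("most locomotion-correlated neuron:", int(np.argmax(np.abs(corrs))))
```

An unsupervised variant of the same idea would replace the classifier with clustering of the pose-derived features, so that candidate behavioral states emerge from the data rather than from manual labels, as the abstract notes.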