Manipulating and measuring variation in deep neural network (DNN) representations of objects

Cited: 0
Authors
Chow, Jason K. [1 ]
Palmeri, Thomas J. [1 ]
Affiliations
[1] Vanderbilt Univ, Dept Psychol, 111 21st Ave South, Nashville, TN 37240 USA
Keywords
Deep neural networks; Individual differences; Simulation; Visual perception; INDIVIDUAL-DIFFERENCES; ORGANIZATION; INFORMATION; MODEL;
DOI
10.1016/j.cognition.2024.105920
Chinese Library Classification (CLC)
B84 [Psychology]
Discipline Classification Code
04; 0402
Abstract
We explore how DNNs can be used to develop a computational understanding of individual differences in high-level visual cognition, given their ability to generate rich, meaningful object representations informed by their architecture, experience, and training protocols. As a first step to quantifying individual differences in DNN representations, we systematically explored the robustness of a variety of representational similarity measures: Representational Similarity Analysis (RSA), Centered Kernel Alignment (CKA), and Projection-Weighted Canonical Correlation Analysis (PWCCA), with an eye to how these measures are used in cognitive science, cognitive neuroscience, and vision science. To manipulate object representations, we next created a large set of models varying in random initial weights and random training-image order, training image frequencies, training category frequencies, and model size and architecture, and measured the representational variation caused by each manipulation. We examined both small (All-CNN-C) and commonly used large (VGG and ResNet) DNN architectures. To provide a comparison for the magnitude of representational differences, we established a baseline based on the representational variation caused by the image-augmentation techniques used to train those DNNs. We found that variation due to model randomization and model size never exceeded baseline. By contrast, differences in training image frequencies and training category frequencies caused representational variation that exceeded baseline, with the training category frequency manipulations exceeding baseline earlier in the networks. These findings provide insight into the magnitude of representational variation that can be expected under a range of manipulations and provide a springboard for further exploration of systematic model variations aimed at modeling individual differences in high-level visual cognition.
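To make the representational similarity measures named above concrete, the sketch below shows minimal implementations of two of them, linear CKA and RSA, applied to two models' activations on a common stimulus set. This is an illustrative reconstruction rather than the authors' code: the toy data, matrix shapes, and variable names are assumptions for illustration, and PWCCA is omitted for brevity.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr


def linear_cka(x, y):
    """Linear Centered Kernel Alignment between activation matrices of shape
    (n_stimuli, n_features); values near 1 indicate similar representational geometry."""
    x = x - x.mean(axis=0)  # center each feature column
    y = y - y.mean(axis=0)
    numerator = np.linalg.norm(y.T @ x, "fro") ** 2
    denominator = np.linalg.norm(x.T @ x, "fro") * np.linalg.norm(y.T @ y, "fro")
    return numerator / denominator


def rsa(x, y):
    """Representational Similarity Analysis: Spearman correlation between the two
    models' representational dissimilarity matrices (correlation distance over
    all stimulus pairs)."""
    rdm_x = pdist(x, metric="correlation")  # condensed upper-triangle RDM
    rdm_y = pdist(y, metric="correlation")
    return spearmanr(rdm_x, rdm_y).correlation


# Toy usage: layer activations of two hypothetical models on the same 100 stimuli
rng = np.random.default_rng(0)
acts_a = rng.normal(size=(100, 512))                               # "model A"
acts_b = acts_a @ rng.normal(size=(512, 256)) * 0.05 \
         + 0.1 * rng.normal(size=(100, 256))                       # noisy transform, "model B"
print(f"CKA = {linear_cka(acts_a, acts_b):.3f}, RSA = {rsa(acts_a, acts_b):.3f}")
```

In this kind of pipeline, each pairwise score is computed layer by layer across a set of trained models, which is what allows a manipulation's representational variation to be compared against an augmentation-based baseline at each depth.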
Pages: 19