Classification of multiple emotional states from facial expressions in head-fixed mice using a deep learning-based image analysis
Cited by: 0
Authors:
Tanaka, Yudai [1,2]
Nakata, Takuto [1,2]
Hibino, Hiroshi [3]
Nishiyama, Masaaki [1]
Ino, Daisuke [1,3]
Affiliations:
[1] Kanazawa Univ, Grad Sch Med Sci, Dept Histol & Cell Biol, Kanazawa, Japan
[2] Kanazawa Univ, Grad Sch Med Sci, Dept Mol & Cellular Pathol, Kanazawa, Japan
[3] Osaka Univ, Grad Sch Med, Dept Pharmacol, Suita, Japan
Source: PLOS ONE | 2023 / Vol. 18 / Issue 07
Keywords: PHENOTYPES; SCALE
DOI: 10.1371/journal.pone.0288930
Chinese Library Classification: O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline codes: 07; 0710; 09
Abstract:
Facial expressions are widely recognized as universal indicators of underlying internal states in most animal species, and thus serve as a non-invasive measure for assessing physical and mental conditions. Despite the advancement of artificial intelligence-assisted tools for the automated analysis of voluminous facial expression data in human subjects, corresponding tools for mice remain limited. Given that mice are the most prevalent model animals for studying human health and disease, a comprehensive characterization of emotion-dependent patterns of facial expressions in mice could extend our knowledge of the basis of emotions and related disorders. Here, we present a framework for developing a deep learning-powered tool for classifying facial expressions in head-fixed mice. We demonstrate that our machine-vision system accurately classified three different emotional states from lateral facial images of head-fixed mice. Moreover, we objectively determined how our classifier characterized the differences among the facial images using an interpretation technique called Gradient-weighted Class Activation Mapping (Grad-CAM). Importantly, our system presumably discerned the data by leveraging multiple facial features. Our approach is likely to facilitate the non-invasive decoding of a variety of emotions from facial images in head-fixed mice.
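The interpretation step named in the abstract, Grad-CAM, can be sketched independently of the paper's specific network. A minimal NumPy version, assuming the convolutional feature maps and the gradient of the target class score with respect to them have already been extracted (the array shapes below are illustrative, not taken from the paper): channel weights are the spatially averaged gradients, and the heatmap is the ReLU of the weighted sum of feature maps.

```python
import numpy as np

def grad_cam(feature_maps: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Compute a Grad-CAM heatmap.

    feature_maps: (K, H, W) activations of a convolutional layer.
    gradients:    (K, H, W) gradient of the class score w.r.t. those activations.
    Returns an (H, W) heatmap scaled to [0, 1].
    """
    # Channel weights: global-average-pool the gradients over the spatial axes.
    weights = gradients.mean(axis=(1, 2))                              # shape (K,)
    # Weighted sum of feature maps; ReLU keeps only positive class evidence.
    cam = np.maximum((weights[:, None, None] * feature_maps).sum(axis=0), 0.0)
    # Normalize for visualization (guard against an all-zero map).
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Toy example with random activations/gradients (hypothetical shapes).
rng = np.random.default_rng(0)
acts = rng.standard_normal((8, 7, 7))    # 8 feature maps of a 7x7 layer
grads = rng.standard_normal((8, 7, 7))   # gradients of one class score
heatmap = grad_cam(acts, grads)
print(heatmap.shape)
```

In practice the activations and gradients would come from hooks on the trained classifier, and the 7x7 heatmap would be upsampled onto the facial image to show which regions drove the prediction.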