Neural Compatibility Modeling With Probabilistic Knowledge Distillation

Cited by: 32
Authors
Han, Xianjing [1 ]
Song, Xuemeng [1 ]
Yao, Yiyang [2 ]
Xu, Xin-Shun [3 ]
Nie, Liqiang [1 ]
Affiliations
[1] Shandong Univ, Sch Comp Sci & Technol, Qingdao 266237, Peoples R China
[2] State Grid Zhejiang Elect Power Co Ltd, Hangzhou 310007, Peoples R China
[3] Shandong Univ, Sch Software, Jinan 250101, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-modal; compatibility modeling; probabilistic knowledge distillation; NETWORKS;
DOI
10.1109/TIP.2019.2936742
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In modern society, clothing matching plays a pivotal role in people's daily lives, as a suitable outfit can directly enhance one's appearance. Nevertheless, assembling a suitable outfit is a daily headache for many people, especially those without a strong sense of aesthetics. In light of this, many research efforts have been dedicated to the task of complementary clothing matching and have achieved great success by relying on advanced data-driven neural networks. However, most existing methods overlook the rich and valuable knowledge that humans have accumulated in the fashion domain, especially rules regarding clothing matching, such as "coats go with dresses" and "silk tops cannot go with chiffon bottoms". Toward this end, in this work, we propose a knowledge-guided neural compatibility modeling scheme that incorporates rich fashion domain knowledge to enhance compatibility modeling in the context of clothing matching. To better integrate the vast and implicit fashion domain knowledge into data-driven neural networks, we present a probabilistic knowledge distillation (PKD) method, which encodes large sets of knowledge rules in a probabilistic manner. Extensive experiments on two real-world datasets verify the guidance provided by rules from different sources and demonstrate the effectiveness and portability of our model. As a byproduct, we have released the code and associated parameters to benefit the research community.
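The abstract describes PKD only at a high level, so the following is a minimal, hypothetical sketch (not the authors' implementation) of how matching rules might be distilled probabilistically: a rule-adjusted teacher distribution is derived from the student's own prediction, and the student is then regularized toward that teacher with a KL-divergence term, in the general spirit of teacher-student rule distillation. The function names and the `confidence` parameter are assumptions made for this illustration.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D array of logits.
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

def teacher_distribution(student_probs, rule_signs, confidence=1.0):
    # Project the student's compatibility distribution toward the rules:
    # each class logit is shifted by confidence * rule_sign, where rule_sign
    # is +1 if a rule supports the class (e.g. "coats go with dresses"),
    # -1 if a rule forbids it, and 0 if no rule applies.
    logits = np.log(student_probs) + confidence * np.asarray(rule_signs, dtype=float)
    return softmax(logits)

def kl_divergence(p, q):
    # KL(p || q) for discrete distributions with full support.
    return float(np.sum(p * np.log(p / q)))

# Student's prediction over {incompatible, compatible} for one outfit pair.
student = np.array([0.4, 0.6])
# A matching rule supports the "compatible" class for this pair.
teacher = teacher_distribution(student, [0.0, 1.0], confidence=1.0)
# Distillation term the student would minimize alongside its data loss.
loss = kl_divergence(teacher, student)
```

In this toy setup the teacher shifts probability mass toward the rule-supported class, so minimizing the KL term nudges the student's network to agree with the rule while its data-driven loss remains in play.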
Pages: 871-882
Page count: 12