Semi-Relaxation Supervised Hashing for Cross-Modal Retrieval

Cited by: 49
Authors
Zhang, Peng-Fei [1 ]
Li, Chuan-Xiang [1 ]
Liu, Meng-Yuan [1 ]
Nie, Liqiang [1 ]
Xu, Xin-Shun [1 ]
Affiliations
[1] Shandong Univ, Sch Comp Sci & Technol, Jinan, Shandong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Multimodal; Hashing; Cross-Modal Search; Approximate Nearest Neighbor Search
DOI
10.1145/3123266.3123320
Chinese Library Classification
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
Recently, a number of cross-modal hashing methods have been devised for the cross-modal search task. Essentially, given a similarity matrix, most of these methods tackle a discrete optimization problem by splitting it into two stages: first relaxing the binary constraints and solving the relaxed optimization problem, then quantizing the solution to obtain the binary codes. This scheme introduces large quantization errors. Some discrete optimization methods have been proposed to address this; however, in these methods the binary codes are generated independently of the features in the original space, which makes them less robust to noise. To address these problems, in this paper we propose a novel supervised cross-modal hashing method, Semi-Relaxation Supervised Hashing (SRSH). It learns the hash functions and the binary codes simultaneously. To make the optimization tractable, it relaxes only part of the binary constraints, rather than all of them, by introducing an intermediate representation variable. In this way, the quantization error is reduced and the optimization problem can be easily solved by the iterative algorithm proposed in this paper. Extensive experimental results on three benchmark datasets demonstrate that SRSH obtains competitive results and outperforms state-of-the-art unsupervised and supervised cross-modal hashing methods.
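To make the semi-relaxation idea in the abstract concrete, the following is a minimal NumPy sketch of how such an alternating scheme could look. It is not the authors' exact formulation: the simplified objective ||r*S - B V^T||_F^2 + alpha(||V - X1 W1||_F^2 + ||V - X2 W2||_F^2) + gamma(||W1||_F^2 + ||W2||_F^2), the hyperparameters alpha and gamma, and the function name srsh_sketch are all illustrative assumptions. The sketch only shows the key mechanism: the binary codes B stay discrete throughout, while the intermediate representation V is relaxed to real values and the modality-specific hash projections W1, W2 are learned jointly.

import numpy as np

def srsh_sketch(X1, X2, S, r=32, alpha=1.0, gamma=1e-2, n_iters=10, seed=0):
    """Alternating minimization for a simplified semi-relaxation objective (illustrative).

    X1, X2 : (n, d1), (n, d2) feature matrices of the two modalities.
    S      : (n, n) supervised similarity matrix with entries in {-1, +1}.
    Returns binary codes B in {-1, +1}^(n, r) and projections W1, W2.
    """
    n = X1.shape[0]
    rng = np.random.default_rng(seed)
    B = np.sign(rng.standard_normal((n, r)))      # binary codes: never relaxed
    B[B == 0] = 1
    V = B.astype(float).copy()                    # relaxed intermediate representation
    I1, I2, Ir = np.eye(X1.shape[1]), np.eye(X2.shape[1]), np.eye(r)

    for _ in range(n_iters):
        # W-step: ridge regression from each modality onto the relaxed variable V
        W1 = np.linalg.solve(X1.T @ X1 + gamma * I1, X1.T @ V)
        W2 = np.linalg.solve(X2.T @ X2 + gamma * I2, X2.T @ V)

        # V-step: closed-form least-squares update of the relaxed variable
        V = (r * S.T @ B + alpha * (X1 @ W1 + X2 @ W2)) @ np.linalg.inv(B.T @ B + 2 * alpha * Ir)

        # B-step: bit-wise discrete update (DCC-style); B is kept strictly binary
        Q = r * S @ V
        for k in range(r):
            rest = [j for j in range(r) if j != k]
            b_k = np.sign(Q[:, k] - B[:, rest] @ (V[:, rest].T @ V[:, k]))
            b_k[b_k == 0] = 1
            B[:, k] = b_k

    return B, W1, W2

Under this sketch, hash codes for a new sample x of modality m would be obtained as sign(x @ Wm); the per-bit B-step is a standard discrete cyclic coordinate descent update, used here only to keep B binary instead of relaxing it.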
Pages: 1762-1770
Number of pages: 9
Related Papers
50 records in total
  • [41] Online Adaptive Supervised Hashing for Large-Scale Cross-Modal Retrieval
    Su, Ruoqi; Wang, Di; Huang, Zhen; Liu, Yuan; An, Yaqiang
    IEEE ACCESS, 2020, 8: 206360-206370
  • [42] Autoencoder-based self-supervised hashing for cross-modal retrieval
    Li, Yifan; Wang, Xuan; Cui, Lei; Zhang, Jiajia; Huang, Chengkai; Luo, Xuan; Qi, Shuhan
    MULTIMEDIA TOOLS AND APPLICATIONS, 2021, 80 (11): 17257-17274
  • [44] Efficient discrete supervised hashing for large-scale cross-modal retrieval
    Yao, Tao; Han, Yaru; Wang, Ruxin; Kong, Xiangwei; Yan, Lianshan; Fu, Haiyan; Tian, Qi
    NEUROCOMPUTING, 2020, 385: 358-367
  • [45] Cross-modal retrieval via label category supervised matrix factorization hashing
    Xue, Feng; Wang, Wenbo; Zhou, Wenjie; Zeng, Tao; Yang, Tian
    PATTERN RECOGNITION LETTERS, 2020, 138: 469-475
  • [46] Multi-Kernel Supervised Hashing with Graph Regularization for Cross-Modal Retrieval
    Zhu, Ming; Miao, Huanghui; Tang, Jun
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018: 2717-2722
  • [47] Semi-supervised cross-modal hashing via modality-specific and cross-modal graph convolutional networks
    Wu, Fei; Li, Shuaishuai; Gao, Guangwei; Ji, Yimu; Jing, Xiao-Yuan; Wan, Zhiguo
    PATTERN RECOGNITION, 2023, 136
  • [48] Online weighted hashing for cross-modal retrieval
    Jiang, Zining; Weng, Zhenyu; Li, Runhao; Zhuang, Huiping; Lin, Zhiping
    PATTERN RECOGNITION, 2025, 161
  • [49] Hierarchical Consensus Hashing for Cross-Modal Retrieval
    Sun, Yuan; Ren, Zhenwen; Hu, Peng; Peng, Dezhong; Wang, Xu
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26: 824-836
  • [50] Random Online Hashing for Cross-Modal Retrieval
    Jiang, Kaihang; Wong, Wai Keung; Fang, Xiaozhao; Li, Jiaxing; Qin, Jianyang; Xie, Shengli
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (1): 677-691