Model Parameter Re-optimization for Linear Hashing Based on Similarity Drive

Cited by: 0
Authors
Nie X.-S. [1 ]
Liu X.-B. [2 ,3 ]
Xi X.-M. [1 ]
Yin Y.-L. [3 ]
Affiliations
[1] School of Computer Science and Technology, Shandong Jianzhu University, Ji'nan
[2] School of Computer Science and Technology, Shandong University, Qingdao
[3] School of Software, Shandong University, Ji'nan
Source
Ruan Jian Xue Bao/Journal of Software | 2020, Vol. 31, No. 4
Funding
National Natural Science Foundation of China
Keywords
Content retrieval; Hash learning; Linear model; Parameter optimization; Similarity drive;
DOI
10.13328/j.cnki.jos.005918
CLC Number
Subject Classification Code
Abstract
Hash learning learns the hash codes of samples by designing and optimizing an objective function that incorporates the distribution of the samples. Among existing hashing models, the linear model is widely used for its simplicity and efficiency. For the parameter optimization of linear hashing models, a model parameter re-optimization method based on similarity drive is proposed, which can improve the precision of existing linear-model-based hashing algorithms. Given a hashing method, that method is first run several times to obtain several hash matrices. Then, bits are selected from these hash matrices according to their similarity-preserving degree and merged by a fusion strategy into a new final hash matrix. Finally, this new hash matrix is used to re-optimize the model parameters, yielding a better hash model for out-of-sample extension. Extensive experiments on three benchmark datasets demonstrate the superior performance of the proposed framework. © Copyright 2020, Institute of Software, the Chinese Academy of Sciences. All rights reserved.
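
The abstract describes the pipeline only at a high level. The Python sketch below illustrates one possible reading of it; the base hashing method (a random sign projection), the similarity-preserving score (pairwise agreement with a label-based similarity matrix), the fusion strategy (keeping the top-scoring bits), and the least-squares re-optimization are all illustrative assumptions rather than the authors' exact formulation.

import numpy as np


def base_linear_hash(X, n_bits, rng):
    """Toy stand-in for any linear hashing method: random projection + sign."""
    W = rng.standard_normal((X.shape[1], n_bits))
    return np.sign(X @ W)


def bit_similarity_score(bit, S):
    """Assumed similarity-preserving degree of one +/-1 bit: fraction of sample
    pairs whose bit agreement matches the similarity matrix S (+1 similar, -1 not)."""
    return np.mean((np.outer(bit, bit) * S) > 0)


def fuse_hash_matrices(hash_mats, S, n_bits):
    """Assumed fusion strategy: pool the bits from all runs, rank them by
    similarity-preserving degree, and keep the n_bits best ones."""
    all_bits = np.hstack(hash_mats)
    scores = [bit_similarity_score(all_bits[:, k], S) for k in range(all_bits.shape[1])]
    top = np.argsort(scores)[::-1][:n_bits]
    return all_bits[:, top]


def reoptimize_parameters(X, B):
    """Re-fit the linear model to the fused codes B by least squares, giving a
    projection that can hash unseen (out-of-sample) queries."""
    W, *_ = np.linalg.lstsq(X, B, rcond=None)
    return W


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two Gaussian clusters as toy data; same-cluster pairs count as similar.
    X = np.vstack([rng.normal(0.0, 1.0, (50, 16)), rng.normal(3.0, 1.0, (50, 16))])
    labels = np.repeat([0, 1], 50)
    S = np.where(labels[:, None] == labels[None, :], 1, -1)

    n_bits, n_runs = 8, 5
    hash_mats = [base_linear_hash(X, n_bits, rng) for _ in range(n_runs)]  # several runs
    B = fuse_hash_matrices(hash_mats, S, n_bits)      # new final hash matrix
    W = reoptimize_parameters(X, B)                   # re-optimized model parameters
    query_code = np.sign(rng.normal(3.0, 1.0, (1, 16)) @ W)  # out-of-sample extension
    print(query_code)

Under these assumptions, the re-optimized projection W is all that is needed at query time, which is what the abstract refers to as out-of-sample extension.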
Pages: 1039-1050
Number of pages: 11