Hashing methods can overcome the problems of low retrieval efficiency and high storage cost. Existing hashing methods use either a single feature or multiple features fed into a single kernel function, ignoring the fact that different kernel functions play different roles, have different characteristics, and capture different information. In this paper, an adaptive multi-feature and multi-kernel hashing learning (MFMKH) algorithm is proposed, which adaptively learns both the feature weight coefficients and the kernel weight coefficients, thereby combining the advantages of multi-feature fusion and multi-kernel learning. Fusing multiple features compensates for the insufficient information contained in any single feature, while employing several different kernel functions compensates for the limited learning ability of a single kernel. Experiments on the standard datasets IRMA, Ultrasound, and Cifar10 show that the retrieval performance of the proposed method clearly outperforms that of other kernel-based hashing learning methods. Moreover, compared with supervised deep hashing, the proposed method achieves competitive retrieval performance on the Cifar10 dataset while requiring less training time.
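The core idea, a hash function built on a doubly weighted combination of feature-specific kernels, can be sketched as follows. This is a minimal illustration only: the Gaussian kernel choice, the anchor-point construction, and all variable names are assumptions, not the paper's exact formulation.

```python
import numpy as np

def gaussian_kernel(X, anchors, gamma):
    """RBF kernel values between samples X (n, d) and anchor points (m, d)."""
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def combined_kernel(features, anchors, gammas, mu, alpha):
    """Doubly weighted multi-feature, multi-kernel combination.

    features: list of (n, d_j) arrays, one per feature type
    anchors:  list of (m, d_j) anchor-point arrays, one per feature type
    gammas:   list of bandwidths, one per kernel function
    mu:       feature weight coefficients (sum to 1)
    alpha:    kernel weight coefficients (sum to 1)
    Returns the (n, m) combined kernel matrix.
    """
    n, m = features[0].shape[0], anchors[0].shape[0]
    K = np.zeros((n, m))
    for j, (Xj, Aj) in enumerate(zip(features, anchors)):
        for i, g in enumerate(gammas):
            # weight each feature/kernel pairing by mu_j * alpha_i
            K += mu[j] * alpha[i] * gaussian_kernel(Xj, Aj, g)
    return K
```

Binary codes would then be obtained by projecting and thresholding this combined kernel matrix, with the weights $\mu$ and $\alpha$ learned adaptively rather than fixed in advance.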
National Key Research and Development Program of China, Key Project on Cloud Computing and Big Data (2016YFB1000905)
National Natural Science Foundation of China (61672120)
Chongqing Natural Science Foundation (cstc2015jcyjA40036)
[1] Naimi A I, Westreich D J. Big data: a revolution that will transform how we live, work, and think. Am J Epidemiol, 2014, 17: 181--183
[2] Tu Z P. The Big Data Revolution. Guilin: Guangxi Normal University Press, 2013. 312--315
[3] Zhou Z H. Machine learning and data mining. Commun China Comput Fed, 2007, 3: 35--44
[4] Shao H, Cui W, Zhao H. Medical image retrieval based on visual contents and text information. In: Proceedings of International Conference on Systems, Man and Cybernetics, Hague, 2004. 1098--1103
[5] Wong P W, Tretter D, Kite T. A web-based secure system for the distributed printing of documents and images. J Visual Communication Image Representation, 1999, 10: 1--11
[6] He J, Li M, Zhang H J. Generalized manifold-ranking-based image retrieval. IEEE Trans Image Process, 2006, 15: 3170--3177
[7] Wang F, Er G, Dai Q. Inequivalent manifold ranking for content-based image retrieval. In: Proceedings of International Conference on Image Processing, California, 2008. 173--176
[8] Har-Peled S, Indyk P, Motwani R. Approximate nearest neighbor: towards removing the curse of dimensionality. Symp Theory Comput, 2012, 52: 604--613
[9] Kulis B, Grauman K. Kernelized locality-sensitive hashing for scalable image search. In: Proceedings of International Conference on Computer Vision, Kyoto, 2009. 2130--2137
[10] Raginsky M, Lazebnik S. Locality-sensitive binary codes from shift-invariant kernels. In: Proceedings of Advances in Neural Information Processing Systems, Vancouver, 2009. 1509--1517
[11] Liu W, Wang J, Ji R, et al. Supervised hashing with kernels. In: Proceedings of Conference on Computer Vision and Pattern Recognition, Providence, 2012. 2074--2081
[12] Liu X, He J, Lang B. Multiple feature kernel hashing for large-scale visual search. Pattern Recognition, 2014, 47: 748--757
[13] Li W J, Wang S, Kang W C. Feature learning based deep supervised hashing with pairwise labels. Comput Sci, 2015, 2: 1711--1717
[14] Liu H, Wang R, Shan S, et al. Deep supervised hashing for fast image retrieval. In: Proceedings of Conference on Computer Vision and Pattern Recognition, Las Vegas, 2016. 2064--2072
[15] Shi X S, Xing F Y, Cai J Z, et al. Kernel-based supervised discrete hashing for image retrieval. In: Proceedings of European Conference on Computer Vision, Amsterdam, 2016. 419--433
[16] Xu Y, Shen F, Xu X. Large-scale image retrieval with supervised sparse hashing. Neurocomputing, 2017, 229: 45--53
[17] Gui J, Liu T, Sun Z. Fast supervised discrete hashing. IEEE Trans Pattern Anal Mach Intell, 2017
[18] Li W J, Zhou Z H. Learning to hash for big data: current status and future trends. Sci Bull, 2015, 60: 485--490
[19] Liu X L, Huang L, Deng C, et al. Multi-view complementary hash tables for nearest neighbor search. In: Proceedings of International Conference on Computer Vision, Santiago, 2015. 1107--1115
[20] Xu H, Wang J, Li Z, et al. Complementary hashing for approximate nearest neighbor search. In: Proceedings of International Conference on Computer Vision, Barcelona, 2011. 1631--1638
[21] Kan M, Xu D, Shan S. Semisupervised hashing via kernel hyperplane learning for scalable image search. IEEE Trans Circuits Syst Video Technol, 2014, 24: 704--713
[22] Zhang D, Wang F, Si L. Composite hashing with multiple information sources. In: Proceedings of International Conference on Research and Development in Information Retrieval, Beijing, 2011. 225--234
[23] Song J, Yang Y, Huang Z, et al. Multiple feature hashing for real-time large scale near-duplicate video retrieval. In: Proceedings of International Conference on Multimedia, Scottsdale, 2011. 423--432
[24] Multiview locally linear embedding for effective medical image retrieval. PLoS ONE, 2013, 8: e82409
[25] Gonen M, Alpaydin E. Multiple kernel learning algorithms. J Mach Learn Res, 2011, 12: 2211--2268
[26] Tzortzis G, Likas A. Greedy unsupervised multiple kernel learning. In: Proceedings of Hellenic Conference on Artificial Intelligence, Berlin, 2012. 73--80
[27] Akgül C B, Rubin D L, Napel S. Content-based image retrieval in radiology: current status and future directions. J Digit Imaging, 2011, 24: 208--222
[28] Weiss Y, Torralba A, Fergus R. Spectral hashing. In: Proceedings of Conference on Neural Information Processing Systems, Auckland, 2008. 1753--1760
[29] Zelnik-Manor L, Perona P. Self-tuning spectral clustering. In: Proceedings of Advances in Neural Information Processing Systems, Vancouver, 2004. 17: 1601--1608
[30] He J, Liu W, Chang S F. Scalable similarity search with optimized kernel hashing. In: Proceedings of International Conference on Knowledge Discovery and Data Mining, Washington, 2010. 1129--1138
[31] Liu W, Wang J, Kumar S, et al. Hashing with graphs. In: Proceedings of International Conference on Machine Learning, Bellevue, 2011. 1--8
[32] Oliva A, Torralba A. Modeling the shape of the scene: a holistic representation of the spatial envelope. Int J Comput Vision, 2001, 42: 145--175
[33] Dalal N, Triggs B. Histograms of oriented gradients for human detection. In: Proceedings of Conference on Computer Vision and Pattern Recognition, San Diego, 2005. 886--893
Figure 1
(Color online) The diagram of multi-feature and multi-kernel hashing learning
Figure 2
The structure of multi-feature and multi-kernel hashing learning
Figure 3
(Color online) Error vs. iterations on (a) IRMA dataset, (b) Ultrasound dataset, and (c) Cifar10 dataset
Figure 4
(Color online) The relationship between the length of the hash code and the accuracy rate for the kernel-based hashing methods when the number of retrieved samples is 50
Figure 5
(Color online) The relationship between the length of the hash code and the recall rate for the kernel-based hashing methods when the number of retrieved samples is 50
Figure 6
(Color online) The relationship between the length of the hash code and the accuracy rate for the single-feature and single-kernel variants of M2FM3KH when the number of retrieved samples is 50
Figure 7
(Color online) The relationship between the length of the hash code and the recall rate for the single-feature and single-kernel variants of M2FM3KH when the number of retrieved samples is 50
1. Initialize ${\mu _j} = \frac{1}{{{M_1}}}, j = 1,\ldots ,{M_1}$ and ${\alpha _i} = \frac{1}{{{M_2}}}, i = 1,\ldots ,{M_2}$;
2. Fix $\alpha$ and $\mu$, and solve for $W$ and $b$ via Eq. (
3. Fix $W$, $b$, and $\alpha$, and solve for $\mu$ via Eq. (
4. Fix $W$, $b$, and $\mu$, and solve for $\alpha$ via Eq. (
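The alternating scheme above can be illustrated on a toy surrogate objective. Because the referenced update equations are not reproduced here, this sketch minimizes a simple regularized least-squares fit $\|\sum_i \alpha_i K_i W - Y\|^2$ and uses a heuristic residual-based weight update; the function name and the weight-update rule are illustrative assumptions, not the paper's actual derivation.

```python
import numpy as np

def train_alternating(K_list, Y, n_iter=10):
    """Toy alternating optimization over projection W and kernel weights alpha.

    K_list: list of (n, m) kernel matrices (one per feature/kernel pairing)
    Y:      (n, r) target codes, relaxed to real values
    Alternates: fix alpha -> solve W in closed form; fix W -> reweight
    alpha inversely to each kernel's residual, then renormalize.
    """
    M = len(K_list)
    alpha = np.full(M, 1.0 / M)  # step 1: uniform initialization
    W = None
    for _ in range(n_iter):
        K = sum(a * Ki for a, Ki in zip(alpha, K_list))
        # step 2: fix alpha, solve W by regularized least squares
        W = np.linalg.solve(K.T @ K + 1e-6 * np.eye(K.shape[1]), K.T @ Y)
        # steps 3-4: fix W, update weights by each kernel's individual fit
        res = np.array([np.linalg.norm(Ki @ W - Y) for Ki in K_list])
        alpha = 1.0 / (res + 1e-12)
        alpha /= alpha.sum()  # keep weights on the simplex
    return W, alpha
```

Each sub-problem is solved with the other variables held fixed, so the surrogate objective is non-increasing in the $W$-step, which mirrors the convergence behavior shown in Figure 3.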
1. Compute the hash codes of the test set according to Eq. (
2. Compute the Hamming distance vector $P$ between the test-set hash codes ${Y_{\rm test}}$ and the hash codes of all images in the database;
3. Sort the distances in $P$ in ascending order and return the top $n$ images.
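Steps 2 and 3 of the retrieval procedure can be sketched as follows (a minimal illustration with assumed array shapes; binary codes are represented as 0/1 vectors):

```python
import numpy as np

def hamming_retrieve(query_code, db_codes, n=10):
    """Rank database items by Hamming distance to a query hash code.

    query_code: (r,) array of bits in {0, 1}
    db_codes:   (N, r) array of bits for all database images
    Returns the indices of the n nearest database images.
    """
    # step 2: Hamming distance = number of differing bit positions
    dists = np.count_nonzero(db_codes != query_code, axis=1)
    # step 3: ascending sort, return the top n
    return np.argsort(dists, kind="stable")[:n]
```

Because the codes are short binary strings, this comparison reduces to bit operations and a linear scan, which is what gives hashing-based retrieval its speed advantage over distance computation in the original feature space.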
IRMA | | | | Ultrasound | | |
Class | Sum | Training | Testing | Class | Sum | Training | Testing |
0 | 336 | 286 | 50 | 0 (Gallstone) | 521 | 371 | 150 |
1 | 215 | 165 | 50 | 1 (Gallbladder polyps) | 238 | 178 | 60 |
2 | 225 | 175 | 50 | 2 (Normal gallbladder) | 154 | 104 | 50
3 | 576 | 476 | 100 | 3 (Hydronephrosis) | 386 | 286 | 100 |
4 | 217 | 167 | 50 | 4 (Kidney stones) | 219 | 169 | 50 |
5 | 205 | 155 | 50 | 5 (Renal cyst) | 197 | 147 | 50 |
6 | 194 | 144 | 50 | 6 (Normal kidneys) | 303 | 233 | 70 |
7 | 284 | 234 | 50 | 7 (Liver cysts) | 113 | 83 | 30 |
8 | 228 | 178 | 50 | 8 (Fatty liver) | 360 | 280 | 80 |
9 | 193 | 143 | 50 | 9 (Normal liver) | 191 | 131 | 60 |
Method | IRMA (MAP) | | | | | Ultrasound (MAP) | | | |
 | 8 bits | 12 bits | 24 bits | 32 bits | 48 bits | 8 bits | 12 bits | 24 bits | 32 bits | 48 bits
MFKH | 0.620 | 0.680 | 0.640 | 0.670 | 0.582 | 0.109 | 0.121 | 0.137 | 0.163 | 0.155 |
SH | 0.214 | 0.315 | 0.323 | 0.416 | 0.409 | 0.063 | 0.061 | 0.054 | 0.052 | 0.056 |
SKLSH | 0.128 | 0.126 | 0.121 | 0.122 | 0.147 | 0.174 | 0.203 | 0.231 | 0.283 | 0.206 |
KLSH | 0.143 | 0.154 | 0.156 | 0.151 | 0.147 | 0.152 | 0.147 | 0.142 | 0.137 | 0.149 |
M2FM3KH | 0.520 | 0.643 | 0.713 | 0.723 | 0.640 | 0.177 | 0.152 | 0.163 | 0.194 | 0.287 |
DPSH | 0.896 | 0.969 | 0.971 | 0.968 | 0.971 | 0.524 | 0.527 | 0.582 | 0.590 | 0.601 |
Method | Cifar10 (MAP) | | | |
 | 8 bits | 12 bits | 24 bits | 32 bits | 48 bits
MFKH | 0.235 | 0.261 | 0.252 | 0.241 | 0.247 |
SH | 0.104 | 0.116 | 0.145 | 0.142 | 0.139 |
SKLSH | 0.116 | 0.123 | 0.131 | 0.225 | 0.321 |
KLSH | 0.132 | 0.143 | 0.147 | 0.145 | 0.154 |
M2FM3KH | 0.231 | 0.264 | 0.367 | 0.423 | 0.243 |
DPSH | 0.681 | 0.713 | 0.727 | 0.744 | 0.757 |