Unsupervised hashing aims to learn compact binary hash codes that represent complex image content without label information. Existing deep unsupervised hashing methods typically first employ extracted image embeddings to construct a semantic similarity structure and then map the images into compact hash codes while preserving that structure. However, the limited representation power of embeddings in Euclidean space and the inadequate exploration of the similarity structure in current methods often result in poorly discriminative hash codes. In this paper, we propose a novel method called Hyperbolic Multi-Structure Hashing (HMSH) to address these issues. Specifically, to increase the representation power of the embeddings, we map them from Euclidean space to hyperbolic space and use the similarity structure constructed in hyperbolic space to guide hash learning. Meanwhile, to fully exploit structural information, we investigate four kinds of data structure: local neighborhood structure, global clustering structure, inter/intra-class variation, and variation under perturbation. These structures complement one another, which benefits hash learning. Extensive experimental results on three benchmark image datasets show that HMSH significantly outperforms state-of-the-art unsupervised hashing methods for image retrieval.
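The abstract does not spell out how embeddings are moved into hyperbolic space; a common choice (and only an assumption here, not necessarily the construction used in HMSH) is the Poincaré ball model, where a Euclidean embedding is projected via the exponential map at the origin and similarities are measured with the geodesic distance. A minimal NumPy sketch of that standard construction:

```python
import numpy as np

def exp_map_zero(v, c=1.0):
    """Exponential map at the origin of the Poincare ball (curvature -c).

    Projects a Euclidean embedding v into hyperbolic space; the result
    always lies strictly inside the unit ball (for c = 1).
    """
    norm = np.linalg.norm(v, axis=-1, keepdims=True).clip(min=1e-9)
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def poincare_dist(x, y, c=1.0):
    """Geodesic distance between two points on the Poincare ball."""
    sq = np.sum((x - y) ** 2, axis=-1)
    denom = (1.0 - c * np.sum(x ** 2, axis=-1)) * (1.0 - c * np.sum(y ** 2, axis=-1))
    return np.arccosh(1.0 + 2.0 * c * sq / denom) / np.sqrt(c)

# Illustrative usage with toy embeddings (not data from the paper):
e1 = np.array([0.5, 1.0])
e2 = np.array([-0.3, 0.2])
h1, h2 = exp_map_zero(e1), exp_map_zero(e2)
d = poincare_dist(h1, h2)
```

Pairwise hyperbolic distances like `d` could then define the similarity structure that supervises hash-code learning; the four structure types listed in the abstract (neighborhood, clustering, inter/intra-class, perturbation) would each be derived from such distances.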