Using High-Dimensional Indexes to Support Relevance Feedback Based Interactive Image Retrieval∗
Image retrieval has found an increasing number of applications. Due to the well-recognized semantic gap problem, the accuracy and recall of image similarity search are often still low. As an effective method to improve the quality of image retrieval, the relevance feedback approach actively uses users’ feedback to refine the search. Since searching a large image database is often costly, high-dimensional indexes can help improve efficiency. However, many existing database indexes do not adapt to the updates of the distance measure caused by users’ feedback. In this paper, we propose a demo that illustrates the relevance feedback based interactive image retrieval procedure and examines the effectiveness and efficiency of various indexes. In particular, the audience can interactively investigate the effect of updated distance measures on the data space in which the images are indexed, and on the distribution of similar images in the indexes. We also introduce our new B+-tree-like index method based on cluster splitting and iDistance.
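The proposed index itself is not detailed in this abstract; as a rough, non-authoritative sketch of the iDistance-style mapping that such a B+-tree-like method builds on, the Python fragment below maps each feature vector to a one-dimensional key via its nearest cluster reference point and answers range queries over a sorted key list standing in for the B+-tree. The function names (`build_idistance`, `range_query`), the random toy data, the reference points, and the separation constant `c` are illustrative assumptions, not the paper's implementation.

```python
import bisect
import numpy as np

def build_idistance(points, ref_points, c):
    """Assign each point to its nearest reference point O_i and map it to
    the one-dimensional key i * c + dist(p, O_i). A sorted key list stands
    in here for the B+-tree that would hold these keys on disk."""
    dists = np.linalg.norm(points[:, None, :] - ref_points[None, :, :], axis=2)
    part = dists.argmin(axis=1)                      # nearest reference point
    keys = part * c + dists[np.arange(len(points)), part]
    order = np.argsort(keys)                         # point ids in key order
    return keys[order].tolist(), order

def range_query(points, ref_points, c, sorted_keys, order, q, r):
    """Find all points within distance r of query q. For each partition i,
    the triangle inequality confines qualifying keys to the interval
    [i*c + d(q,O_i) - r, i*c + d(q,O_i) + r]; candidates from the key scan
    are refined by their exact distance to q."""
    hits = set()
    q_dists = np.linalg.norm(ref_points - q, axis=1)
    for i, d in enumerate(q_dists):
        lo, hi = i * c + max(0.0, d - r), i * c + d + r
        for pos in range(bisect.bisect_left(sorted_keys, lo),
                         bisect.bisect_right(sorted_keys, hi)):
            pid = int(order[pos])
            if np.linalg.norm(points[pid] - q) <= r: # refinement step
                hits.add(pid)
    return sorted(hits)

# Toy usage with random "feature vectors" and reference points.
rng = np.random.default_rng(0)
points = rng.random((2000, 16))      # 16-dimensional image features
ref_points = rng.random((5, 16))     # cluster reference points
c = 100.0                            # separation constant, larger than any partition radius
sorted_keys, order = build_idistance(points, ref_points, c)
q = rng.random(16)
print(len(range_query(points, ref_points, c, sorted_keys, order, q, r=0.8)),
      "images within radius 0.8 of the query")
```

Note that the keys depend on the distance function, so re-weighting the distance measure after user feedback invalidates the stored keys unless the index adapts; this is the efficiency issue the demo examines.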