Hashing is an effective method for retrieving similar images from a large-scale database. However, achieving a good recall rate with a single hash table requires searching a number of hash buckets that grows exponentially with the Hamming distance, which is time consuming. Taking the union of results from multiple hash tables (multi-hashing) with exact hash code matching yields a high recall rate but a low precision rate. Existing methods that filter out dissimilar images rely on the Hamming distance or the hash code difference between the query and candidate images. However, they treat all hash buckets as equally important, which is generally not true: different buckets may return different numbers of images and contribute differently to the hashing results. We propose two location-based sensitivity descriptors, a bucket sensitivity measure that scores each hash bucket and a location sensitivity measure that scores the candidate images it contains. A radial basis function neural network (RBFNN) is trained to filter out dissimilar images based on the Hamming distance, the hash code difference, and the two proposed descriptors. Since the Hamming distance and the hash code difference are readily computed by all hashing-based image retrieval methods, and both the RBFNN and the two sensitivity-based descriptors are computed offline once the hash tables become available, the proposed sensitivity-based image filtering method is efficient for large-scale image retrieval. Experimental results on four large-scale databases show that the proposed method improves precision at the expense of a small drop in recall for both data-dependent and data-independent multi-hashing methods, as well as for multi-hashing that combines both types.
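
The two search regimes contrasted above can be sketched as follows: probing a single table out to Hamming radius r visits a number of buckets that grows combinatorially in r, while multi-hashing takes the union of exact-match buckets across several tables. This is an illustrative sketch only; the function names and table representation are ours, not from the paper.

```python
from itertools import combinations

def hamming_ball(code, radius, n_bits):
    """All codes within the given Hamming radius of `code`.

    The count grows as sum of C(n_bits, r) for r = 0..radius, which is
    the exponential bucket blow-up that single-table probing suffers.
    """
    results = [code]
    for r in range(1, radius + 1):
        for bits in combinations(range(n_bits), r):
            flipped = code
            for b in bits:
                flipped ^= 1 << b  # flip the chosen bits
            results.append(flipped)
    return results

def multi_table_lookup(tables, codes):
    """Multi-hashing: union of exact-match buckets, one per hash table.

    `tables` maps hash codes to sets of image ids; `codes` holds the
    query's code under each table's hash function.
    """
    candidates = set()
    for table, code in zip(tables, codes):
        candidates |= table.get(code, set())
    return candidates

# Radius-2 probe of a 16-bit table already touches 1 + 16 + 120 = 137 buckets.
print(len(hamming_ball(0b0, 2, 16)))
```

The union in `multi_table_lookup` is what drives recall up while letting precision fall, since every table contributes its bucket's images regardless of that bucket's reliability; the proposed sensitivity descriptors are meant to rescore exactly those contributions.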