evaluateImageRetrieval

Evaluate image search results

Description

averagePrecision = evaluateImageRetrieval(queryImage,imageIndex,expectedIDs) returns the average precision metric for measuring the accuracy of image search results for the queryImage. The expectedIDs input contains the indices of images within imageIndex that are known to be similar to the query image.

[averagePrecision,imageIDs,scores] = evaluateImageRetrieval(queryImage,imageIndex,expectedIDs) optionally returns the indices corresponding to images within imageIndex that are visually similar to the query image. It also returns the corresponding similarity scores.

[averagePrecision,imageIDs,scores] = evaluateImageRetrieval(___,Name,Value) uses additional options specified by one or more Name,Value pair arguments, using any of the preceding syntaxes.
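A minimal sketch of the calling forms follows; the variable names and the expected ID value of 5 are placeholders, and it assumes imageIndex was created beforehand with indexImages.

% Average precision only
averagePrecision = evaluateImageRetrieval(queryImage,imageIndex,5);

% Also return the ranked image IDs and their similarity scores
[averagePrecision,imageIDs,scores] = evaluateImageRetrieval(queryImage,imageIndex,5);

% Evaluate only the top 10 search results
averagePrecision = evaluateImageRetrieval(queryImage,imageIndex,5,'NumResults',10);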

Examples

Evaluate a Single Query Image

Define a set of book cover images.

dataDir = fullfile(toolboxdir('vision'),'visiondata','bookCovers');
bookCovers = imageDatastore(dataDir);

Display the set of images.

thumbnailGallery = [];
for i = 1:length(bookCovers.Files)
    img = readimage(bookCovers,i); 
    thumbnail = imresize(img,[300 300]);
    thumbnailGallery = cat(4,thumbnailGallery,thumbnail);
end
figure
montage(thumbnailGallery);

Index the images. This will take a few minutes.

imageIndex = indexImages(bookCovers);
Creating an inverted image index using Bag-Of-Features.
-------------------------------------------------------

Creating Bag-Of-Features.
-------------------------

* Selecting feature point locations using the Detector method.
* Extracting SURF features from the selected feature point locations.
** detectSURFFeatures is used to detect key points for feature extraction.

* Extracting features from 58 images...done. Extracted 29216 features.

* Keeping 80 percent of the strongest features from each category.

* Balancing the number of features across all image categories to improve clustering.
** Image category 1 has the least number of strongest features: 23373.
** Using the strongest 23373 features from each of the other image categories.

* Using K-Means clustering to create a 20000 word visual vocabulary.
* Number of features          : 23373
* Number of clusters (K)      : 20000

* Initializing cluster centers...100.00%.
* Clustering...completed 18/100 iterations (~0.67 seconds/iteration)...converged in 18 iterations.

* Finished creating Bag-Of-Features


Encoding images using Bag-Of-Features.
--------------------------------------

* Encoding 58 images...done.
Finished creating the image index.

Select and display the query image.

queryDir = fullfile(dataDir,'queries',filesep);
query = imread([queryDir 'query2.jpg']);

figure
imshow(query)

Evaluation requires knowing the expected results. Here, the query image is known to be the 3rd book in the imageIndex.

expectedID = 3;

Find and report the average precision score.

[averagePrecision,actualIDs] = evaluateImageRetrieval(query,...
    imageIndex,expectedID);

fprintf('Average Precision: %f\n\n',averagePrecision)
Average Precision: 0.029412

Show the query and best match side-by-side.

bestMatch = actualIDs(1);
bestImage = imread(imageIndex.ImageLocation{bestMatch});

figure
imshowpair(query,bestImage,'montage')

Compute Mean Average Precision (MAP) for Multiple Queries

Create an image set of book covers.

dataDir = fullfile(toolboxdir('vision'),'visiondata','bookCovers');
bookCovers = imageDatastore(dataDir);

Index the image set. The indexing may take a few minutes.

imageIndex = indexImages(bookCovers,'Verbose',false);

Create a set of query images.

queryDir = fullfile(dataDir,'queries',filesep);
querySet = imageDatastore(queryDir);

Specify the expected search results for each query image.

expectedIDs = [1 2 3];

Evaluate each query image and collect average precision scores.

for i = 1:numel(querySet.Files)
    query = readimage(querySet,i);
    averagePrecision(i) = evaluateImageRetrieval(query, imageIndex, expectedIDs(i));
end

Compute mean average precision (MAP).

map = mean(averagePrecision)
map = 1

Input Arguments

queryImage

Input query image, specified as either an M-by-N-by-3 truecolor image or an M-by-N 2-D grayscale image.

Data Types: single | double | int16 | uint8 | uint16 | logical

imageIndex

Image search index, specified as an invertedImageIndex object. The indexImages function creates the invertedImageIndex object, which stores the data used for the image search.

expectedIDs

Expected image indices, specified as a row or column vector. The indices correspond to the images within imageIndex that are known to be similar to the query image.

Name-Value Pair Arguments

Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside quotes. You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.

Example: 'NumResults',25

Maximum number of search results to evaluate, specified as the comma-separated pair consisting of 'NumResults' and a positive integer value. The function evaluates the top NumResults and returns the average-precision-at-NumResults metric.
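For instance, a brief sketch, reusing the query, imageIndex, and expectedID variables from the examples above, that evaluates only the ten highest-ranked results:

% Average-precision-at-10 for the query image
averagePrecisionAt10 = evaluateImageRetrieval(query,imageIndex,expectedID,'NumResults',10);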

Rectangular search region within the query image, specified as the comma-separated pair consisting of 'ROI' and a four-element vector of the form [x y width height].
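A similar sketch restricts the search to a region of the query image; the rectangle values here are illustrative only:

% Search only a 400-by-300 pixel region of the query, starting at pixel (50,50)
searchROI = [50 50 400 300];
averagePrecision = evaluateImageRetrieval(query,imageIndex,expectedID,'ROI',searchROI);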

Output Arguments

averagePrecision

Average precision metric, returned as a scalar value in the range [0 1]. The average precision metric represents the accuracy of image search results for the query image.

imageIDs

Ranked indices of retrieved images, returned as an M-by-1 vector. The image IDs are returned in ranked order, from the most similar to the least similar match.

scores

Similarity metric, returned as an M-by-1 vector containing one score for each retrieved image in the imageIDs output. The scores are computed using cosine similarity and range from 0 to 1.
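As an illustration of how the outputs relate, again reusing the variables from the book cover example, the ranked image IDs and their scores can be inspected together:

[averagePrecision,imageIDs,scores] = evaluateImageRetrieval(query,imageIndex,expectedID);

% List the five highest-ranked images with their similarity scores
topK = min(5,numel(imageIDs));
table(imageIDs(1:topK),scores(1:topK),'VariableNames',{'ImageID','Score'})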

Introduced in R2015a