1.1. Integration with your favourite IDE
We could now go ahead and start playing with the code in a text editor, however this really isn’t recommended! Using a good Integrated Development Environment (IDE) with auto-completion will make your experience much better.
Maven integrates with all the popular IDEs. The OpenIMAJ developers all use Eclipse so that is what we’re most familiar with, however we should be able to help getting it set up in a different IDE if you wish.
Integration with Eclipse is quite simple. From the command line, inside the project directory, issue the command:
mvn eclipse:eclipse
This will generate Eclipse project files in the same directory. In Eclipse you can then import the project into the Eclipse workspace (File > import..., choose Existing projects into workspace, select the project directory, make sure Copy projects into workspace is unchecked, then click Finish). The project should then appear in the workspace and you’ll be able to look at the App.java file that was generated by the archetype.
IMPORTANT By default Eclipse doesn’t know about Maven and its repositories of jars. When you first import an OpenIMAJ project into Eclipse it will have errors. You can fix this by adding a new Java classpath variable (Eclipse > Preferences > Java > Build Path > Classpath Variables) called M2_REPO. The value of this variable is the location of your .m2/repository directory. For Unix systems this is usually found in your home directory, for Windows systems it is found in C:\Documents and Settings\\.
Chapter 5. SIFT and feature matching
In this tutorial we’ll look at how to compare images to each other. Specifically, we’ll use a popular local feature descriptor called SIFT to extract some interesting points from images and describe them in a standard way. Once we have these local features and their descriptions, we can match local features to each other and therefore compare images to each other, or find a visual query image within a target image, as we will do in this tutorial.
Firstly, let’s load up a couple of images. Here we have a magazine and a scene containing the magazine:
MBFImage query = ImageUtilities.readMBF(new URL("http://dl.dropbox.com/u/8705593/query.jpg"));
MBFImage target = ImageUtilities.readMBF(new URL("http://dl.dropbox.com/u/8705593/target.jpg"));
The first step is feature extraction. We’ll use the difference-of-Gaussian feature detector which we describe with a SIFT descriptor. The features we find are described in a way which makes them invariant to size changes, rotation and position. These are quite powerful features and are used in a variety of tasks. The standard implementation of SIFT in OpenIMAJ can be found in the DoGSIFTEngine class:
DoGSIFTEngine engine = new DoGSIFTEngine();
LocalFeatureList<Keypoint> queryKeypoints = engine.findFeatures(query.flatten());
LocalFeatureList<Keypoint> targetKeypoints = engine.findFeatures(target.flatten());
Once the engine is constructed, we can use it to extract Keypoint objects from our images. The Keypoint class contains a public field called ivec which, in the case of a standard SIFT descriptor, is a 128-dimensional description of a patch of pixels around a detected point. Various distance measures can be used to compare one Keypoint to another.
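To make the idea of a distance measure concrete, here is a minimal sketch of the most common choice, Euclidean distance, applied to two 128-dimensional descriptors. Note this uses plain double arrays rather than OpenIMAJ’s Keypoint.ivec field, purely to keep the example self-contained.

```java
// Sketch: comparing two SIFT-style descriptors by Euclidean distance.
// In OpenIMAJ the descriptor lives in Keypoint.ivec; here we use plain
// double[] arrays so the example runs on its own.
public class DescriptorDistance {
    // Euclidean distance between two equal-length descriptors
    public static double euclidean(double[] a, double[] b) {
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            sum += d * d;
        }
        return Math.sqrt(sum);
    }

    public static void main(String[] args) {
        double[] d1 = new double[128];
        double[] d2 = new double[128];
        d2[0] = 3; d2[1] = 4; // differs only in the first two dimensions
        System.out.println(euclidean(d1, d2)); // prints 5.0
    }
}
```

A smaller distance means the two patches look more alike; a matcher’s job is then to decide which of those small distances correspond to genuine matches.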
The challenge in comparing Keypoints is trying to figure out which Keypoints from some query image match those from some target image. The most basic approach is to take a given Keypoint in the query and find the Keypoint that is closest in the target. A minor improvement on top of this is to disregard those points which match well with MANY other points in the target. Such points are considered non-descriptive. Matching can be achieved in OpenIMAJ using the BasicMatcher. Next we’ll construct and set up such a matcher:
LocalFeatureMatcher<Keypoint> matcher = new BasicMatcher<Keypoint>(80);
matcher.setModelFeatures(queryKeypoints);
matcher.findMatches(targetKeypoints);
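The nearest-neighbour idea behind this kind of matcher can be sketched in plain Java. This is a hypothetical illustration, not the BasicMatcher implementation: it finds the closest target descriptor for each query descriptor and rejects ambiguous (non-descriptive) matches where the second-best distance is nearly as small as the best. BasicMatcher’s own threshold plays a similar role but is expressed differently.

```java
// Hypothetical nearest-neighbour matcher sketch, using double[] descriptors
// instead of OpenIMAJ Keypoints. Not part of the OpenIMAJ API.
public class NearestNeighbourSketch {
    static double dist(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) { double d = a[i] - b[i]; s += d * d; }
        return Math.sqrt(s);
    }

    /** For each query descriptor, the index of its match in target, or -1 if ambiguous. */
    public static int[] match(double[][] query, double[][] target, double ratio) {
        int[] matches = new int[query.length];
        for (int q = 0; q < query.length; q++) {
            double best = Double.MAX_VALUE, second = Double.MAX_VALUE;
            int bestIdx = -1;
            for (int t = 0; t < target.length; t++) {
                double d = dist(query[q], target[t]);
                if (d < best) { second = best; best = d; bestIdx = t; }
                else if (d < second) second = d;
            }
            // reject non-descriptive matches: the best distance must be
            // clearly smaller than the second-best
            matches[q] = (best < ratio * second) ? bestIdx : -1;
        }
        return matches;
    }

    public static void main(String[] args) {
        double[][] query = {{0, 0}};
        double[][] target = {{0, 1}, {5, 5}};
        System.out.println(match(query, target, 0.8)[0]); // prints 0: unambiguous match
    }
}
```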
We can now draw the matches between these two images found with this basic matcher using the MatchingUtilities class:
MBFImage basicMatches = MatchingUtilities.drawMatches(query, target, matcher.getMatches(), RGBColour.RED);
DisplayUtilities.display(basicMatches);
As you can see, the basic matcher finds many matches, many of which are clearly incorrect. A more advanced approach is to filter the matches based on a given geometric model. One way of achieving this in OpenIMAJ is to use a ConsistentLocalFeatureMatcher which, given an internal matcher, a geometric model and a model fitter, finds which of the matches given by the internal matcher are consistent with respect to the model and are therefore likely to be correct.
To demonstrate this, we’ll use an algorithm called Random Sample Consensus (RANSAC) to fit a geometric model called an Affine transform to the initial set of matches. This is achieved by iteratively selecting a random set of matches, learning a model from this random set and then testing the remaining matches against the learnt model.
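The select-fit-test cycle described above can be sketched in plain Java. To keep it short, this hypothetical example fits a straight line y = a·x + b to noisy 2D points rather than an Affine transform to feature matches; it is not OpenIMAJ’s RANSAC implementation, but the loop structure is the same: pick a minimal random sample, learn a model from it, then count how many of the remaining points agree with that model.

```java
import java.util.Random;

// Hypothetical RANSAC sketch: fit y = a*x + b to points with outliers.
public class RansacLineSketch {
    public static double[] fitLine(double[][] pts, int iterations, double tol, Random rng) {
        double[] best = null;
        int bestInliers = -1;
        for (int it = 0; it < iterations; it++) {
            // 1. randomly select a minimal sample (two points define a line)
            int i = rng.nextInt(pts.length), j = rng.nextInt(pts.length);
            if (i == j || pts[i][0] == pts[j][0]) continue;
            // 2. learn a model from the random sample
            double a = (pts[j][1] - pts[i][1]) / (pts[j][0] - pts[i][0]);
            double b = pts[i][1] - a * pts[i][0];
            // 3. test all points against the learnt model, counting inliers
            int inliers = 0;
            for (double[] p : pts)
                if (Math.abs(p[1] - (a * p[0] + b)) < tol) inliers++;
            if (inliers > bestInliers) { bestInliers = inliers; best = new double[]{a, b}; }
        }
        return best; // the model supported by the most inliers
    }

    public static void main(String[] args) {
        // ten points on y = 2x + 1, plus two gross outliers
        double[][] pts = {{0,1},{1,3},{2,5},{3,7},{4,9},{5,11},{6,13},{7,15},
                          {8,17},{9,19},{2,40},{7,-30}};
        double[] model = fitLine(pts, 200, 0.5, new Random(1));
        System.out.printf("a=%.2f b=%.2f%n", model[0], model[1]);
    }
}
```

In the tutorial’s setting, the “minimal sample” is a small set of point matches, the “model” is the Affine transform, and the inlier test asks whether each remaining match is consistent with that transform.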
Tip: An Affine transform models the transformation between two parallelograms.
We’ll now set up our model, our RANSAC model fitter and our consistent matcher:
AffineTransformModel fittingModel = new AffineTransformModel(5);
RANSAC<Point2d, Point2d> ransac =
new RANSAC<Point2d, Point2d>(fittingModel, 1500, new RANSAC.PercentageInliersStoppingCondition(0.5), true);
matcher = new ConsistentLocalFeatureMatcher2d<Keypoint>(
new FastBasicKeypointMatcher<Keypoint>(8), ransac);
matcher.setModelFeatures(queryKeypoints);
matcher.findMatches(targetKeypoints);
MBFImage consistentMatches = MatchingUtilities.drawMatches(query, target,
matcher.getMatches(), RGBColour.RED);
DisplayUtilities.display(consistentMatches);
The AffineTransformModel class models a two-dimensional Affine transform in OpenIMAJ. An interesting byproduct of this technique is that the AffineTransformModel contains the best transform matrix to go from the query to the target. We can take advantage of this by transforming the bounding box of our query with the transform estimated in the AffineTransformModel; we can therefore draw a polygon around the estimated location of the query within the target:
target.drawShape(query.getBounds().transform(fittingModel.getTransform().inverse()), 3, RGBColour.BLUE);
DisplayUtilities.display(target);
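What that bounding-box projection does under the hood can be sketched as follows: each corner (x, y) of the query’s rectangle is mapped to (m00·x + m01·y + m02, m10·x + m11·y + m12) by the 2D affine matrix, and the four mapped corners form the polygon drawn on the target. The matrix below is a hard-coded scale-and-translate example, not one estimated by RANSAC.

```java
// Sketch: applying a 2D affine transform to the corners of a rectangle.
// The matrix here is illustrative, not an estimated model.
public class AffineCornersSketch {
    public static double[][] transform(double[][] corners, double[][] m) {
        double[][] out = new double[corners.length][2];
        for (int i = 0; i < corners.length; i++) {
            double x = corners[i][0], y = corners[i][1];
            out[i][0] = m[0][0] * x + m[0][1] * y + m[0][2];
            out[i][1] = m[1][0] * x + m[1][1] * y + m[1][2];
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] box = {{0, 0}, {100, 0}, {100, 50}, {0, 50}}; // query bounds
        double[][] m = {{0.5, 0, 20}, {0, 0.5, 10}}; // halve the size, shift by (20, 10)
        double[][] projected = transform(box, m);
        // projected now holds the polygon to draw on the target image
        System.out.println(projected[2][0] + ", " + projected[2][1]); // prints 70.0, 35.0
    }
}
```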
5.1. Exercises
5.1.1. Exercise 1: Different matchers
Experiment with different matchers; try the BasicTwoWayMatcher for example.
5.1.2. Exercise 2: Different models
Experiment with different models (such as a HomographyModel) in the consistent matcher.