Face Comparison

The Face SDK provides a convenient and powerful way to compare two or more facial images and determine whether the faces belong to the same person.

Face comparison can be performed both on the server side and the client side. Navigate to the Client-Side Comparison page for detailed information on using the latter.

The code snippets below demonstrate matching two facial images on each supported platform:

Swift

let images = [
    MatchFacesImage(image: UIImage(named: "firstImage")!, imageType: .printed),
    MatchFacesImage(image: UIImage(named: "secondImage")!, imageType: .printed),
]
let request = MatchFacesRequest(images: images)

FaceSDK.service.matchFaces(request, completion: { response in
    // ... check response.results for results with score and similarity values.
})

Objective-C

NSArray<RFSMatchFacesImage *> *matchRequestImages = @[
    [[RFSMatchFacesImage alloc] initWithImage:[UIImage imageNamed:@"firstImage"] imageType:RFSImageTypePrinted],
    [[RFSMatchFacesImage alloc] initWithImage:[UIImage imageNamed:@"secondImage"] imageType:RFSImageTypePrinted],
];
RFSMatchFacesRequest *request = [[RFSMatchFacesRequest alloc] initWithImages:matchRequestImages];

[RFSFaceSDK.service matchFaces:request completion:^(RFSMatchFacesResponse * _Nullable response) {
    // ... check response.results for results with score and similarity values.
}];

Kotlin

val request = MatchFacesRequest(
    listOf(
        MatchFacesImage(firstImageBitmap, ImageType.IMAGE_TYPE_PRINTED),
        MatchFacesImage(secondImageBitmap, ImageType.IMAGE_TYPE_PRINTED),
    )
)

FaceSDK.Instance().matchFaces(request) { response ->
    // ... check response.results for results with score and similarity values.
}

Java

List<MatchFacesImage> images = Arrays.asList(
    new MatchFacesImage(firstImageBitmap, ImageType.IMAGE_TYPE_PRINTED),
    new MatchFacesImage(secondImageBitmap, ImageType.IMAGE_TYPE_PRINTED)
);
MatchFacesRequest request = new MatchFacesRequest(images);

FaceSDK.Instance().matchFaces(request, response -> {
    // ... check response.getResults() for results with score and similarity values.
});

React Native

const firstImage = new MatchFacesImage();
firstImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
firstImage.bitmap = firstImageBitmapAsBase64String;

const secondImage = new MatchFacesImage();
secondImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
secondImage.bitmap = secondImageBitmapAsBase64String;

const request = new MatchFacesRequest();
request.images = [firstImage, secondImage];

FaceSDK.matchFaces(JSON.stringify(request), matchFacesResponse => {
    const response = MatchFacesResponse.fromJson(JSON.parse(matchFacesResponse));
    // ... check response.results for results with score and similarity values.
}, e => { });

Flutter

var firstImage = MatchFacesImage();
firstImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
firstImage.bitmap = firstImageBitmapAsBase64String;

var secondImage = MatchFacesImage();
secondImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
secondImage.bitmap = secondImageBitmapAsBase64String;

var request = MatchFacesRequest();
request.images = [firstImage, secondImage];
FaceSDK.matchFaces(jsonEncode(request)).then((matchFacesResponse) {
    var response = MatchFacesResponse.fromJson(json.decode(matchFacesResponse));
    // ... check response?.results for results with score and similarity values.
});

Cordova

const firstImage = new FaceSDK.MatchFacesImage();
firstImage.imageType = FaceSDK.Enum.ImageType.IMAGE_TYPE_PRINTED;
firstImage.bitmap = firstImageBitmapAsBase64String;

const secondImage = new FaceSDK.MatchFacesImage();
secondImage.imageType = FaceSDK.Enum.ImageType.IMAGE_TYPE_PRINTED;
secondImage.bitmap = secondImageBitmapAsBase64String;

const request = new FaceSDK.MatchFacesRequest();
request.images = [firstImage, secondImage];

FaceSDK.matchFaces(JSON.stringify(request), matchFacesResponse => {
    const response = FaceSDK.MatchFacesResponse.fromJson(JSON.parse(matchFacesResponse));
    // ... check response.results for results with score and similarity values.
}, e => { });

Ionic

const firstImage = new MatchFacesImage();
firstImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
firstImage.bitmap = firstImageBitmapAsBase64String;

const secondImage = new MatchFacesImage();
secondImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
secondImage.bitmap = secondImageBitmapAsBase64String;

const request = new MatchFacesRequest();
request.images = [firstImage, secondImage];

FaceSDK.matchFaces(JSON.stringify(request)).then(matchFacesResponse => {
    const response = MatchFacesResponse.fromJson(JSON.parse(matchFacesResponse));
    // ... check response.results for results with score and similarity values.
}, e => { });

Warning

imageType provides information about the source of the image and influences the similarity of matching results.

Request

The MatchFacesRequest instance represents all the parameters required for a face matching operation.

  • The images array contains the input images, each bundled in the MatchFacesImage type.
  • The boolean thumbnails property defines whether the response detections should contain a thumbnailImage. Enabling it is shown in the sketch after this list.
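
A minimal Kotlin sketch of building a request with thumbnails enabled; the thumbnails setter on the request is an assumption based on the property described above:

val request = MatchFacesRequest(
    listOf(
        MatchFacesImage(firstImageBitmap, ImageType.IMAGE_TYPE_PRINTED),
        MatchFacesImage(secondImageBitmap, ImageType.IMAGE_TYPE_PRINTED),
    )
)
// Include a small cropped face image (thumbnailImage) with each detection.
// The setter name is assumed from the property described above.
request.thumbnails = true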

MatchFacesImage

MatchFacesImage bundles the information about an input image and configures how detection and matching are performed.

  • The image property is a platform-specific representation of an image.
  • The imageType provides information about the source of the image and influences the similarity of matching results.
  • The boolean detectAll property defines whether comparison and detection should apply to all faces found on the image. When set to false, only the most centered face is detected and compared. See the sketch after this list.
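
The detectAll flag is useful when one of the inputs may contain several faces, such as a group photo. A hedged Kotlin sketch, assuming the constructor accepts detectAll as a third argument; groupPhotoBitmap and portraitBitmap are placeholder variables:

val images = listOf(
    // Detect and compare every face found on the group photo.
    MatchFacesImage(groupPhotoBitmap, ImageType.IMAGE_TYPE_PRINTED, true),
    // Use only the most centered face of the portrait.
    MatchFacesImage(portraitBitmap, ImageType.IMAGE_TYPE_PRINTED, false),
)
val request = MatchFacesRequest(images)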

Response

The MatchFacesResponse contains the results array of the ComparedFacesPair type and the detections array of the MatchFacesDetection type.
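
In Kotlin, both arrays can be read straight off the response; a short sketch, with the detections accessor assumed from the description above:

FaceSDK.Instance().matchFaces(request) { response ->
    // One ComparedFacesPair per compared pair of faces.
    val pairs = response.results
    // Detection details (MatchFacesDetection) for the processed input images.
    val detections = response.detections
}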

ComparedFacesPair

Each ComparedFacesPair represents an attempt to compare two input faces, holding either the resulting similarity value or an error describing what went wrong. The pair also references the two MatchFacesComparedFace instances that were used for matching.
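
For example, each pair can be checked for an error before its similarity is read. A Kotlin sketch; the similarity and exception accessors are assumptions based on the description above:

response.results.forEach { pair ->
    val error = pair.exception
    if (error != null) {
        // The comparison of this pair failed; inspect the error.
    } else {
        // The similarity value for this pair of faces.
        val similarity = pair.similarity
    }
}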

MatchFacesComparedFace

The compared face contains a reference back to the input image in the form of a MatchFacesImage, as well as a reference to the MatchFacesDetectionFace that holds the detection details. Both references can be resolved via the corresponding indices, imageIndex and faceIndex, as shown in the sketch below.
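
A Kotlin sketch of resolving those references, continuing the examples above; the first accessor of the pair and the faces list of a detection are assumptions based on the description above:

val pair = response.results.first()
// One of the two compared faces of the pair.
val face = pair.first
// Resolve the face back to its input image...
val sourceImage = request.images[face.imageIndex]
// ...and to the detection details for that image and face.
val detection = response.detections[face.imageIndex]
val detectedFace = detection.faces[face.faceIndex]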

MatchFacesDetectionFace

The MatchFacesDetectionFace covers the landmarks (eyes, nose, lips, ears, etc.) and the rectangular area of the detected face in the original image. The detection may also contain a thumbnailImage property with a small face image automatically cropped from the original input; see the thumbnails boolean property of the MatchFacesRequest.
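
If thumbnails were requested, the cropped images can be collected from the detections. A Kotlin sketch; the faces, faceRect, and thumbnailImage names are assumptions based on the description above:

response.detections.forEach { detection ->
    detection.faces.forEach { face ->
        // Rectangular area of the detected face in the original image.
        val area = face.faceRect
        // Present only when the request was created with thumbnails enabled.
        val thumbnail = face.thumbnailImage
    }
}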

Info

For more information on MatchFacesRequest and MatchFacesResponse, please see the SDK Reference.

Similarity Threshold Split

The MatchFacesSimilarityThresholdSplit allows you to specify a similarity threshold and split the matching results into two separate groups. Pairs whose similarity is greater than or equal to the threshold are written to the matchedFaces array; all other pairs go to the unmatchedFaces array.

Swift

let split = MatchFacesSimilarityThresholdSplit(pairs: response.results, bySimilarityThreshold: 0.75)
// ... check split.matchedFaces for the matched faces.
// ... check split.unmatchedFaces for unmatched faces.

Objective-C

RFSMatchFacesSimilarityThresholdSplit *split = [RFSMatchFacesSimilarityThresholdSplit splitPairs:response.results bySimilarityThreshold:@0.75];
// ... check split.matchedFaces for the matched faces.
// ... check split.unmatchedFaces for unmatched faces.

Java

MatchFacesSimilarityThresholdSplit split = new MatchFacesSimilarityThresholdSplit(response.getResults(), 0.75);
// ... check split.getMatchedFaces() for the matched faces.
// ... check split.getUnmatchedFaces() for the unmatched faces.

Kotlin

val split = MatchFacesSimilarityThresholdSplit(response.results, 0.75)
// ... check split.matchedFaces for the matched faces.
// ... check split.unmatchedFaces for the unmatched faces.

React Native

FaceSDK.matchFacesSimilarityThresholdSplit(JSON.stringify(response.results), 0.75, splitResponse => {
    const split = MatchFacesSimilarityThresholdSplit.fromJson(JSON.parse(splitResponse))
    // ... check split.matchedFaces for the matched faces.
    // ... check split.unmatchedFaces for the unmatched faces.
}, e => { })

Flutter

FaceSDK.matchFacesSimilarityThresholdSplit(jsonEncode(response.results), 0.75).then((splitResponse) {
    var split = Regula.MatchFacesSimilarityThresholdSplit.fromJson(json.decode(splitResponse));
    // ... check split.matchedFaces for the matched faces.
    // ... check split.unmatchedFaces for the unmatched faces.
});

Cordova

FaceSDK.matchFacesSimilarityThresholdSplit(JSON.stringify(response.results), 0.75, splitResponse => {
    const split = FaceSDK.MatchFacesSimilarityThresholdSplit.fromJson(JSON.parse(splitResponse))
    // ... check split.matchedFaces for the matched faces.
    // ... check split.unmatchedFaces for the unmatched faces.
}, e => { })

Ionic

FaceSDK.matchFacesSimilarityThresholdSplit(JSON.stringify(response.results), 0.75).then(splitResponse => {
    const split = MatchFacesSimilarityThresholdSplit.fromJson(JSON.parse(splitResponse))
    // ... check split.matchedFaces for the matched faces.
    // ... check split.unmatchedFaces for the unmatched faces.
});