
Match Faces

The Face SDK provides a convenient and powerful way to compare faces in two or more images and find out how similar they are!

Request

The code snippets below demonstrate matching two images:

Warning

Declaring the correct imageType is required, as it influences the matching results. The examples below use the printed type.

Swift

let images = [
    Image(image: UIImage(named: "firstImage")!, type: .printed),
    Image(image: UIImage(named: "secondImage")!, type: .printed),
]
let request = MatchFacesRequest(images: images)

FaceSDK.service.matchFaces(request, completion: { response in
    // ... check response.matchedFaces for similar faces.
})

Objective-C

NSArray<RFSImage *> *matchRequestImages = @[
    [[RFSImage alloc] initWithImage:[UIImage imageNamed:@"firstImage"] type:RFSImageTypePrinted],
    [[RFSImage alloc] initWithImage:[UIImage imageNamed:@"secondImage"] type:RFSImageTypePrinted],
];
RFSMatchFacesRequest *request = [[RFSMatchFacesRequest alloc] initWithImages:matchRequestImages];

[RFSFaceSDK.service matchFaces:request completion:^(RFSMatchFacesResponse * _Nullable response) {
    // ... check response.matchedFaces for similar faces.
}];

Kotlin

val request = MatchFacesRequest(
    listOf(
        Image(ImageType.IMAGE_TYPE_PRINTED, firstImageBitmap),
        Image(ImageType.IMAGE_TYPE_PRINTED, secondImageBitmap),
    )
)

FaceSDK.Instance().matchFaces(request) { response ->
    // ... check response.matchedFaces for similar faces.
}

Java

List<Image> images = Arrays.asList(
    new Image(ImageType.IMAGE_TYPE_PRINTED, firstImageBitmap),
    new Image(ImageType.IMAGE_TYPE_PRINTED, secondImageBitmap)
);
MatchFacesRequest request = new MatchFacesRequest(images);

FaceSDK.Instance().matchFaces(request, response -> {
    // ... check response.getMatchedFaces() for similar faces.
});

JavaScript

const firstImage = new Image();
firstImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
firstImage.bitmap = firstImageBitmapAsBase64String;

const secondImage = new Image();
secondImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
secondImage.bitmap = secondImageBitmapAsBase64String;

const request = new MatchFacesRequest();
request.images = [firstImage, secondImage];

FaceSDK.matchFaces(JSON.stringify(request), matchFacesResponse => {
    const response = MatchFacesResponse.fromJson(JSON.parse(matchFacesResponse));
    // ... check response.matchedFaces for similar faces.
}, e => { });

Dart

var firstImage = Image();
firstImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
firstImage.bitmap = firstImageBitmapAsBase64String;

var secondImage = Image();
secondImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
secondImage.bitmap = secondImageBitmapAsBase64String;

var request = MatchFacesRequest();
request.images = [firstImage, secondImage];
FaceSDK.matchFaces(jsonEncode(request)).then((matchFacesResponse) {
    var response = MatchFacesResponse.fromJson(json.decode(matchFacesResponse));
    // ... check response?.matchedFaces for similar faces.
});

JavaScript

const firstImage = new FaceSDK.Image();
firstImage.imageType = FaceSDK.Enum.ImageType.IMAGE_TYPE_PRINTED;
firstImage.bitmap = firstImageBitmapAsBase64String;

const secondImage = new FaceSDK.Image();
secondImage.imageType = FaceSDK.Enum.ImageType.IMAGE_TYPE_PRINTED;
secondImage.bitmap = secondImageBitmapAsBase64String;

const request = new FaceSDK.MatchFacesRequest();
request.images = [firstImage, secondImage];

FaceSDK.matchFaces(JSON.stringify(request), matchFacesResponse => {
    const response = FaceSDK.MatchFacesResponse.fromJson(JSON.parse(matchFacesResponse));
    // ... check response.matchedFaces for similar faces.
}, e => { });

JavaScript

const firstImage = new Image();
firstImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
firstImage.bitmap = firstImageBitmapAsBase64String;

const secondImage = new Image();
secondImage.imageType = ImageType.IMAGE_TYPE_PRINTED;
secondImage.bitmap = secondImageBitmapAsBase64String;

const request = new MatchFacesRequest();
request.images = [firstImage, secondImage];

FaceSDK.matchFaces(JSON.stringify(request)).then(matchFacesResponse => {
    const response = MatchFacesResponse.fromJson(JSON.parse(matchFacesResponse));
    // ... check response.matchedFaces for similar faces.
}, e => { });

MatchFacesRequest allows you to specify a similarity threshold for faces. If the similarity of a pair of faces is greater than or equal to the threshold, the pair is written to the matchedFaces array; otherwise, it goes to the unmatchedFaces array.

The threshold defaults to 0.75, but you can change it if needed.

Swift

let request = MatchFacesRequest(images: images)
request.similarityThreshold = 0.75

Objective-C

RFSMatchFacesRequest *request = [[RFSMatchFacesRequest alloc] initWithImages:images];
request.similarityThreshold = @0.75;

Java

MatchFacesRequest request = new MatchFacesRequest(images, 0.75F);

Kotlin

val request = MatchFacesRequest(images, 0.75f)

JavaScript

const request = new MatchFacesRequest();
request.similarityThreshold = 0.75;

Dart

var request = MatchFacesRequest();
request.similarityThreshold = 0.75;

JavaScript

const request = new FaceSDK.MatchFacesRequest();
request.similarityThreshold = 0.75;

JavaScript

const request = new MatchFacesRequest();
request.similarityThreshold = 0.75;

Response

MatchFacesResponse contains arrays of matched and unmatched pairs. Each ComparedFacesPair represents an attempt to compare two of the input images and holds either the resulting similarity value or an error describing what went wrong.
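
For example, a minimal Swift sketch of inspecting the response could look like the following. The matchedFaces, unmatchedFaces, similarity, and error members are taken from the description above; the exact property names and types may differ, so consult the SDK Reference.

FaceSDK.service.matchFaces(request, completion: { response in
    // Pairs whose similarity reached the threshold.
    for pair in response.matchedFaces {
        print("Matched pair, similarity: \(String(describing: pair.similarity))")
    }

    // Pairs below the threshold, or pairs that could not be compared.
    for pair in response.unmatchedFaces {
        if let error = pair.error {
            // The pair could not be compared; the error describes what went wrong.
            print("Comparison failed: \(error.localizedDescription)")
        } else {
            print("Unmatched pair, similarity: \(String(describing: pair.similarity))")
        }
    }
})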

Info

For more information on MatchFacesRequest and MatchFacesResponse, see the SDK Reference.
