Welcome to the Correlation-Techniques-for-Face-Recognition wiki!

This project studies correlation techniques for facial recognition. We explore the capabilities of correlation techniques as a segmentation-free alternative for biometric recognition. Correlation techniques were historically developed in the context of optical correlators; in this project, however, a digital implementation of correlation is considered as an illustration of the scope of these techniques. A simple application that performs identity verification and a simple threshold-based decision algorithm are developed in MATLAB. By means of a ROC curve analysis, the parameters of the decision algorithm are chosen to achieve a confidence higher than 80% in the performance of the application under ideal illumination conditions.

Objectives of the project

The main objective of the project is to design and implement a correlation-based static facial verification algorithm, that is, an algorithm that automates the recognition of a person from their face with an accuracy higher than 80%. To achieve that goal, the following specific objectives were set:

  • Determine an image database acquisition protocol that allows the synthesis of a linear filter robust to intensity variations and noise in the image.
  • Develop an image acquisition routine that automates this part of the process.
  • Implement a robust algorithm for filter design and synthesis, for both MACE and HBCOM filters.
  • Implement normalization metrics that allow characterization of the correlation peak and a confidence of 80% in facial verification.

Fundamentals of Correlation Techniques

The basic approach of correlation techniques is as follows:

  1. A set of carefully chosen training images is used to design a correlation filter that embodies the main features of the biometric of interest in a variety of scenarios.

  2. The correlation between the filter and a test image is computed in the frequency domain. The output is known as the correlation plane.

  3. The intensity pattern in the correlation plane is used to make a decision about the recognition of the biometric of interest. If the pattern presents a sharp enough peak, the test image is said to belong to the true class: the biometric is likely to be present in the test image, and the peak location indicates the position of the biometric in the image plane. If, on the other hand, the plane does not present a visible peak, the biometric is likely to be absent from the test image, and it is said to belong to the false class. This decision process is known as biometric matching. A minimal sketch of this recognition stage is shown after the list.
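
The correlation and decision steps above can be illustrated with a short MATLAB sketch. This is only a sketch under the assumptions stated in the comments: `H` is a previously synthesized frequency-domain filter, and the function name, threshold and variable names are illustrative, not the project's actual code.

```matlab
function [is_match, peak_loc] = verify_face(test_img, H, peak_threshold)
% Minimal sketch of the recognition stage (steps 2 and 3 above).
% test_img       - grayscale test image, same size as the filter
% H              - correlation filter in the frequency domain (assumed given)
% peak_threshold - illustrative decision threshold on the correlation peak

T = fft2(double(test_img));                        % test image in frequency space

% Correlation in the frequency domain: multiply by the conjugate filter and
% transform back to the spatial domain to obtain the correlation plane.
corr_plane = fftshift(real(ifft2(T .* conj(H))));

% A sharp, strong peak indicates the true class; its position estimates the
% location of the biometric in the image plane.
[peak_val, peak_idx] = max(corr_plane(:));
[peak_row, peak_col] = ind2sub(size(corr_plane), peak_idx);

is_match = peak_val > peak_threshold;
peak_loc = [peak_row, peak_col];
end
```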

As shown in the figure above, the approach has two major parts: a training stage, in which a carefully designed template is built from an appropriate set of training images, and a recognition stage, in which a decision is made based on the correlation output. Concerning the training stage, the present application deals with linear filters, i.e., filters that build the template as a linear combination of the training images. Furthermore, the biometric of interest is the facial area of a subject, avoiding rotations of any type but allowing arbitrary variations of the facial expression. In the recognition stage, the correlation computation is straightforward, and the decision algorithm is based on the peak-to-sidelobe ratio (PSR) metric, with the decision threshold chosen through a ROC curve analysis.
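
This page does not define the PSR explicitly; a common definition in the correlation-filter literature is PSR = (peak − mean of the sidelobe region) / (standard deviation of the sidelobe region), where the sidelobe region is a window around the peak excluding a small central mask. The following MATLAB sketch assumes that definition; the window and mask sizes are illustrative.

```matlab
function psr = peak_to_sidelobe_ratio(corr_plane, win, mask)
% Sketch of a common PSR definition: (peak - mean(sidelobes)) / std(sidelobes).
% The sidelobe region is a (2*win+1)-square window centred on the peak,
% excluding a (2*mask+1)-square central area. Sizes are illustrative defaults.
if nargin < 2, win = 20; end
if nargin < 3, mask = 5; end

[peak_val, idx] = max(corr_plane(:));
[pr, pc] = ind2sub(size(corr_plane), idx);

% Window around the peak, clipped at the borders of the correlation plane.
r1 = max(pr - win, 1);  r2 = min(pr + win, size(corr_plane, 1));
c1 = max(pc - win, 1);  c2 = min(pc + win, size(corr_plane, 2));
window = corr_plane(r1:r2, c1:c2);

% Exclude the central area around the peak to keep only the sidelobes.
exclude = false(size(window));
exclude(max(pr - mask, r1) - r1 + 1 : min(pr + mask, r2) - r1 + 1, ...
        max(pc - mask, c1) - c1 + 1 : min(pc + mask, c2) - c1 + 1) = true;
sidelobes = window(~exclude);

psr = (peak_val - mean(sidelobes)) / std(sidelobes);
end
```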

Previous work

Historically, correlation filters have been implemented using optical correlators, typically in a joint transform correlator (JTC) configuration or a VanderLugt 4f configuration. The essential principle is the capacity of a lens to form the Fourier transform of an input field at its back focal plane. Usually, the optical setup is complemented with optoelectronic devices capable of storing several correlation filters for biometric recognition, which, when placed at the Fourier plane, are correlated with a query image by means of a second lens.

Figure above taken from New Perspectives in Face Correlation Research: A Tutorial, by Wang et al. The first correlation filter applied to recognition was the matched filter, which consists of the Fourier spectrum of a single training image. However, this filter is quite sensitive to small distortions of the biometric feature of interest. In 1980, linear filters began to appear with the Equal Correlation Peak Synthetic Discriminant Function (ECPSDF) filter. Linear filters are built as a linear superposition of a set of training images with carefully chosen coefficients that improve the recognition performance. Among them, some fix the correlation peak value for the images in the training set; these are called constrained filters. Others leave the peak value free and are called unconstrained filters. In this project the efforts are directed to the implementation of two linear filters for facial recognition: HBCOM filters and MACE filters.
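
For reference, the MACE (Minimum Average Correlation Energy) filter is a constrained linear filter with the well-known closed-form solution h = D^{-1} X (X^+ D^{-1} X)^{-1} u, where the columns of X are the vectorized Fourier transforms of the training images, D is the diagonal matrix of their average power spectrum, and u holds the constrained peak values (usually ones). The MATLAB sketch below implements that formula; it is an illustration, not the project's actual synthesis routine, and the function name and argument layout are assumptions.

```matlab
function H = mace_filter(train_imgs, u)
% Sketch of constrained MACE synthesis: h = D^{-1} X (X^+ D^{-1} X)^{-1} u.
% train_imgs - rows-by-cols-by-N array of grayscale training images (assumed layout)
% u          - N-by-1 vector of desired correlation peak values (defaults to ones)

[rows, cols, N] = size(train_imgs);
if nargin < 2, u = ones(N, 1); end

% X: each column is the vectorized 2-D Fourier transform of a training image.
X = complex(zeros(rows * cols, N));
for k = 1:N
    Fk = fft2(double(train_imgs(:, :, k)));
    X(:, k) = Fk(:);
end

% Diagonal of D: average power spectrum of the training images (kept as a
% vector so the huge diagonal matrix never has to be formed explicitly).
Dvec = mean(abs(X).^2, 2);

DinvX = X ./ Dvec;                 % D^{-1} X (implicit expansion, R2016b+)
h = DinvX * ((X' * DinvX) \ u);    % closed-form constrained solution

H = reshape(h, rows, cols);        % filter in 2-D frequency-domain form
end
```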

Concerning MACE filters, the work of Kumar et al. has shown that they have significant potential for face verification systems. In their computer simulations, a database of 13 subjects was captured, allowing variations of the facial expression. Their investigation suggests that MACE filters may perform as well as other methods, such as the individual eigenface subspace method, using only 3 training images. In addition to this work, that of Omidiora et al. shows the capabilities of MACE filters: even though their demonstration used IR imaging and showed some sensitivity to image noise, the rate of correct matches was above 80%, with a false-positive rate below 10%. In the spirit of those works, this project intends to show the potential of linear filters for facial recognition in a simple application.
