combination of classical algorithms

The modifications are as follows:

  • We will traverse the graphs G_T and G_P simultaneously.

    Let us go back to the definition of the intermediate representation of a fingerprint. It is an array containing a “K-plet” for each minutia, and a “K-plet” is, in turn, an ordered list of adjacent vertices. In other words, the graphs being compared are given by ordered adjacency lists (the adjacency-list representation). Because of this, when the DFS algorithm traverses two graphs built from fingerprints of the same finger, it visits vertices representing the same minutia simultaneously.

  • The DFS algorithm will visit only those pairs of vertices for which we have verified that they represent the same minutia on the two compared fingerprints.

Then the comparison algorithm will look like this:

Input data: the graphs G_T and G_P (the intermediate representations of the two compared fingerprints) and the initial pair of vertices ({v^t}_i, {v^p}_j).

Output data: m – the number of pairs of vertices of the graphs G_T and G_P that represent the same minutia on the two prints.

The comparison algorithm itself:

  1. We color all vertices v^t \in G_T and v^p \in G_P white;

  2. We push the initial pair of vertices ({v^t}_i, {v^p}_j) onto the stack and color these vertices gray;

  3. m ← 1

  4. While the stack is not empty, we pop the top pair of vertices ({v^t}_k, {v^p}_l):

    4.1. Compare the two “K-plets” for the vertices {v^t}_k and {v^p}_l; the result of the comparison is a list of pairs of adjacent vertices that represent the same minutia (in other words, that form pairs);

    4.2. From the resulting list of pairs of adjacent vertices, we push onto the stack only those pairs in which both vertices are white; we denote the number of such pairs by q and recolor the vertices of these pairs gray;

    4.3. m ← m + q

    4.4. We color the vertices {v^t}_k and {v^p}_l black;

  5. We use the resulting number of matched pairs m to compute the similarity coefficient.
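To make the control flow easier to follow, here is a minimal C sketch of steps 1–5. The Pair type, the MAX_VERTICES limit and the MatchKPlets() stub are illustrative assumptions only; the real comparison of two “K-plets” is the subject of the next section, and the complete implementation is given at the end of the article.

#include <string.h>

#define MAX_VERTICES 50

enum { WHITE, GRAY, BLACK };                 // DFS colors

typedef struct { int t, p; } Pair;           // a pair of vertices (v^t, v^p)

// placeholder stub: the real comparison of two "K-plets" (an LCS problem)
// is described in the next section; here it simply reports no matched pairs
static int MatchKPlets(int vt, int vp, Pair out[])
{
    (void)vt; (void)vp; (void)out;
    return 0;
}

// steps 1-5 of the comparison algorithm for a given initial pair (vt0, vp0)
int MatchGraphs(int vt0, int vp0)
{
    int  color_t[MAX_VERTICES], color_p[MAX_VERTICES];
    Pair stack[MAX_VERTICES], candidates[MAX_VERTICES];
    int  top = 0, m = 1;                     // step 3: m <- 1

    // step 1: color all vertices of both graphs white (WHITE == 0)
    memset(color_t, WHITE, sizeof(color_t));
    memset(color_p, WHITE, sizeof(color_p));

    // step 2: push the initial pair onto the stack and color it gray
    stack[top].t = vt0;
    stack[top].p = vp0;
    color_t[vt0] = GRAY;
    color_p[vp0] = GRAY;

    // step 4: while the stack is not empty, pop the top pair
    while (top >= 0)
    {
        Pair cur = stack[top--];
        int  n_cand, k, q = 0;

        // 4.1: compare the two "K-plets" of the popped pair
        n_cand = MatchKPlets(cur.t, cur.p, candidates);

        // 4.2: push only the pairs whose vertices are still white,
        //      recolor them gray and count them in q
        for (k = 0; k < n_cand; ++k)
        {
            if (color_t[candidates[k].t] == WHITE &&
                color_p[candidates[k].p] == WHITE)
            {
                stack[++top] = candidates[k];
                color_t[candidates[k].t] = GRAY;
                color_p[candidates[k].p] = GRAY;
                ++q;
            }
        }

        m += q;                              // 4.3: m <- m + q

        color_t[cur.t] = BLACK;              // 4.4: the processed pair
        color_p[cur.p] = BLACK;              //      is colored black
    }

    return m;                                // step 5: m feeds the similarity coefficient
}

The full PerformMatching() function given at the end of the article follows exactly this scheme, with the MatchKPlets() placeholder replaced by the modified LCS procedure described below.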

The accuracy of the algorithm described above depends on how step 4.1 is implemented – comparison of local structures of minutiae (two “K-plets”).

Comparison of local structures of minutiae

So, a “K-plet” is an ordered list of adjacent vertices, and the result of comparing two “K-plets” is a list of pairs of adjacent vertices that form pairs (i.e. represent the same minutia). This means that the problem of comparing two “K-plets” can be reduced to the problem of finding the Longest Common Subsequence (LCS) [7, 9].

The LCS problem is to find the longest common subsequence of two sequences X = {x_0, x_1, ..., x_m} and Y = {y_0, y_1, ..., y_n}. It is solved by dynamic programming.
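For reference, the classical dynamic-programming recurrence from [7], on which “LCS-Length” is built, can be written as follows; here c[i,j] is the length of the longest common subsequence of the first i elements of X and the first j elements of Y, and the weight of every match is simply 1:

c[i,j] = \begin{cases} 0, & \text{if } i = 0 \text{ or } j = 0, \\ c[i-1,j-1] + 1, & \text{if } i,j > 0 \text{ and } x_{i-1} = y_{j-1}, \\ \max(c[i-1,j],\ c[i,j-1]), & \text{if } i,j > 0 \text{ and } x_{i-1} \neq y_{j-1}. \end{cases}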

In this case, the algorithm for solving the LCS problem will consist of two consecutive stages:

  • Stage 1, which, following [7], we will call “LCS-Length”. The result of this stage is the length of the longest common subsequence, together with the data needed to reconstruct that subsequence;

  • Stage 2 (“Print-LCS”), which produces the longest common subsequence itself.

Following [7], we give the pseudocode of “LCS-Length” and “Print-LCS”.

LCS-Length(X,Y)

for i ← 0 to m
    do c[i,0] ← 0
for j ← 0 to n
    do c[0,j] ← 0
for i ← 1 to m
    do for j ← 1 to n
           do if IsEqual(X[i-1], Y[j-1])
                 then c[i,j] ← c[i-1,j-1] + MatchingWeight(X[i-1], Y[j-1])
                      b[i,j] ← "↖"
                 else if c[i-1,j] ≥ c[i,j-1]
                         then c[i,j] ← c[i-1,j]
                              b[i,j] ← "↑"
                         else c[i,j] ← c[i,j-1]
                              b[i,j] ← "←"
return c,b

Here c[0..m,0..n] is a two-dimensional array storing the cost of the longest common subsequence built so far, and b[1..m,1..n] is a two-dimensional array recording the “origin” of c[i,j]; this information is needed later to reconstruct the longest common subsequence:

Print-LCS(b, X, Y)

l ← {}
i ← m
j ← n
while i > 0 and j > 0
      do if b[i,j] = "↖"
            then l ← l + {(X[i-1],Y[j-1])}
                 i ← i-1
                 j ← j-1
            elseif b[i,j] = "↑"
                   then i ← i-1
            else j ← j-1
return l

Here l is the list of pairs of adjacent vertices that formed pairs.

If the compared sequences X and Y were ordinary strings, then the “IsEqual(x,y)” function called on line 7 of “LCS-Length” would turn into the simple equality check x = y, and the “MatchingWeight(x,y)” function called on line 8 would always return 1. In that case the code of “LCS-Length” would coincide exactly with the code given in [7].
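To illustrate this degenerate case, here is a minimal, self-contained C sketch for ordinary strings; the function name, the MAX_LEN limit and the example strings are assumptions made purely for illustration:

#include <stdio.h>
#include <string.h>

#define MAX_LEN 64    // illustrative limit on the string length

// LCS length for two ordinary strings: IsEqual() degenerates into x = y
// and MatchingWeight() always returns 1
int LcsLength(const char *X, const char *Y)
{
    int c[MAX_LEN + 1][MAX_LEN + 1];
    int m = (int)strlen(X), n = (int)strlen(Y);
    int i, j;

    for (i = 0; i <= m; ++i) c[i][0] = 0;
    for (j = 0; j <= n; ++j) c[0][j] = 0;

    for (i = 1; i <= m; ++i)
        for (j = 1; j <= n; ++j)
            if (X[i - 1] == Y[j - 1])
                c[i][j] = c[i - 1][j - 1] + 1;
            else
                c[i][j] = c[i - 1][j] >= c[i][j - 1] ? c[i - 1][j] : c[i][j - 1];

    return c[m][n];
}

int main(void)
{
    // the classical example from [7]: the LCS of these strings is "BCBA",
    // so the program prints 4
    printf("%d\n", LcsLength("ABCBDAB", "BDCABA"));
    return 0;
}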

In our case, we compare two “K-plets”, where each “symbol” is a triple of numbers (the Euclidean distance from the central minutia; the angle formed by the line passing through the current minutia and the central minutia; and the direction of the current minutia relative to the direction of the central minutia).

Then the “IsEqual(x,y)” function will turn into a check of the following conditions:

\Delta_r = |r_x - r_y|, \quad \Delta_{\alpha} = \min(|\alpha_x - \alpha_y|,\ 360^{\circ} - |\alpha_x - \alpha_y|), \quad \Delta_{\theta} = \min(|\theta_x - \theta_y|,\ 360^{\circ} - |\theta_x - \theta_y|),

\begin{cases} \Delta_r \leq r_{thr}, \\ \Delta_{\alpha} \leq \alpha_{thr}, \\ \Delta_{\theta} \leq \theta_{thr}. \end{cases}

Here x = (r_x, \alpha_x, \theta_x) and y = (r_y, \alpha_y, \theta_y) are the two compared minutiae. The threshold values r_{thr}, \alpha_{thr} and \theta_{thr} are selected from the training sample and in fact define the tolerance box mentioned in the section devoted to the statement of the fingerprint comparison problem.
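A minimal C sketch of such a check might look as follows; the KPLET_SYMBOL type, the function signature and the way the thresholds are passed in are assumptions (in the full implementation given below the same comparison is performed inline using the STAR_DIST_THR, STAR_ALPHA_THR and STAR_THETA_THR constants):

#include <stdlib.h>    // abs()

// one "symbol" of a K-plet: distance, angle and direction
// relative to the central minutia (angles in degrees)
typedef struct { int r, alpha, theta; } KPLET_SYMBOL;

// returns 1 if x and y fall into the same tolerance box
int IsEqual(KPLET_SYMBOL x, KPLET_SYMBOL y,
            int r_thr, int alpha_thr, int theta_thr)
{
    int d_r     = abs(x.r - y.r);
    int d_alpha = abs(x.alpha - y.alpha);
    int d_theta = abs(x.theta - y.theta);

    // the angular differences are cyclic and cannot exceed 180 degrees
    if (d_alpha > 180) d_alpha = 360 - d_alpha;
    if (d_theta > 180) d_theta = 360 - d_theta;

    return d_r <= r_thr && d_alpha <= alpha_thr && d_theta <= theta_thr;
}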

The “MatchingWeight(x,y)” function should reflect how closely the compared minutiae x and y coincide. A natural solution is to compute the weight W returned by “MatchingWeight(x,y)” as follows:

W = a*\Delta_r + b*\Delta_{\alpha} + c*\Delta_{\theta} + d,

where the coefficients a, b, c and d are also chosen empirically from the training sample.
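A matching sketch of “MatchingWeight”, reusing the illustrative KPLET_SYMBOL type from the previous snippet. In the full implementation given below the same idea appears as TRUE_WEIGHT - \Delta_r/STAR_DIST_COEFF - \Delta_{\alpha}/STAR_ALPHA_COEFF - \Delta_{\theta}/STAR_THETA_COEFF, which is this formula with d = TRUE_WEIGHT and negative coefficients a, b, c:

#include <stdlib.h>    // abs()

// illustrative coefficients matching the constants of the full implementation:
// d = TRUE_WEIGHT = 16, a = -1/STAR_DIST_COEFF, b = -1/STAR_ALPHA_COEFF,
// c = -1/STAR_THETA_COEFF
int MatchingWeight(KPLET_SYMBOL x, KPLET_SYMBOL y)
{
    int d_r     = abs(x.r - y.r);
    int d_alpha = abs(x.alpha - y.alpha);
    int d_theta = abs(x.theta - y.theta);

    if (d_alpha > 180) d_alpha = 360 - d_alpha;
    if (d_theta > 180) d_theta = 360 - d_theta;

    // the closer the match, the larger the weight; a perfect match yields d
    return 16 - d_r / 2 - d_alpha / 4 - d_theta / 6;
}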

Before moving on, we need to make one more change to the “LCS-Length” algorithm: the longest common subsequence must not contain pairs of minutiae that cannot form pairs.

Let us highlight two such conditions:

  • only white vertices (minutiae) can form a pair. This condition becomes clear if we return to the description of the comparison algorithm in the previous section: in step 4.2 we recolor all vertices that formed pairs as a result of comparing the two “K-plets”;

  • a pair cannot be formed by vertices (minutiae) for which the “IsEqual” function returns False.

If at least one of these conditions is violated, we write the penalty value FALSE_WEIGHT into c[i,j]; it is equal to the maximum possible value of W taken with a minus sign. For the formula for W given above, FALSE_WEIGHT = -d.

Then the modified “LCS-Length” function (let's call it “LCS-Length-Mod”) will look like this:

LCS-Length-Mod(X,Y)

for i ← 0 to m
    do c[i,0] ← 0
for j ← 0 to n
    do c[0,j] ← 0
for i ← 1 to m
    do for j ← 1 to n
           do if "X[i-1] белая" and "Y[i-1] белая" and IsEqual(X[i-1], Y[j-1])
                 then c[i,j] ← c[i-1,j-1] + MatchingWeight(X[i-1], Y[j-1])
              else c[i,j] ← c[i-1,j-1] + FALSE_WEIGHT

              if c[i,j] > c[i-1,j] and c[i,j] > c[i,j-1]
                 then b[i,j] ← "↖"
              elseif c[i-1,j] > c[i,j-1] and c[i-1,j] > c[i,j]
                     then c[i,j] ← c[i-1,j]
                          b[i,j] ← "↑"
              else c[i,j] ← c[i,j-1]
                   b[i,j] ← "←"
return c,b

Let's return to the description of the comparison algorithm in the previous section. In step 4.1, we successively call the functions “LCS-Length-Mod” and “Print-LCS” to obtain a list of pairs of adjacent vertices as a result of comparing two “K-plets”. And in step 4.2, we need to go through the obtained list of pairs of adjacent vertices in order to put the found pairs into a stack for subsequent processing, and also count the number of such pairs for subsequent determination of the similarity coefficient for the compared fingerprints.

To eliminate the need to re-traverse the list of adjacent vertex pairs in step 4.2, we can perform the actions of step 4.2 while constructing the list of adjacent vertex pairs in the Print-LCS function. The modified Print-LCS function (let's call it Print-LCS-Mod):

Print-LCS-Mod(b, X, Y)

q ← 0
i ← m
j ← n
while i > 0 and j > 0
      do if b[i,j] = "↖"
            then if "X[i-1] белая" and "Y[i-1] белая"
                    then "помещаем в стек пару (X[i-1],Y[j-1])"
                         "перекрашиваем вершины X[i-1] и Y[j-1] в серый цвет"
                         q ← q+1
                 i ← i-1
                 j ← j-1
         elseif b[i,j] = "↑"
                then i ← i-1
         else j ← j-1
return q

According to the calculations given in [5], the complexity of the resulting algorithm is O(n), where n is the number of minutiae.

The operation of the resulting comparison algorithm on our example is shown in Figure 9.

Figure 9. An example of the comparison algorithm.

The question of choosing the initial pair of minutiae needed to start the comparison algorithm described above has so far been left aside. Since we initially know nothing about which vertices of the compared graphs represent the same minutiae (or, taking the problem more broadly, whether the compared fingerprints belong to the same finger at all), we can choose one of the following strategies:

  • We consider all possible pairs of vertices of the compared graphs as candidate initial pairs. In this case the comparison algorithm described above is run m \times n times, and the maximum of the resulting similarity coefficients is taken as the final similarity coefficient. The complexity of such an algorithm is therefore O(n^3).

  • One of the conditions for successful fingerprint comparison is a sufficient overlap area: the compared fingerprints must contain the same region of the finger, large enough for a reliable comparison. Since we compare fingerprints by the minutiae extracted from them, this condition turns into a requirement to find a sufficient number of minutiae on the prints. For the result of the comparison to be considered reliable, at least 12 pairs of matching minutiae must be found on the compared prints. This rule is called the “12-point guideline” [1].

    If the prints belong to the same finger and have a common overlap area, then sorting the list of minutiae of each print in ascending order of distance from the center of gravity (steps 1 and 2 of the algorithm for forming the intermediate representation of the print) will place the minutiae present on both prints near the beginning of the lists. We can then take the first c minutiae from each of the two lists and consider them as candidate initial pairs (i.e. run the comparison algorithm c \times c times). The complexity of such an algorithm is O(c^2 n) = O(n). In the implementation given below, this is exactly the role of the SEARCH_SPACE_DIM constant, which limits the choice of initial pairs to the first 15 minutiae of each list.

As for the recognition accuracy achieved by the described algorithm, curious readers can find this information directly in the article [5].

Practical implementation of the fingerprint comparison algorithm

Since in the previous note the presence of pseudocode led to the author being accused of not actually having implemented anything, below are fragments of the code implementing the described comparison algorithm. Let me say right away that the code is written in plain old C and could certainly be written more optimally in every respect.

First, let's introduce the following data structures:

#include <stdint.h>    // fixed-width integer types
#include <stdlib.h>    // calloc(), abs()
#include <string.h>    // memset()

#define MAX_RAY_AMOUNT    8

typedef struct _ray_element
{
    int32_t neighbour_minutiae;   // index of the adjacent minutia
    int32_t rel_dist;             // distance to the central minutia
    int32_t rel_angle;            // angle relative to the central minutia
    int32_t rel_theta;            // direction relative to the direction of the central minutia
} RAY_ELEMENT, *P_RAY_ELEMENT;

typedef struct _minutiae_element
{
    uint32_t ray_amount;                 // number of adjacent minutiae in the "K-plet"
    RAY_ELEMENT rays[MAX_RAY_AMOUNT];    // the "K-plet" itself
} MINUTIA_ELEMENT, *P_MINUTIA_ELEMENT;

typedef struct _fp_feature_vector
{
    uint32_t minutiae_amount;            // number of minutiae on the print
    MINUTIA_ELEMENT minutiae_data[1];    // variable-length array of "K-plets"
} FP_FEATURE_VECTOR, *P_FP_FEATURE_VECTOR;

Then we can allocate memory for the intermediate representation of a fingerprint on which minutiae_amount minutiae have been detected:

P_FP_FEATURE_VECTOR p_feature_vector =
      (P_FP_FEATURE_VECTOR)calloc(sizeof(FP_FEATURE_VECTOR) + 
                                  (minutiae_amount-1) * sizeof(MINUTIA_ELEMENT), 
                                  sizeof(uint8_t));
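A purely hypothetical example of filling one entry of the allocated structure (all concrete numbers below are made up for illustration only):

if (p_feature_vector != NULL)
{
    p_feature_vector->minutiae_amount = minutiae_amount;

    // hypothetical values: minutia #3 has a single neighbour, minutia #7,
    // located 52 pixels away at an angle of 130 degrees, whose direction
    // differs from the direction of the central minutia by 40 degrees
    P_MINUTIA_ELEMENT p_minutia = &p_feature_vector->minutiae_data[3];
    p_minutia->ray_amount = 1;
    p_minutia->rays[0].neighbour_minutiae = 7;
    p_minutia->rays[0].rel_dist  = 52;
    p_minutia->rays[0].rel_angle = 130;
    p_minutia->rays[0].rel_theta = 40;
}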

The following arrays will be required for the comparison algorithm to work:

#define MAX_STAR_MINUTIAE_AMOUNT    50
#define LCS_MAX_RAY_SIZE            (MAX_RAY_AMOUNT+1)
#define LCS_MAX_RAY_AMOUNT          (LCS_MAX_RAY_SIZE*LCS_MAX_RAY_SIZE)

int32_t LeftColorArray[MAX_STAR_MINUTIAE_AMOUNT];   // DFS colors of the template-print vertices
int32_t RightColorArray[MAX_STAR_MINUTIAE_AMOUNT];  // DFS colors of the input-print vertices
int32_t LeftStack[MAX_STAR_MINUTIAE_AMOUNT];        // the stack of vertex pairs:
int32_t RightStack[MAX_STAR_MINUTIAE_AMOUNT];       // LeftStack[i], RightStack[i] form one pair

int32_t CostArray[LCS_MAX_RAY_AMOUNT];              // the c[] array of the LCS algorithm
int32_t DirArray[LCS_MAX_RAY_AMOUNT];               // the b[] array ("origin" of c[i,j])

Here we assume that the maximum number of minutiae identified on the print cannot exceed 50.
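Since this is only an assumption, a defensive check at the very beginning of the PerformMatching() function shown below would not hurt; the rejection strategy here is a suggestion rather than part of the original algorithm:

// reject templates that exceed the assumed limit, so that the color arrays
// and the stacks declared above are never indexed out of bounds
if (pLeftFV->minutiae_amount  > MAX_STAR_MINUTIAE_AMOUNT ||
    pRightFV->minutiae_amount > MAX_STAR_MINUTIAE_AMOUNT)
    return 0.0;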

The function that implements the comparison algorithm:

// matching parameters
#define MIN_MINUTIAE_PAIR_THR 12

#define TRUE_WEIGHT       16
#define FALSE_WEIGHT     -16
   
#define STAR_DIST_THR    12
#define STAR_ALPHA_THR   20
#define STAR_THETA_THR   30

#define STAR_DIST_COEFF    2
#define STAR_ALPHA_COEFF   4
#define STAR_THETA_COEFF   6

// constants of DFS algorithm
#define WHITE_COLOR      0
#define GRAY_COLOR       1
#define BLACK_COLOR      2

#define STACK_EMPTY      (-1)

// for LCS algorithm
#define LEFT_TOP_DIR     0
#define LEFT_DIR         1
#define UP_DIR           2

// for search space dimension
#define SEARCH_SPACE_DIM    15

double PerformMatching(P_MATCHER_HANDLE pHandle, P_FP_FEATURE_VECTOR pLeftFV, P_FP_FEATURE_VECTOR pRightFV)
{
    int top_of_stack;
    int index1, index2;
    int index3, index4;
    int max_score, cur_score;
    int cur_template_root_minutiae,
        cur_input_root_minutiae;
    P_MINUTIA_ELEMENT pTemplateMinutiae = NULL;
    P_MINUTIA_ELEMENT pInputMinutiae    = NULL;

    // LCS algorithm variables
    int left_cost, up_cost, left_up_cost;
    int delta_dist;
    int delta_alpha;
    int delta_theta;

    // full search
    max_score = 0;

    memset(CostArray, 0, LCS_MAX_RAY_AMOUNT*sizeof(int32_t));

    for (index1 = 0; index1 < (pLeftFV->minutiae_amount < SEARCH_SPACE_DIM ? pLeftFV->minutiae_amount : SEARCH_SPACE_DIM); ++index1)
    {
        for (index2 = 0; index2 < (pRightFV->minutiae_amount < SEARCH_SPACE_DIM ? pRightFV->minutiae_amount : SEARCH_SPACE_DIM); ++index2)
        {
            // DFS algorithm
            memset((void*)LeftColorArray,  WHITE_COLOR, MAX_STAR_MINUTIAE_AMOUNT*sizeof(int32_t));
            memset((void*)RightColorArray, WHITE_COLOR, MAX_STAR_MINUTIAE_AMOUNT*sizeof(int32_t));

            LeftColorArray[index1]  = GRAY_COLOR;
            RightColorArray[index2] = GRAY_COLOR;

            top_of_stack = 0;

            LeftStack[top_of_stack] = index1;
            RightStack[top_of_stack]    = index2;

            cur_score = 1;

            while (STACK_EMPTY < top_of_stack)
            {
                cur_template_root_minutiae = LeftStack[top_of_stack];
                cur_input_root_minutiae    = RightStack[top_of_stack];
                --top_of_stack;

                // LCS algorithm
                pTemplateMinutiae = &pLeftFV->minutiae_data[cur_template_root_minutiae];
                pInputMinutiae    = &pRightFV->minutiae_data[cur_input_root_minutiae];

                // LCS-LENGTH
                for (index3 = 1; index3 < pTemplateMinutiae ->ray_amount + 1; ++index3)
                {
                    for (index4 = 1; index4 < pInputMinutiae ->ray_amount + 1; ++index4)
                    {
                        up_cost   = CostArray[LCS_MAX_RAY_SIZE*(index3 - 1) + index4];
                        left_cost = CostArray[LCS_MAX_RAY_SIZE*index3 + index4 - 1];

                        if ( LeftColorArray[pTemplateMinutiae->rays[index3 - 1].neighbour_minutiae] != WHITE_COLOR ||
                             RightColorArray[pInputMinutiae->rays[index4 - 1].neighbour_minutiae]   != WHITE_COLOR )
                            left_up_cost = CostArray[LCS_MAX_RAY_SIZE*(index3 - 1) + index4 - 1] + FALSE_WEIGHT;
                        else
                        {
                            delta_dist  = abs(pTemplateMinutiae->rays[index3 - 1].rel_dist   - pInputMinutiae->rays[index4 - 1].rel_dist);
                            delta_alpha =  abs(pTemplateMinutiae->rays[index3 - 1].rel_angle - pInputMinutiae->rays[index4 - 1].rel_angle);
                            delta_alpha = delta_alpha > 180 ? 360 - delta_alpha : delta_alpha;
                            delta_theta = abs(pTemplateMinutiae->rays[index3 - 1].rel_theta  - pInputMinutiae->rays[index4 - 1].rel_theta );
                            delta_theta = delta_theta > 180 ? 360 - delta_theta : delta_theta;

                            if ( delta_dist <= STAR_DIST_THR && delta_alpha <= STAR_ALPHA_THR && delta_theta <= STAR_THETA_THR )
                                left_up_cost = CostArray[LCS_MAX_RAY_SIZE*(index3 - 1) + index4 - 1]   +
                                               TRUE_WEIGHT - delta_dist/STAR_DIST_COEFF - delta_alpha/STAR_ALPHA_COEFF - delta_theta/STAR_THETA_COEFF;
                            else
                                left_up_cost = CostArray[LCS_MAX_RAY_SIZE*(index3 - 1) + index4 - 1] + FALSE_WEIGHT;
                        }

                        if ( left_up_cost > up_cost && left_up_cost > left_cost )
                        {
                            CostArray[LCS_MAX_RAY_SIZE*index3 + index4] = left_up_cost;
                            DirArray[LCS_MAX_RAY_SIZE*index3 + index4]  = LEFT_TOP_DIR;
                        }
                        else
                        {
                            if ( up_cost > left_cost && up_cost > left_up_cost )
                            {
                                CostArray[LCS_MAX_RAY_SIZE*index3 + index4] = up_cost;
                                DirArray[LCS_MAX_RAY_SIZE*index3 + index4]  = UP_DIR;
                            }
                            else
                            {
                                CostArray[LCS_MAX_RAY_SIZE*index3 + index4] = left_cost;
                                DirArray[LCS_MAX_RAY_SIZE*index3 + index4]  = LEFT_DIR;
                            }
                         }
                     }
                 }
 
                 // PRINT-LCS
                 --index3;
                 --index4;

                 while ( index3 > 0 && index4 > 0)
                 {
                     if ( LEFT_TOP_DIR == DirArray[LCS_MAX_RAY_SIZE*index3 + index4] )
                     {
                         // DFS algorithm part II
                         if (WHITE_COLOR == LeftColorArray[pTemplateMinutiae->rays[index3 - 1].neighbour_minutiae] &&
                             WHITE_COLOR == RightColorArray[pInputMinutiae->rays[index4 - 1].neighbour_minutiae] )
                         {
                             ++top_of_stack;
                             LeftStack[top_of_stack] = pTemplateMinutiae->rays[index3 - 1].neighbour_minutiae;
                             RightStack[top_of_stack]    = pInputMinutiae->rays[index4 - 1].neighbour_minutiae;
                             ++cur_score;
                         }

                         LeftColorArray[pTemplateMinutiae ->rays[index3 - 1].neighbour_minutiae] = GRAY_COLOR;
                         RightColorArray[pInputMinutiae ->rays[index4 - 1].neighbour_minutiae]   = GRAY_COLOR;

                         --index3;
                         --index4;
                         continue;
                     }

                     if ( UP_DIR == DirArray[LCS_MAX_RAY_SIZE*index3 + index4] )
                     {
                         --index3;
                         continue;
                     }

                     if ( LEFT_DIR == DirArray[LCS_MAX_RAY_SIZE*index3 + index4] )
                     {
                         --index4;
                         continue;
                     }
                 }

                 // DFS algorithm part III
                 LeftColorArray[cur_template_root_minutiae] = BLACK_COLOR;
                 RightColorArray[cur_input_root_minutiae]   = BLACK_COLOR;
            }
            // end of DFS algorithm

            if ( cur_score > max_score )
                max_score = cur_score;
        }
    }

    // calculate matching score
    if ( max_score > MIN_MINUTIAE_PAIR_THR )
        return (double)(max_score*max_score)/(pLeftFV->minutiae_amount*pRightFV->minutiae_amount);
    else
        return 0.0;
}
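Finally, a sketch of how the function might be called; pTemplateFV and pInputFV are assumed to have been produced earlier by the feature-extraction stage, and since the matcher handle is not used in the fragment shown above, NULL is passed for it:

#include <stdio.h>

void CompareTwoPrints(P_FP_FEATURE_VECTOR pTemplateFV, P_FP_FEATURE_VECTOR pInputFV)
{
    double score = PerformMatching(NULL, pTemplateFV, pInputFV);

    // a zero score means that no more than MIN_MINUTIAE_PAIR_THR pairs were matched
    if (score > 0.0)
        printf("prints match, similarity coefficient = %f\n", score);
    else
        printf("prints do not match\n");
}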

Literature

1. D. Maio, D. Maltoni, A. K. Jain, and J. Feng. Handbook of Fingerprint Recognition. Springer Verlag, 2020.

2. Fingerprint Verification Competition. http://bias.csr.unibo.it/fvc2000/download.asp; http://bias.csr.unibo.it/fvc2002/download.asp

3. K. Ko, W. J. Salamon. NIST Biometric Image Software (NBIS). https://www.nist.gov/services-resources/software/nist-biometric-image-software-nbis – 2015.

4. https://nigos.nist.gov/nist/nbis/nbis_v5_0_0.zip

5. S. Chikkerur, A. N. Cartwright, and V. Govindaraju. K-plet and Coupled BFS: A Graph Based Fingerprint Representation and Matching Algorithm. In: Advances in Biometrics. ICB 2006. Lecture Notes in Computer Science, vol. 3832. Springer, Berlin, Heidelberg, 2005. https://doi.org/10.1007/11608288_42

6. Skiena, S. S. The Algorithm Design Manual, 3rd ed. (Russian translation). St. Petersburg: BHV-Petersburg, 2022.

7. Cormen, T., Leiserson, C., Rivest, R. Introduction to Algorithms (Russian edition: Algorithms: Construction and Analysis). Moscow: MCNMO, 2000.

8. Implementing a Depth-First Search Algorithm. Translated by Ksenia Moseenkova (kmoseenk). OTUS company blog. https://habr.com/ru/companies/otus/articles/660725/ – 2022.

9. Finding the maximum common subsequence. Author: hamst. https://habr.com/ru/articles/142825/ – 2012.
