Fingerprint Verification Competition

The Fingerprint Verification Competition (FVC) is an international competition dedicated to assessing fingerprint verification software. A subset of fingerprint impressions acquired with various sensors was provided to registered participants, allowing them to tune the parameters of their algorithms. Participants were requested to provide enrollment and matching executables of their algorithms; the evaluation was conducted at the organizers' facilities by running the submitted executables on a sequestered database acquired with the same sensors as the training set.

The organizers of FVC are:

Each participant may submit at most one algorithm to each of the two categories, open and light.

The first, second and third international competitions on fingerprint verification (FVC2000, FVC2002 and FVC2004) were organized in 2000, 2002 and 2004, respectively. These events received great attention from both the academic and industrial biometric communities. They established a common benchmark, allowing developers to compare their algorithms unambiguously, and provided an overview of the state of the art in fingerprint recognition. Judging by the response of the biometrics community, FVC2000, FVC2002 and FVC2004 were undoubtedly successful initiatives. The interest shown in the previous editions by the biometrics research community prompted the organizers to schedule a new competition for the year 2006.

In 2006 there were:

Aim

Categories

Databases

One of the most important and time-consuming tasks of any biometric system evaluation is data collection. The organizers created a multi-database containing four disjoint fingerprint databases, each collected with a different sensor technology.

  • Subsets DB1-A, DB2-A, DB3-A and DB4-A, which contain the first 140 fingers (1,680 images) of DB1, DB2, DB3 and DB4, respectively, are used for the algorithm performance evaluation.
  • Subsets DB1-B, DB2-B, DB3-B and DB4-B, containing the last 10 fingers (120 images) of DB1, DB2, DB3 and DB4, respectively, are made available to the participants as a development set, to allow parameter tuning before submission.

Performance evaluation

For each database and for each algorithm:

  • Genuine attempts: each of the 12 impressions of a finger is matched against the remaining impressions of the same finger, skipping symmetric matches:

     ((12 × 11) / 2) × 140 = 9,240

  • Impostor attempts: the first impression of each finger is matched against the first impression of every other finger, again skipping symmetric matches:

     (140 × 139) / 2 = 9,730
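The two attempt counts above follow directly from the figures in the text (140 evaluation fingers, 12 impressions per finger); a minimal sketch of the arithmetic:

```python
# Attempt counts in the FVC evaluation protocol (figures taken from the
# text: 140 fingers per database, 12 impressions per finger).

FINGERS = 140
IMPRESSIONS = 12

# Genuine attempts: every unordered pair of impressions of the same
# finger, summed over all fingers.
genuine_attempts = (IMPRESSIONS * (IMPRESSIONS - 1) // 2) * FINGERS

# Impostor attempts: every unordered pair of distinct fingers, using the
# first impression of each.
impostor_attempts = FINGERS * (FINGERS - 1) // 2

print(genuine_attempts)   # 9240
print(impostor_attempts)  # 9730
```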

Although it is possible to reject images at enrollment, this is strongly discouraged. In fact, in FVC2006, as in FVC2004 and FVC2002, rejection at enrollment is fused with the other error rates for the final ranking; in particular, each rejection at enrollment produces a "ghost" template that scores zero against all the remaining fingerprints.
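The ghost-template rule can be sketched as follows. This is only an illustration of the scoring convention described above, with hypothetical names; the actual FVC matchers are submitted executables, not Python functions:

```python
# Sketch of the "ghost template" scoring rule. GHOST and dummy_match are
# hypothetical stand-ins, not part of the real FVC protocol code.

GHOST = None  # represents a template whose enrollment was rejected

def dummy_match(template, probe):
    """Toy stand-in for a real fingerprint matcher (similarity score)."""
    return 1.0 if template == probe else 0.2

def score(template, probe):
    # A rejected enrollment yields a ghost template that scores zero
    # against every probe, so the rejection is fused into the error rates
    # instead of being excluded from the evaluation.
    if template is GHOST:
        return 0.0
    return dummy_match(template, probe)

print(score("finger-7", "finger-7"))  # 1.0
print(score(GHOST, "finger-7"))       # 0.0
```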

For each algorithm and for each database, the following performance indicators are reported:

The following average performance indicators are reported over the four databases:

Participants

See also

External links

This article is issued from Wikipedia, version of 8/4/2015. The text is available under the Creative Commons Attribution-ShareAlike licence; additional terms may apply for the media files.