Biometric system manufacturers often claim levels of performance that are difficult to achieve in actual operating environments. Possible reasons include tests conducted under controlled environment setups, hardware limitations, and so on.
For example, a voice recognition system may work efficiently only in a quiet environment, a facial recognition system may work well only when lighting conditions are controlled, and candidates can be trained to clean their fingers and place them properly on the fingerprint scanner.
However, in practice, such ideal conditions may not be available in the target operating environment.
The performance of a biometric system is measured chiefly in terms of two error rates: the False Reject Rate (FRR) and the False Accept Rate (FAR).
FRR, also known as Type-I error or False Non-Match Rate (FNMR), is the likelihood that the system rejects a legitimate user.
FAR, also referred to as Type-II error or False Match Rate (FMR), is the likelihood that the system accepts a false identity claim.
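Both rates are commonly computed as simple ratios over the attempts made against the system during testing; in a standard formulation:

FRR = (number of genuine attempts rejected) / (total number of genuine attempts)

FAR = (number of impostor attempts accepted) / (total number of impostor attempts)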
An ideal biometric system would produce a value of zero for both FAR and FRR; that is, it would accept every genuine user and reject every false identity claim. In practice, this is not achievable.
FAR and FRR trade off against each other: tightening the decision threshold lowers the FAR but raises the FRR, and relaxing it does the opposite, as the sketch below illustrates. A system tuned for a low FAR (and hence a high FRR) offers high security; however, if the FRR is too high, genuine users may have to present their live sample several times before being accepted, which makes the system less efficient.
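Because both rates are driven by a single decision threshold on the matcher's similarity score, the trade-off is easy to demonstrate. The following Python sketch sweeps a threshold over two small lists of match scores; the score values are invented purely for illustration and do not come from any real system.

# Minimal sketch of the FAR/FRR trade-off, assuming a score-based matcher
# where a higher score means a closer match. Scores are illustrative only.
genuine_scores = [0.91, 0.85, 0.78, 0.88, 0.60, 0.95, 0.72, 0.83]   # genuine users
impostor_scores = [0.30, 0.45, 0.55, 0.20, 0.65, 0.40, 0.35, 0.50]  # impostors

def far_frr(threshold):
    """Compute (FAR, FRR) for a given accept threshold."""
    # FAR: fraction of impostor attempts scoring at or above the threshold.
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    # FRR: fraction of genuine attempts scoring below the threshold.
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# Sweeping the threshold shows the inverse relationship: raising it
# lowers FAR (better security) but raises FRR (more genuine rejects).
for t in [0.3, 0.5, 0.7, 0.9]:
    far, frr = far_frr(t)
    print(f"threshold={t:.1f}  FAR={far:.2f}  FRR={frr:.2f}")

Running the sweep shows FAR falling toward zero as the threshold rises while FRR climbs, which is exactly the trade-off described above.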
The performance of current biometric technologies is far from this ideal. Hence, system developers need to strike a suitable balance between these two error rates depending on the security requirements of the application; the operating point at which FAR equals FRR, known as the Equal Error Rate (EER), is often quoted as a single summary of this balance.