The selection of the appropriate biometric technology will depend on a number of application-specific factors, including the environment in which the identity verification process is carried out, the user profile, requirements for verification accuracy and throughput, the overall system cost and capabilities, and cultural issues that could affect user acceptance.
The figure below compares different biometric technologies, rating their performance against several metrics.
A key factor in selecting a biometric technology is its accuracy. When the live biometric sample is compared to the stored biometric template, a matching score is produced, and the system confirms or denies the user's identity depending on whether that score clears a decision threshold. System designers set this threshold to achieve the desired level of accuracy, as measured by the false acceptance rate (FAR) and the false rejection rate (FRR). The false acceptance rate is the likelihood that the system will incorrectly identify an individual or accept an impostor; the false rejection rate is the likelihood that the system will reject a legitimate user. Biometric system administrators tune the threshold to balance FAR against FRR in line with the system's security requirements (e.g., a high-security environment would be tuned for a low FAR while tolerating a higher FRR).
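As a rough illustration of this trade-off (not taken from the source article), the following Python sketch uses invented, synthetic score distributions for genuine and impostor comparisons and shows how raising the decision threshold lowers FAR while raising FRR:

```python
# Illustrative sketch only: the score distributions and threshold values below
# are assumptions made for demonstration, not figures from the source article.
import random

random.seed(0)

# Synthetic matching scores in [0, 1]: genuine comparisons tend to score high,
# impostor comparisons tend to score low.
genuine_scores = [min(1.0, max(0.0, random.gauss(0.80, 0.10))) for _ in range(10_000)]
impostor_scores = [min(1.0, max(0.0, random.gauss(0.35, 0.12))) for _ in range(10_000)]

def far(threshold, impostors):
    """False acceptance rate: fraction of impostor scores at or above the threshold."""
    return sum(s >= threshold for s in impostors) / len(impostors)

def frr(threshold, genuines):
    """False rejection rate: fraction of genuine scores below the threshold."""
    return sum(s < threshold for s in genuines) / len(genuines)

# Sweeping the threshold upward reduces FAR (fewer impostors accepted) but
# increases FRR (more legitimate users rejected).
for t in (0.4, 0.5, 0.6, 0.7):
    print(f"threshold={t:.1f}  FAR={far(t, impostor_scores):.4f}  FRR={frr(t, genuine_scores):.4f}")
```

In a high-security deployment, a designer would pick a threshold toward the upper end of such a sweep, accepting the higher false rejection rate as the cost of a low false acceptance rate.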
Source: 'A Practical Guide to Biometric Security Technology', IT Professional, January/February 2001.