Modern biometric technology began around the 1960s, when scientists started identifying the physiological components of acoustic speech and phonic sounds, laying the groundwork for voice recognition. In 1969, the Federal Bureau of Investigation (FBI) pushed for automated fingerprint identification, which led to the study of minutiae points used to map the unique patterns and ridges of fingerprints.
In the 1990s, biometric science took off as the Department of Defense (DoD), in partnership with the Defense Advanced Research Projects Agency (DARPA), funded face recognition algorithms for commercial markets. Lockheed Martin was also contracted to build an automated fingerprint identification system for the FBI.
In the early 2000s, West Virginia University established the first Bachelor's degree program in Biometric Systems Engineering and Computer Engineering. The International Organization for Standardization (ISO), which promoted international collaboration in biometric research, also helped standardize generic biometric technologies.
This period also saw the emergence of palm print recognition. To address fragmentation and market adoption barriers in biometric technology, the European Biometrics Forum was established. Face recognition was accepted as a biometric authentication method for passports and other Machine-Readable Travel Documents.
United States immigration authorities also adopted biometrics to facilitate visa applications for legitimate travelers, with the dual aim of strengthening security and allowing millions of legal travelers to enter the United States. Biometric data such as fingerprints, DNA swabs, voice samples, and iris images were used to track and identify national security threats.
The shift to smartphones also made its mark on the history of biometric technology when Apple introduced Touch ID on the iPhone 5S in 2013, making biometrics accessible for everyday use. Touch ID remains a key feature on iPhones and other Apple devices, enabling users to unlock their devices and authorize purchases through fingerprint authentication.