The use of biometrics to authenticate identity has been the subject of great debate for years, with opposing sides arguing its value for security, privacy, and convenience.
On one side, there is an absolute: the demand for infallible security beyond notoriously weak passwords or basic access cards to protect priceless intellectual property and safeguard people and assets.
While many would agree that biometric authentication is the most conclusive way to verify that the person entering a secured area or accessing privileged information is who they claim to be, not all biometric security solutions are equally trustworthy. That is because of what is known as “spoofability,” or their susceptibility to being tricked by various means.
Biometric systems with a high degree of spoofability include fingerprint, image-based facial recognition, and some iris systems. Fingerprints can be duplicated from two-dimensional photos, whether through 3-D printing or replication of the fingerprint with molds. Image-based facial recognition, along with some iris systems, often fails because it relies on simple matching techniques, validating an identity through image comparison alone. This means a high-quality photo, or even an identical twin with matching external features, can spoof the system.
These vulnerabilities have given the biometric security market a black eye and feed the fervor of the other side of the great debate: the far more emotional and ethically charged advocacy of identity privacy. This side points to the tremendous increase in surveillance systems utilizing facial recognition to scan crowds of people, matching photos against a database of suspected criminals, terrorists, and other intelligence targets. This widespread collection of images worries many because of the parallel surge of stolen personally identifiable information (PII), including biometric data. For example, nearly six million sets of fingerprints were stolen during a high-profile hack last year into the systems of the U.S. Office of Personnel Management (OPM), which manages the records of federal employees—including many with security clearances.
The truth is, people’s identity, privacy, and security are at risk every day. The sense of privacy we all enjoyed a decade or so ago has been replaced by the uneasy acknowledgement that we live in an “everything’s out there” world. Former Department of Homeland Security (DHS) Secretary Michael Chertoff recognized this when he recently stated: “the most important asset that we have to protect as individuals and as part of our nations is the control of our identity—who we are, how we identify ourselves, whether other people are permitted to masquerade and pretend to be us, and thereby damage our livelihood, damage our assets, damage our reputation, damage our standing in our community.”
But how can we balance the increased need for security with the fundamental desire for privacy?
While the political and ethical debates rage on about this topic, the answer really is in the technology itself. There is a revolution afoot that leaves behind the antiquated approach of storing PII, and instead uses an innovative method that does not store any of a person’s scanned biometric data.
Near-Infrared (NIR) technology combined with advanced matching algorithms is the groundbreaking alternative to vulnerable image-based facial and iris recognition systems and to spoofable fingerprint systems. Solutions that use advanced NIR-based algorithms scan thousands of points on and beneath a user’s face in seconds to create an encrypted digital reference file—not a recognizable photo—so the information is not visually or otherwise identifiable as the user. This technological advancement presents a new way of thinking about biometric security solutions, one that interlinks security with privacy.
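The enroll-and-match flow described above can be sketched in miniature. This is a toy illustration under stated assumptions, not the actual NIR algorithm: the function names, the four-point “scan,” and the distance threshold are all invented here, and a production system would derive far richer features from thousands of points and encrypt the stored template at rest.

```python
# Toy sketch of template-based biometric matching (an assumption, not a
# vendor's actual method). The stored reference is a numeric template,
# not a photo, so it is not visually identifiable as the user.

def enroll(scan_points):
    """Reduce a scan (here, a short list of measurements) to a template.
    A real system would derive features from thousands of points and
    encrypt the template at rest."""
    return list(scan_points)

def match(template, fresh_scan, threshold=0.05):
    """Accept only if the mean absolute difference is within the threshold."""
    if len(template) != len(fresh_scan):
        return False
    diff = sum(abs(a - b) for a, b in zip(template, fresh_scan)) / len(template)
    return diff <= threshold

reference = enroll([0.12, 0.48, 0.33, 0.91])
print(match(reference, [0.13, 0.47, 0.34, 0.90]))  # sensor noise only -> True
print(match(reference, [0.60, 0.10, 0.80, 0.20]))  # different face -> False
```

The design point is that what sits in the database is a derived measurement vector, which cannot simply be viewed as a face the way a stored photograph can.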
Biometric solutions that do not store personal information but can still conclusively identify a person, all without the vulnerability of being used by someone else to imitate that person, are the safest option from an identity threat standpoint. An advanced NIR-based solution that stores no recognizable PII and is difficult to spoof not only calms concerns over both security and identity privacy, but also addresses another factor that can render a security system ineffective: inconvenience.
When security systems are inconvenient, the solution can actually become a barrier to security. For example, fingerprint readers can be invasive, requiring the user to physically touch devices, which is unappealing in settings such as healthcare, where germ transmission is a concern. Plus, natural body swelling, lotions, and dirt can all interfere with the system’s accuracy, potentially forcing users to repeatedly attempt the process.
A more invasive iris or retina scan may do more than a fingerprint to achieve authentication, but it may not scan properly if the person is more than a few feet away. For an iris or retina to be scanned properly, the person's head must remain completely still, and eyelashes, contact lenses, and anything that causes a reflection can make a scan difficult.
It is human nature to resist complexity. When people begin acting on inconvenience in an access control setting, you can bet security will be compromised, for example, by leaving security doors propped open.
Solutions that use advanced NIR-based algorithms are frictionless (no physical contact is required), fast (thousands of points are scanned in less than a second), and reliable (a false acceptance rate of less than 0.0004 to date), so user interaction becomes intuitive and, in turn, seamless.
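To make the quoted reliability figure concrete, a false acceptance rate is simply the number of impostor attempts the system wrongly accepts divided by the total number of impostor attempts. The counts below are invented for illustration; only the 0.0004 bound comes from the text.

```python
# Toy illustration of how a false acceptance rate (FAR) is computed.
# The attempt counts here are made-up example numbers.

def false_acceptance_rate(false_accepts, impostor_attempts):
    """FAR = wrongful accepts / total impostor attempts."""
    return false_accepts / impostor_attempts

# e.g., 2 wrongful accepts across 10,000 impostor attempts:
rate = false_acceptance_rate(2, 10_000)
print(rate)           # 0.0002
print(rate < 0.0004)  # within the article's quoted bound -> True
```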
Certainly, your identity is your greatest asset. Within 50 years, technology will be in place to document and distinguish the identity of every individual on the planet. The technologists behind that progress bear a tremendous responsibility to balance the very real concerns for security and the vital requirements for privacy. But with today’s advanced technology, biometric security no longer dictates the sacrifice of privacy for the sake of security.