Biometrics and Derived Credentials


This is Part 4 of a series discussing the public comments on Draft NIST SP 800-157, Guidelines for Derived Personal Identity Verification (PIV) Credentials, and the final version of the publication. Links to all the posts in the series can be found here.

As reviewed in
Part 3, a PIV card
carries two fingerprint templates for off-card comparison, and may
also carry one or two additional fingerprint templates for on-card
comparison, one or two iris images, and an electronic facial image.
These biometrics may be used in a variety of ways, by themselves or in
combination with cryptographic credentials, for authentication to a
Physical Access Control System (PACS) or a local workstation. The
fingerprint templates for on-card comparison can also be used to
activate private keys used for authentication, email signing, and
email decryption.

By contrast, neither the draft version nor the final version of SP 800-157 considers the use of any biometrics analogous to those carried in a PIV card for activation or authentication.
Actually, they “implicitly forbid” the storage of such biometrics by
the Derived PIV Application that manages the Derived PIV
Credential, according to NIST’s response to comment 30 by Precise
Biometrics.

But several comments requested or suggested the use of biometrics by
the Derived PIV Application. In this post I review those comments,
and other comments expressing concern for biometric privacy. Then I
draw attention to privacy-preserving biometric techniques that should
be considered for possible use in activating derived credentials.

Comments requesting the use of biometrics

Comments 13 by Oberthur Technologies, 233 by Apple and 397 by
CertiPath requested the option of using biometrics for private key
activation. NIST responded that additional activation methods will be
considered in the next version of the document.
Comments 27 and 29 by Precise Biometrics suggested the use of
biometrics for remote issuance of derived credentials, and for remote
reset of the activation PIN or password.

Comment 26 by Precise Biometrics and NIST’s response are hard to
understand, but deserve to be explained.

The comment refers to FIPS 201-2, which “specifies different authentication mechanisms that can be used to fulfill LOA 4” and “can be used together as multiple authentication factors to achieve even higher authentication confidence at LOA 4”. This no doubt refers to Table 6-1, Table 6-2, Footnote 35, and Table 6-3 of FIPS 201-2. Table 6-1 renames the four levels of assurance (LOA) of Section 2.1 of OMB memorandum M-04-04. Table 6-2 allows the use of the authentication mechanisms called BIO-A, OCC-AUTH and PKI-AUTH, which I explained in Part 3, for physical access control at LOA 4. Footnote 35 refers to SP 800-116, which, as I also explained in Part 3, discusses combinations of biometric and cryptographic authentication mechanisms for authentication to a PACS. Table 6-3 is concerned with authentication for “logical access”, i.e. with authentication to obtain access to an information system rather than to a physical facility; it allows the use of BIO-A, OCC-AUTH and PKI-AUTH for authentication to a local workstation, i.e. a computer used by the cardholder, but not to a remote server. In the “Suggested Change” column, Precise Biometrics seems to be suggesting that, although SP 800-157 is not concerned with the biometric authentication mechanisms, nor with combinations of authentication mechanisms, it should leave the door open for other documents to do so.

NIST’s response declined the suggested change, arguing that SP 800-157 is aligned with FIPS 201-2 and SP 800-63. SP 800-63, the Electronic Authentication Guideline, is concerned with “remote authentication of users (such as employees, contractors, or private individuals) interacting with government IT systems over open networks”.

Regarding FIPS 201-2, NIST’s response says that Table 6-3 allows only PKI-AUTH (authentication with the PIV Authentication private key and certificate) for “remote access control”, i.e. for logical access to a remote information system. It is left unsaid that SP 800-157 assigns derived credentials an extremely narrow scope, ruling out their use for authentication to a local workstation or physical access control. As reviewed in Part 1, this narrow scope was protested in many comments.

SP 800-63 is quoted as saying that biometric authentication uses information that is “private rather than secret”, which according to SP 800-63 makes it unsuitable for remote access control. But the reason why it is unsuitable for remote access control yet suitable for local authentication is that

In the local authentication case, where the Claimant is observed by an
attendant and uses a capture device controlled by the Verifier,
authentication does not require that biometrics be kept secret.

(SP 800-63, page 2, lines 2-5). So the real issue is not whether
authentication is local or remote, but whether it is attended or
unattended. As I explained in
Part 3, in
agreement with the above quote, when there is no assurance of
liveness, security depends on the relative secrecy of the biometric,
and more precisely on whether an adversary has access to a biometric
sample. If biometric authentication is deemed unsuitable for remote
authentication because it is unattended, it should also be deemed unsuitable
for the unattended biometric authentication methods BIO and OCC-AUTH
of FIPS 201-2 that I described in Part 3, and maybe even for
credential activation, which is also unattended.

I actually agree that biometric authentication, if used by itself rather than as a means of activating a cryptographic authentication credential, is unsuitable for remote access control, but for a different reason. Such authentication would require the person who wishes to authenticate to surrender his or her biometric privacy by submitting a biometric sample to the remote system, exposing it to abuse or compromise. Possible consequences of biometric abuse or compromise were briefly discussed in Part 3.

Comments expressing concern about biometric privacy

Biometric authentication is too often used in the private sector with
complete disregard for the privacy implications that I discussed in
Part 3. So I was
happy to see that some comments expressed concern for biometric
privacy.

SP 800-157 does not call for storing any biometrics in a mobile device
that carries derived credentials, but requires the use of a biometric
sample taken from the user for in-person credential issuance at LOA 4,
and for in-person reset of the activation PIN or password. The draft version required the sample used at issuance to be retained for future reference and to be used for in-person reset. Comment 171 by the Department of State, and duplicate comment 337 by the Federal Public Key Infrastructure Certificate Policy Working Group (FPKI CPWG), asked for a reference to the Privacy Act on the need to protect the retained sample. Similarly, comment 284 by Sam Wilke asked for a reference to authority on retaining biometric information. Comment 243 questioned the need to collect the sample, since “In cases where the same agency issues the PIV card and the derived credential, we would already be in possession of the biometric template.”

(Actually, SP 800-157 allows the issuer of the PIV card and the issuer of the derived credential to be different, but several comments objected to this. I reviewed those comments in Part 1, and argued that NIST is allowing the issuers to be different for reasons that are not applicable.)

In response to the comments, NIST added a footnote stating that “The retained biometric shall be protected in a manner that protects the individual’s privacy”, and allowed the option of comparing the biometric sample used for in-person reset against a stored biometric on the PIV Card, or against biometric data stored in the “chain-of-trust” (i.e. collected for use in a background check before issuance of the PIV card), instead of comparing it against the retained sample. But it declined to remove the requirement to retain a sample, arguing that “The requirement is derived from the common policy and it provides an audit trail for dispute resolution”.

Privacy-preserving biometrics

The use of a biometric sample to activate a cryptographic credential
that can in turn be used for authentication to a remote system avoids
the privacy risk inherent in sending the sample to the remote system.
But other risks remain. In particular, if the sample is compared to a
biometric template that is stored in a mobile device without tamper
protection, the template can be obtained by an adversary who
captures the device. Biometric information in the template can then
be used to construct a biometric sample that matches the template and
can be used to impersonate the user.

Perhaps the best way to avoid biometric privacy risks is to not store
biometrics in mobile devices. So I’m actually sympathetic to the
stance taken by SP 800-157 of not allowing any biometrics to be
carried in mobile devices along with derived credentials. But this is
not a realistic stance. PIV cards use biometrics to perform a variety of functions, as we saw in Part 3, and there is a desire, strongly expressed in the comments, to be able to perform the same functions with a mobile device. Also, the inclusion of software tokens among the options for implementing derived credentials in SP 800-157, in spite of the security challenges that I discussed in Part 2, shows that there is a desire not to be constrained by the unavailability of physical tamper resistance.

So NIST should look for ways of using biometrics without compromising
privacy. There is a class of privacy-preserving biometric authentication techniques that can make that possible. These techniques use the following general approach. An enrollment biometric sample is used to compute helper data, which is later combined with an authentication sample to produce a biometric key. With high probability, the same biometric key is produced consistently by varying but genuine samples; the complement of that probability is akin to the false rejection rate (FRR) of ordinary biometric authentication. Also with high probability, a different key is produced by a non-genuine sample; the complement of that probability is similarly akin to the false acceptance rate (FAR) of ordinary biometric authentication.
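
Concretely, such a technique can be thought of as exposing just two operations to the rest of the system. The sketch below (in Python, with hypothetical function names) only illustrates that contract; the essential point is that only the helper data is ever stored, never the biometric sample or the biometric key.

```python
def enroll(enrollment_sample):
    """Compute (helper_data, biometric_key) from an enrollment sample.

    Only the helper data is stored; the biometric key and the sample
    itself are discarded after use."""
    raise NotImplementedError  # e.g. the fuzzy commitment scheme described below

def reproduce(helper_data, authentication_sample):
    """Recombine stored helper data with a fresh sample to rederive the key.

    A genuine sample reproduces the enrolled key with high probability
    (the complement of that probability is akin to the FRR); a non-genuine
    sample produces a different key with high probability (the complement
    is akin to the FAR)."""
    raise NotImplementedError
```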

For a technique that follows this approach to be privacy-preserving, it must be deemed infeasible to derive significant biometric information that might be useful to an adversary from the helper data. This is the case, for example, in a technique originally proposed by Juels and Wattenberg in their paper entitled A Fuzzy Commitment Scheme, and later successfully applied to authentication with an iris image by Hao, Anderson and Daugman in their paper Combining cryptography with biometrics effectively (available as a technical report). In that technique, the helper data is the exclusive-or of a random codeword of an error-correcting code and a binary feature vector derived from a biometric sample. The random codeword is not a random string, since it contains redundancy. But it is not known how to derive significant biometric information from the fact that the helper data is the exclusive-or of a feature vector with a string that contains such redundancy.

In the technique of Juels et al. and Hao et al., the biometric key is generated at random at enrollment time and expanded into the random codeword by adding redundancy. At authentication time, the helper data is XORed with a feature vector derived from the authentication sample; this produces a string that differs from the codeword only in the positions where the enrollment and authentication feature vectors differ, and the codeword is recovered by applying error correction to that string. The biometric key is then obtained by removing the redundancy from the codeword.
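
The following is a minimal sketch of the fuzzy commitment scheme along these lines, instantiating the two operations sketched earlier. A toy repetition code stands in for the much stronger error-correcting code that a real system would use, and the short bit lists are purely for readability; these concrete choices are illustrative assumptions, not details taken from the Juels and Wattenberg or Hao et al. papers.

```python
import secrets

REPEAT = 5        # toy repetition code: each key bit is repeated 5 times
KEY_BITS = 16     # toy key length; a real system would use a much longer key

def expand(key_bits):
    """Expand the biometric key into a codeword by adding redundancy."""
    return [b for b in key_bits for _ in range(REPEAT)]

def correct_and_compress(noisy_codeword):
    """Apply error correction (majority vote per group) and strip the redundancy."""
    key = []
    for i in range(0, len(noisy_codeword), REPEAT):
        group = noisy_codeword[i:i + REPEAT]
        key.append(1 if sum(group) > REPEAT // 2 else 0)
    return key

def enroll(enrollment_features):
    """Return (helper_data, biometric_key); only the helper data is stored."""
    key = [secrets.randbelow(2) for _ in range(KEY_BITS)]
    codeword = expand(key)
    helper = [c ^ f for c, f in zip(codeword, enrollment_features)]
    return helper, key

def reproduce(helper, authentication_features):
    """Rederive the key: XOR the helper data with the fresh feature vector,
    then error-correct the result back to the enrolled codeword."""
    noisy_codeword = [h ^ f for h, f in zip(helper, authentication_features)]
    return correct_and_compress(noisy_codeword)
```

With these toy parameters, feature vectors are 80 bits long, and a genuine sample reproduces the enrolled key as long as it differs from the enrollment sample in at most 2 of the 5 positions within each group.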

While this technique is well suited to biometric modalities where a sample can be readily translated into a binary feature vector, other privacy-preserving biometric techniques may be better suited to other modalities. For example, the Fuzzy Vault technique of Juels and Sudan may be better suited to modalities where a sample is more readily translated into a set of feature points than into a binary feature vector.

Surveys of privacy-preserving biometric techniques are available here and here.

Using privacy-preserving biometric techniques for credential activation

The simplest way to use privacy-preserving biometric techniques to
activate one or more credentials stored in a mobile device is to store
the helper data and a cryptographic hash of the biometric key in the
device. An activation sample entered by the user is then combined
with the helper data to derive a biometric key. A hash of the derived
key is computed and compared to the stored hash. If they are equal,
the user is allowed to use the credentials.
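
Reusing the hypothetical enroll and reproduce functions from the fuzzy commitment sketch above, the activation check might look as follows; this is a sketch of the simple method just described, not a prescription.

```python
import hashlib
import hmac

def set_up_activation(enrollment_features):
    """Run at enrollment: store only the helper data and a hash of the key."""
    helper, key = enroll(enrollment_features)
    return {"helper": helper, "key_hash": hashlib.sha256(bytes(key)).digest()}

def activate(stored, activation_features):
    """Run at activation: rederive the key from a fresh sample and compare hashes."""
    key = reproduce(stored["helper"], activation_features)
    candidate = hashlib.sha256(bytes(key)).digest()
    # Constant-time comparison; on a match, use of the credentials is allowed.
    return hmac.compare_digest(candidate, stored["key_hash"])
```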

This simple method protects the privacy of the biometric against an
adversary who captures the mobile device and extracts the helper data
and the hash of the biometric key. But it does not protect the
credentials themselves, which can be extracted by the adversary if
there is no physical tamper protection.

A more elaborate method is to store the helper data and encrypt the credentials under the biometric key. This would protect the credentials if the biometric key had sufficient entropy. But unfortunately it does not. In their paper referenced above, Hao et al. estimate that their biometric key has 44 bits of entropy. That’s much more than the 20 bits of entropy (since 10^6 ≈ 2^20) of a 6-digit PIN that the final version of SP 800-157 allows for activation of credentials in a software token unprotected by physical tamper resistance (cf. first paragraph of Section 3.4 and Footnote 12). But it is not nearly enough against an adversary who extracts the encrypted credentials and mounts a brute-force attack against the biometric key using a botnet.
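
To make the method concrete, here is a sketch of encrypting the credentials under the biometric key, again reusing the hypothetical enroll and reproduce functions from the earlier sketches. The choice of PBKDF2 and AES-GCM (via the third-party cryptography package) is illustrative, and, as just explained, key stretching cannot compensate for the limited entropy of the biometric key against an offline brute-force attack.

```python
import os
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def protect_credentials(credentials: bytes, enrollment_features):
    """Encrypt the credentials under a key stretched from the biometric key."""
    helper, bio_key = enroll(enrollment_features)
    salt = os.urandom(16)
    aes_key = hashlib.pbkdf2_hmac("sha256", bytes(bio_key), salt, 600_000)
    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, credentials, None)
    # Everything returned here is stored on the device; an adversary who
    # captures it can still mount an offline brute-force attack on the key.
    return {"helper": helper, "salt": salt, "nonce": nonce, "ciphertext": ciphertext}

def unprotect_credentials(blob, activation_features):
    """Rederive the biometric key from a fresh sample and decrypt the credentials."""
    bio_key = reproduce(blob["helper"], activation_features)
    aes_key = hashlib.pbkdf2_hmac("sha256", bytes(bio_key), blob["salt"], 600_000)
    return AESGCM(aes_key).decrypt(blob["nonce"], blob["ciphertext"], None)
```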

An effective method is to encrypt the credentials under a key that is
entrusted to a key storage service in the cloud, and retrieved from
the cloud using a device-authentication credential to authenticate to
the service. Because it is only used for that particular purpose, the
device-authentication credential can be derived on-demand from a
protocredential stored in the device and a secret not stored in the
device, in a way that deprives an adversary who captures the device
and obtains the protocredential of any information usable for mounting
an offline guessing attack against the non-stored secret (or any
parameter derived from the non-stored secret).

In our comments on the draft version of SP 800-157 we sketched a protocredential-based
activation technique that uses the activation passcode (PIN or
password) as the non-stored secret. The protocredential consists of
the domain parameters of a DSA key pair, a salt, and a device record
handle referring to a record in the key storage service. The
device-authentication credential derived from the passcode and the
protocredential consists of the DSA key pair and the device record
handle.
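
As an illustration of what such a derivation could look like, here is a sketch in which the private key is obtained by hashing the passcode with the salt and reducing the result into the range 1 to q-1; this particular construction is an assumption made for the sketch, not necessarily the one in our comments. Here p, q and g are the DSA domain parameters stored in the protocredential.

```python
import hashlib

def regenerate_key_pair(passcode: str, salt: bytes, p: int, q: int, g: int):
    """Deterministically regenerate a DSA-style key pair from the passcode.

    The protocredential supplies the domain parameters (p, q, g) and the salt;
    the passcode is the non-stored secret."""
    material = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 600_000, dklen=64)
    x = int.from_bytes(material, "big") % (q - 1) + 1   # private key, 1 <= x <= q - 1
    y = pow(g, x, p)                                    # public key
    return x, y

# The derived device-authentication credential is then (x, y) together with the
# device record handle stored in the protocredential.
```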

That technique can be easily modified to use a biometric key in lieu of the passcode, as described in our technical report entitled A Comprehensive Approach to Cryptographic and Biometric Authentication from a Mobile Perspective. In the modified technique, the
non-stored (relative) secret is the biometric sample supplied by the
user, the helper data is part of the protocredential, and the DSA key
pair is regenerated using the biometric key derived from the helper
data and the biometric sample instead of the passcode.

We have recently found an alternative way of deriving a
device-authentication credential from a protocredential and a
non-stored secret, where the non-stored secret may also be either a
passcode or a biometric sample.

If the non-stored secret is a passcode, the protocredential consists
of a device record handle and an uncertified key pair, and the
credential is derived simply by adding the passcode as an additional
component. To authenticate to the key storage service, the device
sends the passcode, the public key component of the key pair, and a
signature on a challenge computed with the private key. The service
verifies the signature, computes a joint hash of the passcode
and the public key, and compares the computed joint hash against a
stored joint hash. An adversary who captures the device may obtain
the key pair, but can only test guesses of the passcode by attempting
to authenticate online to the service, which limits the number of
attempts. An adversary who breaches the security of the key storage
service cannot mount an offline guessing attack against the passcode using the joint hash without knowing the public key.
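
Here is a sketch of the passcode variant of this alternative technique. Ed25519 stands in for the unspecified uncertified key pair, the joint hash is computed as a SHA-256 hash of the passcode and the raw public key, and the device record handle is omitted; all of these are illustrative assumptions rather than details of the technique itself.

```python
import hashlib
import hmac
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey)

def joint_hash(passcode: str, public_key_bytes: bytes) -> bytes:
    return hashlib.sha256(passcode.encode() + public_key_bytes).digest()

# Device side: respond to a challenge from the key storage service.
# The passcode is the non-stored secret; the key pair comes from the
# protocredential; the device record handle is omitted for brevity.
def device_response(passcode: str, private_key: Ed25519PrivateKey, challenge: bytes) -> dict:
    public_key_bytes = private_key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)
    return {
        "passcode": passcode,
        "public_key": public_key_bytes,
        "signature": private_key.sign(challenge),
    }

# Service side: verify the signature, then compare the joint hash of the
# passcode and public key against the stored joint hash. The service stores
# only the joint hash, so a breach of the service does not enable an offline
# guessing attack against the passcode.
def service_verify(response: dict, challenge: bytes, stored_joint_hash: bytes) -> bool:
    try:
        Ed25519PublicKey.from_public_bytes(response["public_key"]).verify(
            response["signature"], challenge)
    except InvalidSignature:
        return False
    computed = joint_hash(response["passcode"], response["public_key"])
    return hmac.compare_digest(computed, stored_joint_hash)
```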

If the non-stored secret is a biometric sample, usable in conjunction
with helper data to derive a biometric key, the protocredential
consists of a device record handle, an uncertified key pair, and the
biometric helper data. The device-authentication credential consists
of the device record handle, the uncertified key pair, and the
biometric key derived from the helper data and the biometric sample.
To authenticate to the key storage service, the device sends the
biometric key, the public key component of the key pair, and a
signature on a challenge computed with the private key. The service
verifies the signature, computes a joint hash of the biometric
key and the public key, and compares the computed joint hash against a
stored joint hash. The biometric key is protected against an offline
guessing attack by an adversary who captures the device or breaches
the security of the key storage service.


2 responses to “Biometrics and Derived Credentials”

  1. Dr. D Shila

    Interesting article! What is the state of privacy-preserving biometrics? Nowadays we are using biometrics on phones to gate access to doors, which challenges privacy. What if someone compromises them? Is there any solution that tackles this problem?

    1. Francisco Corella

      Sorry for taking so long to approve your comment. Somehow I never received a notification that it had been placed in the moderation queue.
      Unfortunately privacy-preserving biometric techniques, also known by names including “revocable biometrics”, “biometric cryptosystems”, “biometric template protection”, etc., have been rarely used in products so far. A company called EyeVerify had a product called EyePrint ID that seemed to use such techniques for a biometric modality that combined eye vasculature with “micro features found in the tear duct and below the lower eyelid”, according to a white paper that can no longer be found online. EyePrint ID may have been based on technology described in US Patent 9,495,588. But after the company was acquired by Ant Financial and renamed Zoloz, its web site now emphasizes spoof detection and makes no reference to privacy-preserving biometric techniques.
      In the laboratory, revocable biometrics have been most successful for iris verification, and some of today’s phones have the infrared cameras that are required for that modality. But the infrared cameras in phones that have them are not accessible to developers, and I don’t know of any efforts by phone manufacturers to implement revocable biometrics.