Face Matching Without the Privacy Risk: How Biometric Ephemerality Works

Biometric face matching doesn't have to mean surveillance. Ephemerality — process in RAM, discard in under a second — changes everything.

[Diagram: biometric data processed in RAM and discarded after verification]

Face matching in visitor management has a perception problem — and it’s deserved. The phrase “facial recognition” evokes surveillance cameras cataloging faces, databases accumulating biometric templates, and the privacy nightmares that follow when those databases are breached, sold, or subpoenaed.

But face matching and facial recognition surveillance are architecturally different operations. The privacy risk isn’t in the matching — it’s in the storage.

Two kinds of face matching

Surveillance matching (1:N) compares a captured face against a database of faces to identify an unknown person. “Who is this?” This is what cameras in public spaces do. It requires maintaining a database of biometric templates — and that database is the privacy liability. It’s what BIPA, GDPR, and Texas CUBI are designed to regulate.

Verification matching (1:1) compares a captured face against a single claimed identity to confirm the person is who they say they are. “Is this person the person whose digital ID they just presented?” This is what passport control does manually today — the officer looks at the photo on the passport and looks at the person. The only difference is that the comparison is automated and cryptographic instead of visual and subjective.

KeyShare’s visitor management uses 1:1 verification matching — not 1:N surveillance. The visitor presents their digital ID (which includes a photo). The system captures a live image and compares it against the single photo from the verified credential. The question is narrow and specific: “Does this face match the photo on this digital ID?”
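The narrowness of the 1:1 question can be sketched in a few lines. Everything here is illustrative — the function names, the similarity placeholder, and the 0.85 threshold are our assumptions for the sketch, not KeyShare’s actual API or tuning:

```python
from dataclasses import dataclass

# Assumed tuning parameter for the sketch; real systems calibrate this
# against false-accept / false-reject targets.
SIMILARITY_THRESHOLD = 0.85

@dataclass
class MatchResult:
    is_match: bool
    confidence: float

def similarity(live_image: bytes, credential_photo: bytes) -> float:
    """Placeholder for a face-comparison model returning a score in [0, 1].

    A real system would run a face-matching model here; for illustration,
    identical bytes count as a perfect match.
    """
    return 1.0 if live_image == credential_photo else 0.0

def verify_face(live_image: bytes, credential_photo: bytes) -> MatchResult:
    # The 1:1 question: does THIS face match THIS credential's photo?
    # There is no database lookup and no "who is this?" search.
    score = similarity(live_image, credential_photo)
    return MatchResult(is_match=score >= SIMILARITY_THRESHOLD, confidence=score)
```

Note what is absent: no gallery of faces, no enrollment database, no search. The only inputs are the live capture and the single photo carried by the credential.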

The ephemerality architecture

The privacy architecture of the 1:1 face match is defined by what happens to the biometric data after the comparison.

In a non-ephemeral system, the face template (or the raw image) is stored — in a database, on a server, in the cloud. That stored data becomes a liability: subject to breach, subpoena, FOIA requests, and regulatory action.

In an ephemeral system, the biometric data is processed in RAM and discarded before it ever reaches persistent storage. The face image is captured, compared against the credential photo, and deleted — all within a single second. No template is generated. No image is saved. No biometric database exists.

The KeyShare Visitor Experience Platform implements biometric ephemerality:

  • Face data is processed on the local VEP instance (VEP Local) — never transmitted to the cloud, never sent to a third-party service.
  • The comparison happens in volatile memory (RAM). The image is deleted from RAM upon comparison completion — designed to occur in under one second.
  • No biometric template is generated. No face embedding is stored. No image is written to disk.
  • The audit trail records that a face match occurred (timestamp, result: match/no-match, confidence score) — but the audit trail does not contain the face image or any biometric data.

After the one-second comparison, the system holds exactly what a human receptionist would remember: “I checked this person’s ID and they looked like their photo.” The biometric data itself is gone.
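The verify-and-discard flow above can be sketched as a single function whose scope is the lifetime of the biometric data. The function and field names are ours, not KeyShare’s; the point is the shape — capture, compare, discard, and return an audit record that carries no biometric data:

```python
import time
from datetime import datetime, timezone

def ephemeral_face_match(capture_fn, credential_photo: bytes, compare_fn) -> dict:
    """Illustrative sketch: biometric data lives only inside this call."""
    start = time.monotonic()
    live_image = capture_fn()   # held only in this function's scope (RAM)
    score = compare_fn(live_image, credential_photo)
    del live_image              # discard the biometric data immediately
    # The audit record carries NO image, template, or embedding --
    # only the outcome of the comparison.
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "result": "match" if score >= 0.85 else "no-match",
        "confidence": score,
        "elapsed_s": time.monotonic() - start,  # design target: under 1 second
    }
```

Nothing the function returns — and nothing it leaves behind — can reconstruct the face. The audit trail proves a comparison happened without retaining what was compared.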

The regulatory landscape

Biometric privacy regulation is expanding rapidly. The relevant frameworks:

BIPA (Illinois). The Biometric Information Privacy Act requires written consent before collecting biometric identifiers, prohibits sale of biometric data, mandates a retention schedule, and provides a private right of action with statutory damages ($1,000 per negligent violation, $5,000 per intentional or reckless violation). BIPA is the most aggressive biometric privacy law in the US.

GDPR (EU). Under GDPR, biometric data processed for identification is “special category” data requiring explicit consent (Article 9(2)(a)). However, 1:1 verification matching (confirming a claimed identity) may fall under a different legal basis than 1:N identification matching. The legal analysis is nuanced — consult with your DPO.

Texas CUBI. The Capture or Use of Biometric Identifier Act prohibits capturing biometric identifiers for commercial purposes without consent. It is enforced by the Texas Attorney General; there is no private right of action.

CCPA/CPRA (California). Biometric information is included in the definition of personal information. Consumers have the right to know what biometric data is collected and to request deletion.

Ephemeral biometric processing addresses the core concern of all four frameworks: data is not collected, stored, or retained. The face image exists in volatile memory for under one second and is never written to persistent storage. There is no biometric database to breach, sell, or disclose.

This doesn’t eliminate regulatory obligations — consent requirements still apply in most jurisdictions. But it fundamentally changes the risk profile. The regulatory liability of processing biometric data in RAM for one second and discarding it is categorically different from maintaining a database of biometric templates.

The practical deployment

For buildings deploying visitor management with face matching:

Consent. The visitor is informed that a face match will be performed before the match occurs. The consent is captured electronically as part of the Puck interaction — timestamped, logged, and auditable. Visitors can decline the face match; the system proceeds with identity verification only (no face comparison).
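The ordering matters: consent is captured and logged before any biometric processing, and declining falls back to identity verification without a face comparison. A minimal sketch of that gate, with hypothetical names of our own:

```python
from datetime import datetime, timezone

def check_in(visitor_id: str, consented: bool, audit_log: list) -> str:
    """Illustrative consent gate: log the decision, then branch."""
    # Consent (or refusal) is recorded BEFORE any face capture occurs.
    audit_log.append({
        "visitor": visitor_id,
        "event": "face_match_consent",
        "granted": consented,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    if not consented:
        # Visitor declined: proceed with digital-ID verification only.
        return "identity_verification_only"
    return "identity_verification_with_face_match"
```

Because the consent record is timestamped and logged whether or not the visitor agrees, the audit trail documents the decision itself, not just the matches that occurred.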

On-premise processing. All face matching runs on VEP Local — a local compute instance at the building. No biometric data leaves the customer’s network. No cloud processing. No third-party API calls. This is a deployment constraint that matters for customers in regulated industries (financial services, healthcare, government).

Configurable. Face matching is optional per site. Buildings can enable identity verification (digital ID check) without face matching. Face matching is an additional layer for sites that require higher assurance — corporate headquarters, data centers, facilities with classified areas.
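Per-site configurability might look like the following sketch. The keys and profile names are illustrative assumptions, not KeyShare’s actual configuration schema — the point is that face matching is a layer toggled on top of identity verification, defaulting to off:

```python
# Hypothetical per-site profiles; schema and names are ours for illustration.
SITE_PROFILES = {
    "branch-office": {"identity_verification": True, "face_match": False},
    "data-center":   {"identity_verification": True, "face_match": True},
}

def requires_face_match(site: str) -> bool:
    """Unknown sites fall back to identity verification only."""
    default = {"identity_verification": True, "face_match": False}
    return SITE_PROFILES.get(site, default)["face_match"]
```

A conservative default (identity verification without face matching) means enabling the higher-assurance layer is always an explicit, per-site decision.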

The same privacy-first architecture applies to the broader visitor workflow — from the 15-minute badge problem to the zero-PII reader architecture that ensures no identity data persists at the door edge.

See the full visitor management solution →

Written by Simon Forster

Simon Forster is the CTO of KeyShare. He contributes to identity and access standards through the NFID Foundation.