Why your PII never touches our servers.
A technical walkthrough of the Secure Vault — the hardware-encrypted on-device store that holds every Clikkin member's identity, keys, wallet, and badges — and the query/response protocol that lets spaces ask narrow questions without ever seeing the underlying data.
The protocol in one diagram: the server asks a narrow question, the vault answers true or false. The underlying DOB never leaves the device.
Most "privacy-first" platforms are privacy-adjacent. They collect your data, store it, call the storage "encrypted at rest" (it is — with keys they hold), and then write a policy that says "we take your privacy seriously."
Clikkin took a different approach. We asked a dumber, harder question: what if we just don't have the data?
The problem with "encrypted at rest"
When a platform holds your data — even encrypted — it holds a latent liability. A rogue employee can access it. A subpoena can compel it. A breach can leak it. A policy change three years from now can expose it. The platform has the keys. You don't. The data's fate is not yours to decide.
So we built a different protocol.
The platform does not hold the data. The platform asks the device a question. The device answers.
Enter the Secure Vault
The Secure Vault is a hardware-encrypted on-device store. On iOS, encryption keys live in the Secure Enclave — a dedicated processor isolated from the main CPU, with its own secure boot chain and no software access to its private keys. On Android, they live in StrongBox, the equivalent tamper-resistant hardware module on devices that support it.
Inside the vault:
- Identity. Name, date of birth, country, document hashes, face embedding. PII from Clikkin Passport verification.
- Keys. The master key that derives per-space encryption keys. Admin keys for any space you administrate.
- Wallet. Coin balance, transaction history, cash-wallet token (Pro Plus+).
- Badges. Every badge you hold, the role graph, the Consumer Badge Vault that logs every query.
Everything in that list is encrypted with AES-256-GCM under keys that never leave the secure hardware. The plaintext of your DOB exists in exactly two places: in the original government document you scanned, and for one brief moment in RAM while the vault is answering a query. It never exists on our servers, and it never transits them. There is no feature flag we could toggle to start collecting it — the ingestion pipeline doesn't exist.
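To make the storage layer concrete, here is a minimal sketch in TypeScript. It is a software model for illustration only: on a real device the master key is generated inside the Secure Enclave or StrongBox and is never readable like this. The names `deriveSpaceKey`, `seal`, and `open` are hypothetical, and Node's `crypto` module stands in for the hardware.

```typescript
import { createCipheriv, createDecipheriv, hkdfSync, randomBytes } from "node:crypto";

// Modeled in software for illustration; in the vault this key never leaves hardware.
const masterKey = randomBytes(32);

// Derive a per-space key from the master key via HKDF-SHA-256.
function deriveSpaceKey(spaceId: string): Buffer {
  return Buffer.from(hkdfSync("sha256", masterKey, Buffer.alloc(0), `space:${spaceId}`, 32));
}

// AES-256-GCM seal/open for a single vault field.
function seal(key: Buffer, plaintext: string): { iv: Buffer; ct: Buffer; tag: Buffer } {
  const iv = randomBytes(12); // fresh nonce per encryption
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ct = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, ct, tag: cipher.getAuthTag() };
}

function open(key: Buffer, box: { iv: Buffer; ct: Buffer; tag: Buffer }): string {
  const decipher = createDecipheriv("aes-256-gcm", key, box.iv);
  decipher.setAuthTag(box.tag); // decryption fails if ciphertext was tampered with
  return Buffer.concat([decipher.update(box.ct), decipher.final()]).toString("utf8");
}

const key = deriveSpaceKey("maker-garage");
const box = seal(key, "1999-04-12"); // the DOB is plaintext only this long
const dob = open(key, box);
```

The point of the per-space derivation is that compromising one space's key reveals nothing about any other space's data or about the master key.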
The query/response protocol
So how does anything work? Spaces need to know things. A 21+ space needs to know if you're of age. A seller-enabled space needs to know if you've completed KYB. A paid-events space needs to know if your wallet can cover the ticket.
They ask the vault.
```javascript
// Space-side: "is this member old enough?"
const answer = await vault.query({
  predicate: "age_gte",
  arg: 21,
  reason: "space_age_policy",
  consent_scope: "one_shot"
});
// answer === true | false
```
Three things matter here:
- The space specifies a predicate, not a field. It can ask age_gte(21) — not "give me the date of birth." The vault only responds to a whitelist of questions.
- The response is a boolean or a minimal scalar. Never a document. Never an embedding. Never the raw DOB.
- The consent scope is explicit. one_shot means the answer is used exactly once and not cached. session means cached for one session. persistent requires a distinct consent UI the first time and can be revoked from the member's settings at any time.
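On the vault side, the whitelist idea can be sketched as a dispatcher that only knows how to answer named predicates. This is an illustrative model, not the shipped implementation; `VaultRecord`, `answerQuery`, and the predicate table are hypothetical names.

```typescript
// Hypothetical vault-side dispatcher: only whitelisted predicates run,
// and each returns a boolean -- never the underlying field.
type VaultRecord = { dateOfBirth: Date; kybComplete: boolean; coinBalance: number };

const predicates: Record<string, (rec: VaultRecord, arg: number) => boolean> = {
  // Age in whole years is computed inside the vault; the DOB itself never leaves.
  age_gte: (rec, years) => {
    const now = new Date();
    let age = now.getFullYear() - rec.dateOfBirth.getFullYear();
    const beforeBirthday =
      now.getMonth() < rec.dateOfBirth.getMonth() ||
      (now.getMonth() === rec.dateOfBirth.getMonth() && now.getDate() < rec.dateOfBirth.getDate());
    if (beforeBirthday) age -= 1;
    return age >= years;
  },
  balance_gte: (rec, amount) => rec.coinBalance >= amount,
};

function answerQuery(rec: VaultRecord, predicate: string, arg: number): boolean {
  const fn = predicates[predicate];
  if (!fn) throw new Error(`predicate not whitelisted: ${predicate}`);
  return fn(rec, arg);
}
```

Note the shape of the failure mode: a question that isn't on the whitelist doesn't return a partial answer, it doesn't run at all.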
What the server actually sees
When a space queries the vault, the Clikkin server relays the request and the boolean reply. The request includes an opaque query ID, a space ID, a predicate name, and a zero-knowledge commitment to the argument (so the server can't inspect "what value was being checked" across queries). The reply is a signed boolean plus a timestamp.
A full dump of our servers, taken at any moment, would yield a stream of (space_id, predicate_name, boolean) tuples. Not one PII field. Not one identity document. Not one face embedding.
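A sketch of the server-visible record, with a simple hash commitment standing in for the full commitment scheme (the `RelayedQuery` and `commit` names are hypothetical, and a production design would use a proper binding/hiding commitment rather than this simplification):

```typescript
import { createHash, randomBytes } from "node:crypto";

// Hypothetical shape of the only thing the relay ever sees per query.
interface RelayedQuery {
  queryId: string;        // opaque, random
  spaceId: string;
  predicate: string;      // e.g. "age_gte"
  argCommitment: string;  // hash(arg || nonce): hides the checked value
  answer: boolean;        // signed by the device in the real protocol
  timestamp: number;
}

// A simple hash commitment for illustration: without the nonce, the server
// cannot test guesses for the argument, and identical arguments produce
// unrelated-looking commitments across queries.
function commit(arg: number): { commitment: string; nonce: string } {
  const nonce = randomBytes(16).toString("hex"); // kept on the device
  const commitment = createHash("sha256").update(`${arg}:${nonce}`).digest("hex");
  return { commitment, nonce };
}
```

Because each commitment is salted with a fresh nonce, the server cannot even correlate "two spaces checked the same threshold" across queries.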
The Consumer Badge Vault: every query, logged
Here's the transparency half of the pair. Every query a space makes is logged — on the member's device, in a dedicated log inside the vault called the Consumer Badge Vault (CBV). The log is append-only, signed by the Secure Enclave, and surfaces as a notification the space cannot suppress.
"Maker Garage asked the vault: are you 18+? Yes." That notification lands. The member can tap into the log and see every query any space has ever made, when they made it, what consent scope, what the answer was.
Spaces cannot hide behind automation. Every auto-query carries the invoking space's identity. The human moderator whose click triggered a moderation query is attributed by name.
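One way to get the append-only property is hash chaining, sketched below. This is an illustrative model under assumed names (`CbvEntry`, `appendEntry`, `verifyChain`); on device, each entry would additionally carry a Secure Enclave signature, which is omitted here.

```typescript
import { createHash } from "node:crypto";

// Hypothetical append-only query log: each entry hashes the previous one,
// so deleting or reordering past entries breaks the chain.
interface CbvEntry {
  spaceId: string;
  predicate: string;
  answer: boolean;
  at: number;
  prevHash: string;
  hash: string;
}

const log: CbvEntry[] = [];

function appendEntry(spaceId: string, predicate: string, answer: boolean): CbvEntry {
  const prevHash = log.length ? log[log.length - 1].hash : "genesis";
  const at = Date.now();
  const hash = createHash("sha256")
    .update(`${prevHash}|${spaceId}|${predicate}|${answer}|${at}`)
    .digest("hex");
  const entry = { spaceId, predicate, answer, at, prevHash, hash };
  log.push(entry);
  return entry;
}

// Anyone holding the log can re-derive every hash and detect tampering.
function verifyChain(entries: CbvEntry[]): boolean {
  let prev = "genesis";
  for (const e of entries) {
    const expected = createHash("sha256")
      .update(`${prev}|${e.spaceId}|${e.predicate}|${e.answer}|${e.at}`)
      .digest("hex");
    if (e.prevHash !== prev || e.hash !== expected) return false;
    prev = e.hash;
  }
  return true;
}
```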
"But what about private posts?"
Good question. Private post classes — the ones where a space admin restricts visibility to a sub-group of members — can't live only in query-land. The post content itself has to be stored somewhere, and the members who are allowed to read it need to be able to.
Our answer: per-space symmetric keys, held on the space's admin devices, distributed to entitled members via the standard end-to-end envelope protocol. Posts are encrypted client-side before upload. Our servers store ciphertext. We have no keys. Another full-server dump yields: metadata (post ID, author space, author hash, timestamp) and a blob of AEAD ciphertext.
A subpoena for decryption of private-class content fails at the technical level. We do not have the keys. We cannot derive them. We cannot be compelled to produce what we do not have.
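The client-side flow for a private post can be sketched like this. The `StoredPost` shape and `encryptPost` are hypothetical names chosen for illustration; the envelope protocol that distributes `spaceKey` to entitled members is out of scope here.

```typescript
import { createCipheriv, randomBytes } from "node:crypto";

// Hypothetical server record for a private-class post:
// metadata plus an AEAD blob, and nothing the server can decrypt.
interface StoredPost {
  postId: string;
  spaceId: string;
  authorHash: string;  // not the author's identity
  timestamp: number;
  iv: string;
  ciphertext: string;  // AES-256-GCM blob; the server holds no key
  tag: string;
}

function encryptPost(spaceKey: Buffer, spaceId: string, authorHash: string, body: string): StoredPost {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", spaceKey, iv);
  const ct = Buffer.concat([cipher.update(body, "utf8"), cipher.final()]);
  return {
    postId: randomBytes(8).toString("hex"),
    spaceId,
    authorHash,
    timestamp: Date.now(),
    iv: iv.toString("base64"),
    ciphertext: ct.toString("base64"),
    tag: cipher.getAuthTag().toString("base64"),
  };
}

const spaceKey = randomBytes(32); // distributed to entitled members, never uploaded
const record = encryptPost(spaceKey, "space-42", "a1b2c3", "members-only announcement");
```

Everything in `record` except the ciphertext is exactly the metadata listed above, which is also exactly what a server dump would yield.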
What we can be compelled to produce
To be honest about the whole picture, here is everything we do hold:
- The public posts on public-class spaces, which are public by definition.
- The hashed email address you signed up with (for account recovery).
- A boolean indicating whether your account is Passport-verified.
- A few HMACs that let us verify a member is who they claim without learning who they are.
- The metadata above (space IDs, timestamps, post IDs).
Those are the honest bounds. If you receive a government data request we fulfilled, this is the maximum surface area of what could have been produced.
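The HMAC item deserves a sketch of its own. The model below is an assumption about the general shape (names like `enroll` and `verify` are hypothetical): the server stores only a keyed tag of a token that lives in the device vault, so it can confirm "same member as before" without holding anything that identifies the person.

```typescript
import { createHmac, randomBytes, timingSafeEqual } from "node:crypto";

// Illustrative model: at signup the server stores only
// HMAC(serverKey, memberToken); the token itself stays on the device.
const serverKey = randomBytes(32);

function enroll(memberToken: Buffer): Buffer {
  return createHmac("sha256", serverKey).update(memberToken).digest();
}

// Later, the device presents the token; the server recomputes the tag
// and compares in constant time.
function verify(memberToken: Buffer, storedTag: Buffer): boolean {
  const tag = createHmac("sha256", serverKey).update(memberToken).digest();
  return timingSafeEqual(tag, storedTag);
}

const token = randomBytes(32); // lives in the device vault
const stored = enroll(token);  // the only thing the server keeps
```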
Why the hardware matters
A principled design for the query protocol could be defeated by a software-only keystore. If the keys were stored in the OS keychain — even encrypted — a privileged process or a malicious OS update could exfiltrate them. The Secure Enclave and StrongBox close that door. Keys are generated inside the secure hardware during vault provisioning and never come out. Queries are evaluated inside the hardware; only the boolean leaves.
This does mean Clikkin requires a device with the supported hardware. iPhone 8 and newer. Android devices with StrongBox (common on flagships 2020+, spottier on budget devices). We're not shy about this tradeoff: we'd rather support fewer devices well than pretend our privacy model holds on hardware it doesn't.
The principle, restated
Our first principle says: "PII belongs to the member, not the platform." The Secure Vault is what that principle becomes in code. Privacy isn't a policy we promise to honor. It's an architecture that makes the promise redundant.
You don't trust us. You trust the math, the hardware, and the fact that we literally cannot betray you without first fabricating encryption keys we don't possess.
Anya Petrova is Clikkin's platform architect. She writes about the bits of the system where the principles become physics.