zkTLS is a new technology that empowers applications to prove data integrity and source identity for any content retrieved from websites using TLS; in short, it makes HTTPS connections verifiable.
The use cases that best show its power are those in which private credentials are used to retrieve data from a server (bank balances, emails, social media and so on), claims are made about that private data, and sharing the credentials so that a second party can attest those claims is not an option. Here zkTLS comes to the rescue: it enables a second party to attest those claims without ever receiving the credentials or seeing the private data. This creates huge opportunities, bringing trust to data without any change to the existing infrastructure and opening new ways of bridging trusted Web2 data into the Web3 world.
The backbone of most (if not all) zkTLS implementations is TLSNotary, a project with a long history going back 10 years and recently rewritten with modern cryptography by a team backed by the Ethereum Foundation.
TLSNotary creates cryptographic proofs of authenticity for any data on the web, enabling privacy-preserving data provenance. It does so by turning the TLS connection from a dialogue between a receiver (usually called the Prover) and a Server into a three-party protocol. In this protocol, the Prover and a third party (called the Verifier) use multi-party computation (MPC) to jointly act as the receiving endpoint in the communication with the Server. The Verifier takes part in the trusted connection setup, checking the Server's certificates to make sure the content really comes from the claimed source, and stays involved throughout the TLS session, generating commitments to the data inside the MPC and vouching for the retrieval process without ever seeing the actual data.
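As an illustration only: in the real protocol the TLS record keys are derived inside an MPC between Prover and Verifier, but the commitment idea at its core can be sketched in a few lines of Python. Everything below (the `commit` helper, the records, the salts) is invented for this sketch and is not TLSNotary's actual API.

```python
import hashlib
import os

# Illustrative sketch only: the real protocol derives TLS record keys
# inside an MPC between Prover and Verifier (garbled circuits plus
# oblivious transfer); here we model just the commitment idea.

def commit(record: bytes, salt: bytes) -> str:
    """Salted SHA-256 commitment to one TLS ciphertext record."""
    return hashlib.sha256(salt + record).hexdigest()

# The Verifier sees only ciphertext records flowing through the session:
ciphertext_records = [os.urandom(64), os.urandom(64)]
salts = [os.urandom(16) for _ in ciphertext_records]
commitments = [commit(r, s) for r, s in zip(ciphertext_records, salts)]

# Later, the Prover can open any record against its commitment,
# proving it was part of the attested session:
assert commit(ciphertext_records[0], salts[0]) == commitments[0]
```

The salted commitments bind the Verifier's view of the session to specific ciphertexts without revealing anything about their content.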
To make the verification process non-interactive and proofs portable, and to take on the burden of the heavy computation, a Notary can act as a general-purpose Verifier and participate in the MPC-TLS session. The Notary can sign attestations of the generated commitments, enabling offline verification and any number of later checks by other verifiers.
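The attestation step can be sketched as follows. This is a hypothetical illustration, not the TLSNotary format: real notaries use asymmetric signatures (e.g. ECDSA or EdDSA) so anyone holding the Notary's public key can verify; HMAC is used here only as a stdlib stand-in, and all names and values are invented.

```python
import hashlib
import hmac
import json

# Stand-in for the Notary's signing key (a real Notary would use an
# asymmetric key pair, publishing the public half for verifiers).
NOTARY_KEY = b"notary-secret-key"

def notarize(server_name: str, commitments: list[str]) -> dict:
    """Notary signs the session commitments plus the server identity."""
    attestation = {"server": server_name, "commitments": commitments}
    payload = json.dumps(attestation, sort_keys=True).encode()
    sig = hmac.new(NOTARY_KEY, payload, hashlib.sha256).hexdigest()
    return {"attestation": attestation, "signature": sig}

def verify_offline(signed: dict) -> bool:
    """Any later verifier can check the attestation, no live TLS needed."""
    payload = json.dumps(signed["attestation"], sort_keys=True).encode()
    expected = hmac.new(NOTARY_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed["signature"])

signed = notarize("api.bank.example", ["ab12...", "cd34..."])
assert verify_offline(signed)
```

Because the signature covers both the server identity and the commitments, a portable attestation can be checked any number of times, by any number of verifiers, long after the TLS session is over.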
An important point is that the TLS communication involves three parties and makes use of garbled circuits, oblivious transfer and other cryptographic primitives, so the data exchange can happen only if the Prover and Verifier work together, while the Server sees no difference from a regular TLS connection. Another relevant point is that the traffic is encrypted: in this setup the Prover can obtain the decrypted content only with the Verifier in the loop, and yet the plaintext is never revealed to the Verifier. This small detail is of paramount importance, and it is what makes zkTLS disruptive: the Verifier does not see the retrieved data unless the Prover chooses to share it, but it can still certify the process. Parts of the content can be redacted, allowing the Prover to selectively disclose data to the Verifier if desired. Zero-knowledge proofs can then be generated for the redacted parts, enabling the Prover to make statements about the data while deriving trust from the Notary and the TLS protocol and fully maintaining privacy.
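The selective-disclosure idea can be made concrete with per-chunk commitments: the transcript is split into chunks, each committed to separately, and the Prover opens only the chunks it wants to reveal. The chunking, values and helper below are invented for this sketch; real TLSNotary additionally supports ZK proofs about the redacted parts.

```python
import hashlib
import os

def commit(chunk: bytes, salt: bytes) -> str:
    """Salted SHA-256 commitment to one transcript chunk."""
    return hashlib.sha256(salt + chunk).hexdigest()

# A toy HTTP transcript split into chunks (values invented):
transcript = [
    b"GET /balance HTTP/1.1",          # request line
    b"Authorization: Bearer SECRET",   # private credential
    b'{"balance": 4200}',              # response body
]
salts = [os.urandom(16) for _ in transcript]
committed = [commit(c, s) for c, s in zip(transcript, salts)]

# The Prover reveals the request line and the response body,
# but redacts the credential header (index 1 is never opened):
disclosed = {0: (transcript[0], salts[0]), 2: (transcript[2], salts[2])}

# A verifier checks each opening against the attested commitments:
for i, (chunk, salt) in disclosed.items():
    assert commit(chunk, salt) == committed[i]
```

The redacted chunk stays hidden yet remains cryptographically bound by its commitment, which is what lets ZK proofs later make statements about it without revealing it.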
Teams like Opacity, VLayer, Eternis, Pluto, DECO, PADO Labs, zkPass, clique.social, Gandalf Network and Reclaim Protocol are pushing the boundaries of what's possible today, building zkTLS solutions with various approaches, all with trust in mind, something sorely needed in a world where generating content with AI has become cheap and trivial.
Building a permissioned network for everything credit-related, and trying to restart un(der)collateralized lending on healthy foundations, has trust at its core. Accountable takes a privacy-first approach: our software is deployed on premises or in our users' private clouds. It is a decentralized protocol meant to sit at the heart of crypto credit while keeping all sensitive data private, but we all know that this data needs to be trusted somehow.
We make no compromise on security, which is why Accountable is not a SaaS model: borrowers and lenders, as well as other actors, own their data; we are not a party that stores it for them, and we never see their private information. Because of this approach, we need means to trust the data our software works with and passes around in a peer-to-peer fashion. That's why we use trusted connectors to retrieve balances from sources (custodians, CeFi exchanges, banks, on-chain), that's why we run our node and connectors in secure enclaves using SGX, and that's why we need zkTLS. It allows us, for example, to prove that certain balances were received from a source that is indeed the one it claims to be, and that those balances were not tampered with. In addition to letting us run on untrusted hardware, it provides both integrity and source-identity proofs, elevating trust in the reports Accountable generates to a very high level.
At Accountable, we view verifiability as a spectrum. We recognize that it is not possible to transition from mere claims to ZK proof-backed numbers overnight. Therefore, we aggregate data with varying levels of verifiability and report the results accordingly. Verifiability becomes a new risk dimension that people can monitor over time. Our goal is to increase verifiability by all means possible, ultimately achieving ZK-backed "truth." zkTLS is a crucial tool in reaching this objective.
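To make the spectrum idea concrete, here is a hypothetical sketch of reporting an aggregate alongside its verifiability breakdown. The tier names, weights and balances are all invented for illustration; they are not Accountable's actual scoring model.

```python
# Hypothetical verifiability tiers, from mere claims to ZK-backed truth:
TIERS = {"self_reported": 0, "api_attested": 1, "zktls_proven": 2}

# Invented sample balances, each tagged with its verifiability tier:
balances = [
    {"source": "exchange_a", "amount": 1000.0, "tier": "zktls_proven"},
    {"source": "bank_b", "amount": 500.0, "tier": "api_attested"},
    {"source": "manual_entry", "amount": 250.0, "tier": "self_reported"},
]

total = sum(b["amount"] for b in balances)
proven = sum(b["amount"] for b in balances if TIERS[b["tier"]] == 2)

# Report the aggregate together with its verifiability as a risk dimension:
print(f"total={total}, zk-proven share={proven / total:.0%}")
```

Reporting the proven share next to the total lets observers track verifiability as its own risk dimension over time, rather than treating all inputs as equally trustworthy.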
Some might argue that the guarantees obtainable with zkTLS are enough. Why doesn't Accountable rely on it exclusively? There are several reasons.
First, there are concerns related to security rules that entities running Accountable must enforce. Including a foreign node in the trusted connection setup phase is not always desirable. One way to mitigate this is to run the TLSNotary node on the lender's side (in which case the Verifier and the Notary are the same party) or to have it run by another trusted and regulated party. This approach has its own drawbacks, though, and some might argue it would be better if no external machine were necessary at all.
Then there is protocol risk: what if a vulnerability in the zkTLS implementation or in TLSNotary allows the data to be sniffed? There may well be no such vulnerability lying around, and a future implementation might withstand even a quantum attack, but protocol risk remains something that cannot be dismissed and needs to be accounted for.
Another issue is that the trust obtained via zkTLS is inherently trust via consensus. It is obtained without the other party seeing the raw data, so privacy is preserved, but it still relies on another entity vouching for the claims. Moreover, the MPC phase is a two-party computation, so only a single external Verifier is involved in the protocol, which makes the system theoretically vulnerable to collusion. There are various ways to alleviate this concern, be it reputation systems, random selection or multiple retrieval rounds (when the content does not change), but any such solution is weaker than trust from the source, which is ultimately what we should aim for.