A recap of the QualitaX webinar featuring Claus Skaaning, CEO of DigiShares, and Sebastián Rodríguez, VP of Product at Polygon ID.
One of the most overlooked friction points in the tokenisation of real-world assets is not the asset itself — it is the investor standing in front of it. Every time someone wants to access a new tokenisation platform, they go through KYC again. They upload the same documents, wait for the same verification, and repeat the process as many times as there are platforms they want to use. For a market that promises frictionless access to previously illiquid assets, this is a significant contradiction. In a recent QualitaX webinar, Sebastian Rodriguez (VP of Product at Polygon ID) and Claus Skaaning (CEO of DigiShares), joined by blockchain developer Martin Hot’ka, explored how the Decentralised Identity for Tokenization (DITO) framework is attempting to solve this — and why getting identity standardisation right is as important as getting the asset tokenisation right.
The Identity Problem Is Political, Not Technical
Sebastian opened with an answer that reframes the entire conversation: centralised identity systems are not technically broken. KYC providers, identity verification platforms, and centralised credential stores work well within their own boundaries. The problem is what happens when you try to build an open, interoperable financial ecosystem on top of a foundation where every identity is controlled by a single private company.
The illustration he offered is one most people immediately understand: think about your Gmail or Apple email address. It has become your de facto online identity across hundreds of services. Now imagine being permanently locked out of it overnight. Today that would be disruptive. But as we move towards digital economies where government services, financial accounts, investment platforms, and regulated transactions all depend on verified digital identity — concentrating that power in a handful of private companies becomes untenable. The challenge is not technical. It is about who controls infrastructure that the entire economy depends on.
This concern is not hypothetical. It maps directly onto the strategic concerns of regulators in the EU, UK, Singapore, and Hong Kong about the concentration of critical digital infrastructure in the hands of a small number of predominantly US-based technology companies. Decentralised identity is, at its core, a response to that political reality.
What the Tokenisation Market Actually Needs
From the DigiShares perspective — with a global client network of close to 200 companies using its platform — Claus outlined the consistent requirements coming from the tokenisation industry:
Automation and cost reduction are the first tier. The manual processes in asset issuance, compliance management, and investor onboarding represent a significant operational burden that tokenisation should be able to eliminate or dramatically reduce.
Fractionalisation is the second driver, and it operates at different scales depending on the client. Large private equity firms want to reduce minimum ticket sizes from $1 million down to $50,000 — opening institutional products to a wider qualified investor base. Real estate developers want to go further still, down to €100 or $100, enabling genuinely retail participation in asset classes that have historically been inaccessible.
Liquidity is the holy grail that the market has not yet fully delivered. The major exchanges — London Stock Exchange, SIX Swiss Exchange, and others — are now building tokenised marketplaces, and secondary market liquidity is coming. But it is arriving more slowly than many anticipated.
On the identity side, Sebastian framed the core challenge around reusable KYC. Today, every financial platform that requires compliance verification is an island. KYC is expensive — far more expensive than most people realise, encompassing not just face and document verification but creditworthiness assessment, sanctions screening, PEP checks, and more. This cost has a direct impact on financial inclusion: the onboarding cost for a small business or individual in a lower-income economy can exceed the revenue opportunity they represent, so they are simply excluded. The ideal outcome is a world where a user completes KYC once, receives a verifiable credential, and can present that credential to any platform that accepts it — without re-uploading documents or re-running checks.
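The issue-once, verify-anywhere pattern Sebastian describes can be sketched in a few lines. This is a toy illustration, assuming an HMAC shared secret as a stand-in for the issuer's real digital signature (production systems use asymmetric signatures); the function names and key are hypothetical.

```python
import hashlib
import hmac
import json

ISSUER_KEY = b"demo-issuer-secret"  # hypothetical signing key, illustration only

def issue_credential(subject_did: str, claims: dict) -> dict:
    """KYC provider issues a signed credential once, after verification."""
    payload = {"subject": subject_did, "claims": claims}
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    return {**payload, "signature": sig}

def verify_credential(credential: dict) -> bool:
    """Any platform that trusts the issuer verifies the same credential,
    without re-running the underlying KYC checks."""
    payload = {k: v for k, v in credential.items() if k != "signature"}
    body = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

# The holder presents this one credential to platform A and platform B;
# neither platform re-uploads documents or re-runs checks.
cred = issue_credential("did:example:alice", {"kycPassed": True})
print(verify_credential(cred))  # True
```

The point of the sketch is the cost structure: the expensive verification happens once at issuance, and every subsequent platform performs only a cheap signature check.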
How the DITO Framework Works
The DITO framework — announced by DigiShares in partnership with Polygon ID shortly before this webinar — is designed to make reusable KYC a practical reality for the tokenisation ecosystem specifically. It has two core components.
Verifiable credential schema: A schema is a structured definition of what a credential contains — which fields, what data types, what formats. For KYC credentials to be interoperable across platforms, everyone needs to be using the same schema. The DITO framework defines that schema for the tokenisation context, publishing it on IPFS (the decentralised storage network) so that it is permanent, immutable, and owned by no single party. Once published, the schema becomes a public good — no company can modify it, take it down, or use it as a commercial lever. Forks are possible, but the original remains as the agreed baseline.
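A schema of this kind might look like the following sketch. The field names here are hypothetical — the real DITO schema lives on IPFS and is not reproduced in this recap — but the content-addressing step shows why IPFS publication makes the schema immutable: its identifier is a hash of its bytes, so any modification produces a different identifier rather than silently replacing the original.

```python
import hashlib
import json

# Hypothetical sketch of a KYC credential schema; field names illustrative.
KYC_SCHEMA = {
    "type": "KYCCredential",
    "fields": {
        "kycPassed": "boolean",
        "countryCode": "string",      # e.g. ISO 3166-1 alpha-2
        "verificationLevel": "integer",
    },
}

# Content-addressing mimics IPFS: the identifier is derived from the bytes,
# so no party can change the schema while keeping the same identifier.
schema_bytes = json.dumps(KYC_SCHEMA, sort_keys=True).encode()
schema_id = hashlib.sha256(schema_bytes).hexdigest()
print(schema_id[:16])  # stable for this exact schema content
```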
Trust registry: A schema alone is not sufficient for interoperability. A platform receiving a credential also needs to know whether the issuer of that credential is trustworthy — i.e., whether they are a legitimate, compliant KYC provider in the relevant jurisdiction. The DITO framework is building a trust registry that governs which issuers are recognised as trusted credential providers. Governance of the registry is separate from DigiShares and Polygon ID — the aim is a structure where no single company controls who is included or excluded.
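Conceptually, a trust registry is a governed lookup from issuer identifiers to where they are recognised as compliant. The following is a minimal sketch under that assumption — the DIDs and the per-jurisdiction structure are illustrative, not the actual DITO registry format.

```python
# Hypothetical trust registry: issuer DID -> jurisdictions where the
# issuer is recognised as a compliant KYC provider. Identifiers invented.
TRUST_REGISTRY = {
    "did:example:kyc-provider-eu": {"jurisdictions": {"DE", "FR", "DK"}},
    "did:example:kyc-provider-sg": {"jurisdictions": {"SG"}},
}

def is_trusted_issuer(issuer_did: str, jurisdiction: str) -> bool:
    """A platform accepts a credential only if its issuer is recognised
    as compliant in the relevant jurisdiction."""
    entry = TRUST_REGISTRY.get(issuer_did)
    return entry is not None and jurisdiction in entry["jurisdictions"]

print(is_trusted_issuer("did:example:kyc-provider-eu", "DE"))  # True
print(is_trusted_issuer("did:example:kyc-provider-sg", "DE"))  # False
```

The governance question the webinar raises is precisely who controls writes to this mapping — the stated aim is that no single company does.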
The on-chain verification flow: Martin walked through the technical implementation. An investor onboards to a platform, completes KYC with a recognised provider, and receives a verifiable credential stored in their identity wallet. When they subsequently join a second platform that participates in the DITO framework, they present their credential — and the platform verifies it on-chain without ever accessing the underlying KYC data. The investor’s documents and personal information never leave their wallet. The platform only receives cryptographic proof that the credential is valid and was issued by a trusted provider.
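The flow Martin describes can be modelled in miniature. This sketch assumes the proof is an opaque token checked by a stand-in verifier (the real system uses on-chain cryptographic verification); what it illustrates is the privacy boundary — the platform sees only the issuer reference and the proof, never the claims or documents held in the wallet.

```python
# Illustrative presentation flow; all identifiers and the proof token
# are invented for the sketch.
TRUSTED_ISSUERS = {"did:example:kyc-provider"}

def wallet_build_presentation(credential: dict) -> dict:
    """The wallet discloses only what verification requires."""
    return {
        "issuer": credential["issuer"],
        "proof": credential["proof"],  # cryptographic proof, no personal data
    }

def platform_verify(presentation: dict, check_proof) -> bool:
    """Platform-side check: trusted issuer + valid proof. The platform
    never receives or stores the underlying KYC data."""
    return (
        presentation["issuer"] in TRUSTED_ISSUERS
        and check_proof(presentation["proof"])
    )

credential = {
    "issuer": "did:example:kyc-provider",
    "claims": {"name": "Alice", "document": "passport-scan-bytes"},  # stays in wallet
    "proof": "valid-proof-token",
}
presentation = wallet_build_presentation(credential)
assert "claims" not in presentation  # documents never leave the wallet
print(platform_verify(presentation, lambda p: p == "valid-proof-token"))  # True
```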
Zero-Knowledge Proofs: Verification Without Exposure
The cryptographic mechanism enabling this privacy-preserving verification is zero-knowledge proofs (ZKPs) — a core component of the Polygon ID stack, in development for over four years.
Martin offered the clearest explanation of what ZKPs mean in practice: imagine buying an age-restricted product in a shop. In today’s world, you hand over your ID and the cashier sees your full name, date of birth, address, and everything else on the document. With a zero-knowledge proof, the cashier receives a cryptographic proof that you are above the required age — and nothing else. The proof is mathematically verifiable. The cashier cannot infer your exact birth date, your name, or any other attribute. The identity holder discloses only what is necessary for the specific transaction.
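The principle of proving something without revealing the underlying secret can be shown with a textbook Schnorr proof of knowledge (made non-interactive via the Fiat–Shamir heuristic): the prover demonstrates knowledge of x such that y = gˣ mod p without ever disclosing x. This is a minimal classroom sketch with demo-sized parameters — not the zkSNARK circuits Polygon ID actually uses — but the "verify without exposure" property is the same.

```python
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime; demo-sized, not production-grade
G = 3           # generator choice for illustration

def prove(x: int) -> tuple[int, int, int]:
    """Return (y, t, s): public value, commitment, and response."""
    y = pow(G, x, P)
    r = secrets.randbelow(P - 1)              # one-time nonce
    t = pow(G, r, P)                          # commitment
    c = int.from_bytes(hashlib.sha256(f"{t}:{y}".encode()).digest(), "big")
    s = (r + c * x) % (P - 1)                 # response; x never leaves the prover
    return y, t, s

def verify(y: int, t: int, s: int) -> bool:
    """Check g^s == t * y^c (mod p) without ever learning x."""
    c = int.from_bytes(hashlib.sha256(f"{t}:{y}".encode()).digest(), "big")
    return pow(G, s, P) == (t * pow(y, c, P)) % P

secret = secrets.randbelow(P - 1)
print(verify(*prove(secret)))  # True — the verifier never saw `secret`
```

The verification equation holds because gˢ = g^(r+cx) = gʳ · (gˣ)ᶜ = t · yᶜ, while s alone reveals nothing useful about x because r is random.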
Applied to financial services compliance, this is transformative. A platform can verify that an investor has passed KYC, is not on a sanctions list, and meets the eligibility criteria for a specific asset — without ever receiving or storing the investor’s personal data. This has direct implications for GDPR compliance: because the verifier never has access to the underlying credential data, the data minimisation and storage limitation requirements of GDPR are satisfied by design.
The Compliance Architecture: Authorisation, Freeze, and Governance
A thread running through the webinar was how the DITO framework interacts with the regulatory requirements of real securities markets. DigiShares primarily tokenises real estate assets through SPVs — meaning the tokens represent shares in a legal entity that holds the underlying property. This is inherently a regulated activity in every jurisdiction.
Claus was direct about the limits of decentralisation in this context: a regulated securities issuer cannot fully decentralise the management of a share register. Regulators require a defined party to be responsible for maintaining the cap table. KYC cannot be removed from the process. Wallet whitelisting is required. The DITO framework operates within these constraints rather than trying to circumvent them — decentralising what can be decentralised (credential storage, verification logic, schema governance) while maintaining the legal accountability structures that regulated assets require.
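The compliance gates Claus describes — whitelisting, a designated accountable party, the ability to freeze — can be sketched as the rule a token contract would enforce before moving shares. The class and method names below are hypothetical, a minimal illustration rather than DigiShares' actual implementation.

```python
class ComplianceRegistry:
    """Illustrative transfer-gating logic for a tokenised security."""

    def __init__(self, controller: str):
        self.controller = controller       # the legally accountable party
        self.whitelist: set[str] = set()
        self.frozen: set[str] = set()

    def add_to_whitelist(self, wallet: str) -> None:
        """Only wallets that passed KYC/eligibility checks are added."""
        self.whitelist.add(wallet)

    def freeze(self, caller: str, wallet: str) -> None:
        """Regulators require a defined party able to intervene."""
        if caller != self.controller:
            raise PermissionError("only the controller may freeze accounts")
        self.frozen.add(wallet)

    def can_transfer(self, sender: str, recipient: str) -> bool:
        """Transfers succeed only between whitelisted, unfrozen wallets."""
        both_whitelisted = {sender, recipient} <= self.whitelist
        neither_frozen = not ({sender, recipient} & self.frozen)
        return both_whitelisted and neither_frozen

reg = ComplianceRegistry(controller="transfer-agent")
reg.add_to_whitelist("alice")
reg.add_to_whitelist("bob")
print(reg.can_transfer("alice", "bob"))    # True
print(reg.can_transfer("alice", "carol"))  # False — carol not whitelisted
```

Note how the decentralised pieces (credential storage, verification) sit outside this class, while the discretionary powers remain with a named controller — exactly the split the framework aims for.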
Sebastian’s framing on the decentralisation question was the most conceptually useful: at the infrastructure layer, decentralisation matters primarily because it prevents censorship. At the application layer, the more relevant concept is trustlessness — not whether a platform is technically decentralised, but whether its architecture prevents it from doing the wrong thing. A platform that is auditable, transparent, and whose behaviour is enforced by smart contract logic rather than discretionary human decisions offers meaningful trust guarantees even if it operates within a centralised compliance framework. Trustlessness, he argued, is what satisfies the underlying concern that regulators and investors have about centralised systems.
Interoperability: The Last Mile Problem
Sebastian identified what he called the standards last mile as the central challenge facing identity interoperability in financial services. Numerous standards exist — verifiable credentials, DID methods, schema formats — but standards by themselves do not create interoperability. Someone has to make the specific, often unglamorous decisions about field naming conventions, data formats, and implementation details that allow two different systems to actually exchange data without errors. Those decisions tend not to get made at the standards body level, which means they fall through the gaps and fragment the market.
The DITO framework is explicitly trying to do this last-mile work. By convening KYC providers, tokenisation platforms, and wallet providers around a single published schema and a governed trust registry, it is making the specific implementation decisions that allow the abstract standard to become real interoperability. Neither DigiShares nor Polygon ID claims ownership of the outcome — the stated objective is a schema that belongs to the industry, governed as a public good, that any participant can implement.
On EVM compatibility: the framework currently works across all EVM-compatible chains, including Ethereum, Polygon, and others. It does not currently support non-EVM chains, which is a meaningful constraint as the tokenisation market spans multiple blockchain ecosystems.
Regulatory Compatibility: A Work in Progress
An audience question raised the European Digital Identity Wallet and its Architecture and Reference Framework (EUDI ARF) — the EU’s own blueprint for digital identity infrastructure, which is expected to become mandatory across member states. Sebastian’s answer was candid: the DITO framework is not currently compliant with the EUDI ARF, primarily because that framework has changed significantly multiple times and has not yet reached a stable final version. Polygon ID’s approach has been to build on the underlying protocol, adopt standards once they are mature, and then align with regulatory frameworks once those frameworks are stable enough to build against. Compatibility with the EUDI ARF is on the roadmap but has not yet been implemented.
The broader point, which Claus reinforced, is that the national and regional digital identity schemes emerging globally are unlikely to converge on a single global standard. The EU’s framework, Singapore’s approach, and other national schemes will each reflect local regulatory requirements and political priorities. The DITO framework is designed to operate across jurisdictions — precisely because no single national scheme will serve global tokenisation markets.
What’s Coming: Release 7 and the Open Finance Vision
Polygon ID Release 7 (Polygon ID ships releases on a roughly two-month cadence) has a strong focus on user experience — bringing the technology closer to mainstream usability. The longer-term roadmap centres on what Sebastian described as the open finance vision: a world where all the credentials and records needed to interact with financial services are held in one place, under the user’s control, with consent management that allows the user to selectively share specific data with specific services. Upcoming work includes multi-device credential management, integration with popular authentication methods (MetaMask and others), and a shift from assuming a technically sophisticated web3 user to designing for a much broader general audience.
DigiShares is tracking every new Polygon ID release in near real-time, integrating new features as they become available, and pushing the DITO standard out to its full client network of close to 200 companies. An exchange for trading tokenised real estate assets is also in development, which will incorporate the DITO framework as the identity layer for investor onboarding and secondary market participation.
Key Takeaways
- The centralised identity problem is political, not technical: concentration of identity infrastructure in private companies is incompatible with open, interoperable digital economies
- Reusable KYC — complete verification once, use the credential across multiple platforms — is the core objective the DITO framework is designed to deliver
- The framework has two essential components: a publicly governed, immutable credential schema (published on IPFS) and a trust registry defining which issuers are recognised as compliant credential providers
- Zero-knowledge proofs enable platforms to verify investor credentials without accessing the underlying personal data — GDPR compliance by design
- Trustlessness matters more than decentralisation at the application layer: a regulated securities platform cannot be fully decentralised, but it can be architected so its behaviour is mathematically verifiable rather than discretionary
- The DITO framework is designed as a public good — no single company controls the schema, the governance, or the commercial terms
- EVM-compatible chains are supported; non-EVM chains are not currently in scope
- Regulatory alignment with EUDI ARF is on the roadmap but not yet implemented — the framework is built to adapt as regulatory standards stabilise
- The standards last mile — the specific, unglamorous implementation decisions that turn abstract standards into real interoperability — is precisely what the DITO framework is designed to deliver