OpenBadges, W3C VCs, and the Standards That Matter for Your Institution
A non-technical guide to the credential standards landscape - what's open, what's proprietary, what to demand from your platform, and why standards are the single most important question to ask before signing anything.
If you ask the average vendor of a digital credential platform whether their solution is "secure and standards-based," they will say yes. If you ask which specific standards, in which version, with which compliance level, you will start to learn whether they actually mean it.
Standards are the unsexy, load-bearing part of credential infrastructure. They are also the single most important question for any institution evaluating a platform. Get the standards right and your credentials remain useful even if your vendor disappears, your platform changes, or your institution merges with another. Get them wrong and you end up with a million credentials that are a million proprietary problems.
This piece is a non-technical guide to the credential standards landscape - what they are, why each matters, and what to demand from any platform you sign with. It is written for institutional decision-makers who are not cryptographers and don't need to be, but who do need to ask the right questions.
Why standards matter at all
The first thing to internalise: the value of a credential to its recipient comes mostly from how widely it can be verified. A credential that can only be verified through the issuer's own portal is worth less than one that can be verified through any standards-compliant verifier in the world.
Think of it like a passport. A Ghanaian passport is valuable because immigration authorities in 195 countries can read it and confirm it is genuine. A travel document issued by a private company that only the company's own staff can read is, by comparison, useless. The standards make the document useful.
The same logic applies to digital credentials. The W3C Verifiable Credentials standard is the international "passport" format. OpenBadges is the international "membership badge" format. Credentials issued in those formats can be read, displayed, and verified by tens of thousands of independent verifiers - LinkedIn, employer ATSs, government registries, embassy systems, professional body checkers - without any bilateral agreement.
The alternative - proprietary formats, locked to a particular vendor's portal - gives you credentials that are theoretically digital but practically isolated. Your graduates can't share them on LinkedIn natively. Foreign employers can't verify them without your portal being up. The credential's reach is narrowed by every layer of proprietary lock-in. Read more about how this affects African institutions in our piece on why Africa will lead the digital credentials revolution.
The two standards that matter
In 2026, two standards dominate. They overlap, they interoperate, and a good platform supports both.
W3C Verifiable Credentials Data Model
Pronounced "VC" or "verifiable credentials," this is the more general-purpose standard. It defines a structured format for any kind of credential - academic, professional, identity, attestation - and a cryptographic proof system that lets verifiers confirm authenticity without contacting the issuer.
Version 1.0 was finalised as a W3C Recommendation in 2019, and version 2.0 followed as a Recommendation in 2025. It is the foundation of the EU's eIDAS 2.0 digital identity framework, India's national academic credentials system, and a growing number of professional body deployments globally.
What W3C VCs give you:
- A standard JSON-LD structure that any compliant verifier can parse.
- Multiple signature suites supported (Ed25519 is the most common modern default).
- A pluggable revocation mechanism via status registries.
- Support for selective disclosure (a recipient can prove a credential exists without revealing every field - useful for, say, proving you're over 18 without revealing your birthday).
- A formal extension model so industries can add their own fields without breaking interop.
When a vendor says they "support W3C Verifiable Credentials," they should be able to point to the specific data model version (2.0 is current), the signature suite (e.g., Ed25519Signature2020), and the credential schema they use. If they cannot, they probably don't.
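To make those questions concrete, here is a minimal sketch (in Python, with every value illustrative and the proof value faked) of the three things a vendor should be able to point to in one of their own credentials: the data model version in the @context, the signature suite in the proof block, and the structure of the subject fields.

```python
# Illustrative sketch of a W3C Verifiable Credential (Data Model 2.0).
# All identifiers and values below are hypothetical, not a real credential.
credential = {
    "@context": [
        "https://www.w3.org/ns/credentials/v2",  # the VC 2.0 context
    ],
    "type": ["VerifiableCredential"],
    "issuer": "did:web:example-university.edu",   # hypothetical issuer DID
    "validFrom": "2026-01-15T00:00:00Z",
    "credentialSubject": {
        "id": "did:key:z6MkExampleHolder",        # hypothetical subject DID
        "degree": "BSc Computer Science",
    },
    # The proof block is where the signature suite is named.
    "proof": {
        "type": "DataIntegrityProof",
        "cryptosuite": "eddsa-rdfc-2022",         # Ed25519-based suite
        "verificationMethod": "did:web:example-university.edu#key-1",
        "proofValue": "z3FakeSignatureValueForIllustration",
    },
}

# The vendor questions from the text, as checks on a sample credential:
assert "https://www.w3.org/ns/credentials/v2" in credential["@context"]  # 2.0?
assert credential["proof"]["cryptosuite"] == "eddsa-rdfc-2022"           # suite?
```

A buyer does not need to understand the cryptography to ask a vendor to walk through a real credential field by field like this.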
OpenBadges 3.0
OpenBadges started life as a Mozilla project in 2011, focused on representing achievements as portable digital badges. Version 3.0, released in 2023, was a major rewrite that aligned OpenBadges directly with W3C Verifiable Credentials. In effect, an OpenBadge 3.0 is a W3C Verifiable Credential, with additional fields specific to skill and achievement representation.
This alignment is one of the most important things to have happened in the credential standards space in the last five years. It means an OpenBadge 3.0 credential can be verified by any W3C VC verifier and displayed by any OpenBadge consumer. The earlier fragmentation (OpenBadges vs Verifiable Credentials, two separate ecosystems) is now resolved.
What OpenBadges 3.0 gives you on top of W3C VCs:
- A standard structure for representing skills, achievements, and learning outcomes (criteria, alignment to skills frameworks, evidence).
- Native support in major badging and learning platforms (LinkedIn, Credly, Badgr, Canvas).
- Established conventions for image-based badge display, which matters for skills-oriented credentials.
For most institutions issuing credentials in 2026, the right answer is "we support both" - issue credentials as W3C VCs, adding OpenBadges 3.0 fields where the credential type warrants it.
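The "an OpenBadge 3.0 is a W3C VC" point is easiest to see in the data itself. Below is a sketch of the layering; the context URL and all field values are assumed for illustration (check the 1EdTech specification for the exact context version your platform uses).

```python
# Sketch of OpenBadges 3.0 fields layered on top of a plain W3C VC.
# Context URL and values are illustrative, not taken from a real badge.
badge = {
    "@context": [
        "https://www.w3.org/ns/credentials/v2",                  # base VC context
        "https://purl.imsglobal.org/spec/ob/v3p0/context.json",  # assumed OB3 context
    ],
    "type": ["VerifiableCredential", "OpenBadgeCredential"],
    "issuer": {"id": "did:web:example-college.edu", "type": "Profile"},
    "credentialSubject": {
        "type": "AchievementSubject",
        "achievement": {
            "type": "Achievement",
            "name": "Data Analysis Fundamentals",
            # What OB3 adds beyond a bare VC: criteria, skill alignment, evidence.
            "criteria": {"narrative": "Completed all five assessed modules."},
            "alignment": [{
                "targetName": "Data literacy",
                "targetUrl": "https://example.org/frameworks/data-literacy",
            }],
        },
    },
}

# Because an OB3 badge *is* a W3C VC, both kinds of consumers can read it:
assert "VerifiableCredential" in badge["type"]   # any VC verifier
assert "OpenBadgeCredential" in badge["type"]    # any badge consumer
```

The dual `type` array is the whole trick: a generic VC verifier checks the first entry and ignores the achievement fields; a badge consumer recognises the second and renders them.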
The signature standards
A credential standard tells you what shape the credential is. A signature standard tells you what kind of cryptographic proof is attached. This is where vendors sometimes hide weakness.
The current best-practice signature suite for W3C VCs is Ed25519Signature2020 (or its successor eddsa-rdfc-2022). Ed25519 is the elliptic-curve algorithm we covered in detail in our explainer on tamper-proof credentials. It is fast, small, well-vetted, and now standard across modern verifiable credential implementations.
Acceptable alternatives, depending on context:
- JsonWebSignature2020 (JWS-based) - fine, broadly supported, slightly larger signatures.
- BbsBlsSignature2020 - for selective-disclosure use cases, more advanced.
Things to be wary of:
- "Custom" or "proprietary" signatures. There is no good reason for a credential platform to invent its own signature scheme.
- RSA-only platforms. RSA still works but is heavier; new deployments default to Ed25519.
- "We use blockchain to sign" statements without further detail. Sometimes legitimate, often marketing fluff. Ask which specific blockchain, how the signing keys are managed, what happens if the chain forks. The answers are usually less impressive than the headline.
The identifier standards
Credentials need a way to refer to issuers and subjects. The standard mechanism is the Decentralised Identifier (DID), a W3C standard for identifiers that don't depend on a central authority.
A DID looks like did:web:university-of-ghana.edu.gh or did:key:z6MkpTHR8VNs.... The institution publishes a DID document containing its public keys. Verifiers resolve the DID to find the keys.
The two most useful DID methods for institutional credentials:
- did:web - uses the institution's own domain. Simple, hosted by the institution itself. Anyone can verify by fetching the DID document from the institution's website.
- did:key - encodes the public key directly in the identifier. Self-contained. Useful for credentials where the issuer is authoritative on its own.
Either is fine for most institutional use. What matters is that the platform uses W3C-compliant DIDs at all, rather than its own internal identifiers that only its own verifier can resolve.
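The did:web method is simple enough to show in full: resolving the DID is just a deterministic transformation of the identifier into a URL, after which the verifier fetches the DID document like any other web page. A short Python sketch of that transformation, following the method's published rules:

```python
def did_web_to_url(did: str) -> str:
    """Turn a did:web identifier into the URL of its DID document.

    Per the did:web method: the domain (and optional path segments)
    follow 'did:web:', with ':' separating path segments and '%3A'
    percent-encoding an optional port.
    """
    if not did.startswith("did:web:"):
        raise ValueError("not a did:web identifier")
    parts = did[len("did:web:"):].split(":")
    host = parts[0].replace("%3A", ":")  # decode an optional port
    if len(parts) == 1:
        # Bare domain: the document lives at the well-known location.
        return f"https://{host}/.well-known/did.json"
    # Domain plus path segments.
    return f"https://{host}/" + "/".join(parts[1:]) + "/did.json"

print(did_web_to_url("did:web:university-of-ghana.edu.gh"))
# https://university-of-ghana.edu.gh/.well-known/did.json
```

That is the entire resolution step: no blockchain, no vendor API, just the institution's own domain serving a JSON file of public keys.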
What "interoperable" actually means
When a vendor says their credentials are "interoperable," there are at least three different things they might mean:
Read-interop. Credentials issued from this platform can be read and parsed by other standards-compliant verifiers. Easy bar to clear; most platforms claim this.
Verify-interop. Credentials can be cryptographically verified by any third-party verifier without needing this platform's specific tools. This is the meaningful bar. To verify a credential, a third-party verifier needs to (a) parse the credential structure, (b) resolve the issuer's DID and public key, (c) check the signature using a standard algorithm, and (d) check the revocation status. If any of those steps requires the original platform's proprietary infrastructure, you do not have verify-interop.
Display-interop. Credentials can be visually displayed by other systems - LinkedIn, Credly, badging platforms. This typically requires OpenBadges 3.0 or similar. Useful but secondary; a credential that displays beautifully but can't be cryptographically verified is just a picture.
The right test for any platform is: can a verifier I have never spoken to, in a country I have never been to, verify a credential issued by my institution using nothing but standards-compliant open-source tools? If yes, you have meaningful interoperability. If the verifier has to come back to your platform's API to confirm anything, you don't.
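Steps (a) through (d) can be sketched as a pipeline. In the Python sketch below the cryptographic and network steps are stubbed out (a real verifier uses a signature library and fetches the DID document and status list over HTTP); the point it illustrates is that every step runs against open, published data, never against the issuing platform's API.

```python
import json

def resolve_did(did: str) -> dict:
    # Stub: a real resolver fetches the DID document (e.g. did.json for
    # did:web) and returns the issuer's published public keys.
    return {"id": did, "verificationMethod": [{"id": f"{did}#key-1"}]}

def check_signature(credential: dict, did_document: dict) -> bool:
    # Stub: a real check verifies credential["proof"] against a key in
    # the DID document using the named suite (e.g. eddsa-rdfc-2022).
    return "proof" in credential and bool(did_document["verificationMethod"])

def check_status(credential: dict) -> bool:
    # Stub: a real check fetches the published status list and tests
    # this credential's bit; here a flag stands in for that lookup.
    return not credential.get("revoked", False)

def verify(raw: str) -> bool:
    credential = json.loads(raw)                   # (a) parse the structure
    did_doc = resolve_did(credential["issuer"])    # (b) resolve issuer DID
    return (check_signature(credential, did_doc)   # (c) check the signature
            and check_status(credential))          # (d) check revocation

sample = json.dumps({"issuer": "did:web:example.edu",
                     "proof": {"type": "DataIntegrityProof"}})
print(verify(sample))  # True
```

If any of the four functions above would have to call the original platform instead of open infrastructure, the platform fails the verify-interop test.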
Standards-compliant from day one
Avogy issues credentials as W3C Verifiable Credentials with OpenBadges 3.0 alignment, signed with Ed25519, anchored to your institutional DID. Verifiable by anyone, anywhere, forever.
See the standards page
What the standards do not do
It is worth being clear about what standards don't solve.
Standards do not stop fraud. They make fraud detectable. The signature on a credential proves it was issued by the institution, not that the institution issued it correctly. If a registrar makes a mistake, or is bribed, the credential is still cryptographically valid - it just records an institutional error. Standards make the audit trail clean; they do not eliminate institutional governance failures.
Standards do not guarantee long-term verifiability automatically. They make long-term verifiability possible if the institution publishes its public key durably. If your institution stops publishing its key, or your domain expires, verification will start failing for credentials older than the key cutover. Best practice is to publish public keys in multiple places (institutional website, sectoral registry, archive services) and to commit to the long-term publication of historical keys.
Standards do not make the recipient experience automatically good. They give you the building blocks. The platform on top still has to build a clean claim flow, a useful sharing experience, and a friendly verification page. A standards-compliant platform with a terrible UX is still a terrible platform; it just at least gives you the freedom to migrate to a better one.
A buyer's checklist
Here is the short list of things to ask any platform you are evaluating, with the right kind of answer in brackets:
- Which version of W3C Verifiable Credentials do you support? (2.0 or later.)
- Do you support OpenBadges 3.0? (Yes, with examples.)
- Which signature suites do you use? (Ed25519Signature2020 or eddsa-rdfc-2022, primarily.)
- Can a third-party verifier confirm credentials issued through your platform without using your own tools? (Yes, with documentation pointing to standard verification libraries.)
- Where are issuer DIDs hosted, and how are public keys published? (did:web on the institution's domain, with optional registry mirrors.)
- How is revocation handled? (Status List 2021 or equivalent, checked by verifiers at verification time.)
- If your platform shut down tomorrow, would credentials issued through it remain verifiable? (Yes, because public keys are independently published.)
- Can you provide an example credential I can verify using a third-party open-source tool right now? (Yes - they should be able to do this on the call.)
The point of this checklist is not to make procurement painful. It is to filter out platforms whose business models depend on lock-in. The reputable vendors will breeze through these questions; the ones who hesitate are telling you something important.
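The Status List mechanism named in the checklist is worth seeing concretely, because it explains why revocation does not require calling the issuer: the issuer publishes one compressed bitstring covering many credentials, and a verifier reads a single bit. A self-contained Python sketch of that pattern (the list is built locally for illustration rather than fetched from a real issuer):

```python
import base64
import gzip

def encode_status_list(bits: bytes) -> str:
    # Issuer side: compress the bitstring and base64url-encode it.
    return base64.urlsafe_b64encode(gzip.compress(bits)).rstrip(b"=").decode()

def is_revoked(encoded_list: str, index: int) -> bool:
    # Verifier side: decode, decompress, and test one bit.
    padded = encoded_list + "=" * (-len(encoded_list) % 4)
    bits = gzip.decompress(base64.urlsafe_b64decode(padded))
    byte, offset = divmod(index, 8)
    return bool(bits[byte] >> (7 - offset) & 1)  # most-significant bit first

status_bits = bytearray(16384)   # room for 131,072 credentials
status_bits[0] = 0b01000000      # revoke the credential at index 1
encoded = encode_status_list(bytes(status_bits))

print(is_revoked(encoded, 0))  # False
print(is_revoked(encoded, 1))  # True
```

Because the encoded list is just a published file, it stays checkable even if the issuing platform disappears - exactly the property the checklist is probing for.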
A note on national and regional standards
Several governments have proposed national credential standards. South Africa's National Qualifications Framework (NQF). Kenya's revised TVET Qualifications Framework. Ghana's draft National Qualifications Framework, in early discussion. Some of these are useful reference points for skill mapping. None of them should replace the international technical standards.
The reason is interoperability. A credential that is "compliant" with a national framework but not with W3C VCs is verifiable only within that country. A credential that is W3C VC-compliant and maps to the relevant national framework gets you both - international verifiability and national alignment.
Be wary of any national initiative that proposes building a centralised national credential platform with non-standard formats. These initiatives are well-intentioned and almost always fail. Ten years from now, the country will be migrating off the bespoke platform onto open standards. Better to start there.
What about ISO?
ISO has standards in the credential and identity space - ISO/IEC 18013-5 for mobile driving licences, ISO/IEC 23220 for mobile identity building blocks. These overlap with W3C VCs in some areas. For institutional educational credentials, W3C VCs and OpenBadges 3.0 are the relevant standards. ISO formats become relevant for government-issued identity credentials (mobile IDs, residence permits, and so on), a related but separate category.
What this means for your institution
Standards-readiness is not a technical detail. It is the difference between a credential that compounds in value over time - accepted by more verifiers every year, integrating with more platforms, surviving vendor changes - and one that quietly depreciates as the proprietary ecosystem around it shifts.
The institutions that get this right are doing three things:
Demanding open standards in procurement. Not as a checkbox but as a hard requirement, validated by third-party verification tests during evaluation.
Publishing their public keys durably. On their own website, in sectoral registries, in archive services. Treating the public key as part of the institution's permanent identity, not a vendor configuration.
Educating their leadership. Making sure the registrar, the IT lead, and the governing board understand why standards matter, so the next procurement decision doesn't undo the work of the last one.
These steps are not expensive. They are mostly a matter of asking the right questions at the right time. The institutions that ask them are the ones whose credentials still verify in 2050.
A short closing thought
Standards are quiet and unsexy. They are also, by far, the most important determinant of whether your institution's investment in digital credentials pays off over decades.
A platform with great UX and proprietary formats is a problem dressed up as a solution. A platform with adequate UX and full open-standards compliance is the foundation of a real institutional credential strategy.
Choose accordingly. Ask the questions. Verify the answers with a third-party tool. The technology is open. The standards are clear. The hardest thing your institution has to do is insist on getting them right.
Related reading: The anatomy of a tamper-proof digital credential, From paper to pixel: a step-by-step guide for issuing institutions, Why Africa will lead the digital credentials revolution.
Ready to issue tamper-proof credentials?
Avogy helps universities, professional bodies and training providers issue credentials anyone can verify in seconds - no login, no app, no phone calls.
Start issuing credentials