OSW 2024

Agenda Friday

Talks & Tutorials Friday


Some implementation experiences from Norway

Jørgen Binningsbø (Norwegian Digitalisation Agency)


The Norwegian Digitalisation Agency operates three authorization servers: one for citizens logging in to public services, one for machine-to-machine communication between the public and private sectors, and one for business representation cases. In all three ASes, electronic IDs and electronic seals conforming to the European eIDAS regulation are key to identification and authentication.


We will give a short account of how we've built these systems, which typical use cases they cover, and how we used (or probably misused) the specifications to solve our use cases. Specifically, we'll cover the use of x5c in JWT bearer grants, our microservice architecture leveraging an internal custom PAR optimization, some RAR implementation experiences in OIDC (not OAuth 2...), and finally our plans to utilize RAR to achieve more fine-grained authorization for machines as well.
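To make the x5c technique concrete, here is a minimal sketch (not the agency's actual implementation) of an RFC 7523 JWT bearer assertion whose JOSE header carries the client's certificate in x5c, assuming Python with the PyJWT and cryptography libraries; the client identifier, token endpoint, and file names are illustrative assumptions:

    # Sketch: a JWT bearer assertion (RFC 7523) whose header carries the
    # client's certificate in "x5c", so the AS can validate the enterprise
    # certificate directly. Assumes PyJWT + cryptography; the client
    # identifier, endpoint URL, and file names are hypothetical.
    import base64
    import time
    import uuid

    import jwt  # PyJWT
    from cryptography import x509
    from cryptography.hazmat.primitives import serialization

    TOKEN_ENDPOINT = "https://as.example.no/token"  # hypothetical

    # Load the client's private key and certificate (PEM on disk).
    with open("client-key.pem", "rb") as f:
        private_key = serialization.load_pem_private_key(f.read(), password=None)
    with open("client-cert.pem", "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())

    # x5c entries are base64 (not base64url) DER certificates per RFC 7515.
    x5c = [base64.b64encode(cert.public_bytes(serialization.Encoding.DER)).decode()]

    now = int(time.time())
    assertion = jwt.encode(
        {
            "iss": "example-client",  # hypothetical client identifier
            "sub": "example-client",
            "aud": TOKEN_ENDPOINT,
            "iat": now,
            "exp": now + 120,
            "jti": str(uuid.uuid4()),
        },
        private_key,
        algorithm="RS256",
        headers={"x5c": x5c},
    )

    # The assertion is then POSTed to the token endpoint as:
    #   grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer&assertion=<assertion>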



Fully-Specified Algorithms for JOSE and COSE

Mike Jones (Self-Issued Consulting, Connect2id)

The IANA algorithm registries for JOSE and COSE contain two kinds of algorithm identifiers:


- Fully Specified: Those that fully determine the cryptographic operations to be performed, including any curve, key derivation function (KDF), hash functions, etc. Examples are RS256 and ES256K in both JOSE and COSE, and ES256 in JOSE.


- Polymorphic: Those requiring information beyond the algorithm identifier to determine the cryptographic operations to be performed. Such additional information could include the actual key value and the curve that it uses. Examples are EdDSA in both JOSE and COSE, and ES256 in COSE.

This matters because many protocols negotiate supported operations using only algorithm identifiers. For instance, OAuth Authorization Server Metadata uses negotiation parameters like this:


"token_endpoint_auth_signing_alg_values_supported": ["RS256", "ES256"]


OpenID Connect Discovery, W3C Web Authentication, and FIDO2 likewise perform negotiations using algorithm identifiers. This does not work for polymorphic algorithms. For instance, with EdDSA, you do not know which of the curves Ed25519 and Ed448 are supported! This causes real problems in practice.
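To illustrate, compare metadata advertising the polymorphic EdDSA identifier with metadata using the fully-specified per-curve identifiers (Ed25519 and Ed448) proposed in the working group's draft:

"token_endpoint_auth_signing_alg_values_supported": ["EdDSA"]

versus

"token_endpoint_auth_signing_alg_values_supported": ["Ed25519", "Ed448"]

With the first, a client cannot tell which curve or curves the server actually supports; with the second, it can.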


In this session, I will report on the work in the IETF JOSE working group to move both JOSE and COSE to fully-specified algorithms. I also want to hear which classes of algorithm problems attendees believe we do and don't need to solve in this work. For instance, do we need specific algorithm identifiers for Elliptic Curve Diffie-Hellman (ECDH) key agreement with particular kinds of ephemeral keys? You tell me! I expect a highly interactive session!


High-security & interoperable OAuth 2: What’s the latest?

Joseph Heenan, Daniel Fett (Authlete)

OAuth is a widely used authorization framework that enables third-party applications to access resources on behalf of a user. However, it has historically been difficult to meet very high security and interoperability requirements when using OAuth. Daniel and Joseph have spent much of the last six years working to improve the state of the art and will present the latest developments in the field.


There are challenges when trying to achieve high security and interoperability with OAuth 2: There are many potential threats, some not part of the original OAuth threat model. For seamless authorizations, optionality must be minimized in OAuth itself and also in any extensions used.


Seven years ago, the IETF OAuth working group began work on the Security Best Current Practice document and, more recently, on OAuth 2.1. Meanwhile, the OpenID Foundation has created the FAPI 1.0 and FAPI 2.0 security profiles.


We will help you understand the focus of each document and when to use which. We will show how to achieve on-the-wire interoperability and security using techniques like asymmetric client authentication and sender-constraining via DPoP and MTLS, discussing the benefits and potential disadvantages of each, and we will highlight the benefits for implementers and the role of conformance testing tools.
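For a flavour of what sender-constraining looks like on the wire, here is a minimal sketch of a client minting a DPoP proof (RFC 9449), assuming Python with PyJWT and cryptography; the token endpoint URL is an illustrative assumption:

    # Sketch: minting a DPoP proof JWT (RFC 9449) to sender-constrain an
    # access token. Assumes PyJWT + cryptography; the endpoint URL is
    # hypothetical.
    import json
    import time
    import uuid

    import jwt  # PyJWT
    from jwt.algorithms import ECAlgorithm
    from cryptography.hazmat.primitives.asymmetric import ec

    # Client-held key pair; the public half goes into the proof header.
    private_key = ec.generate_private_key(ec.SECP256R1())
    public_jwk = json.loads(ECAlgorithm.to_jwk(private_key.public_key()))

    proof = jwt.encode(
        {
            "jti": str(uuid.uuid4()),               # unique per proof
            "htm": "POST",                          # HTTP method of the request
            "htu": "https://as.example.com/token",  # target URI (hypothetical)
            "iat": int(time.time()),
        },
        private_key,
        algorithm="ES256",
        headers={"typ": "dpop+jwt", "jwk": public_jwk},
    )

    # Sent as the "DPoP" HTTP header; the AS binds the issued access token
    # to this key, so a stolen bearer copy alone is useless.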


Detecting Workload Identity Theft with Transaction Binding

Pieter Kasselman (Microsoft)

The increasing prevalence of cloud computing and microservice architectures has led to complex software functions being built and deployed as workloads in Kubernetes or virtual machines. Workloads are frequently over-permissioned and under-governed. As a result, large numbers of workloads have more access than they need, and they often persist even when no longer used. If an attacker can assume a workload identity, they can obtain broad access and make lateral moves, all at a low risk of being detected. This makes compromising a workload identity very attractive.


One way in which workload identities are compromised is by stealing their credentials and replaying them outside the context of the original transaction in an attempt to gain access to additional resources. Today these credentials are often bearer tokens such as JWT-SVIDs, whose only protection is that they are relatively short-lived. A recipient of such a token has no way to know whether it is being replayed or presented in the context of the original transaction.


In this talk we will consider the scenarios and requirements for different proof mechanisms that workloads can use to demonstrate that a request is being presented in the context of a specific transaction. We will explore approaches that build on existing mechanisms such as DPoP but are optimised for workloads, for different types of transport, and for widely deployed workload identity token types such as SPIFFE SVIDs. This will serve as a point of departure for a discussion on how to secure these workload identities, how we can use standardised, formally verified approaches to achieve this, and which new standards we may need to create in new working groups such as Workload Identity in Multi System Environments (WIMSE) in the IETF.
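To ground the discussion, here is one purely illustrative shape such a proof could take, loosely modelled on DPoP; the claim names (txn, svh) are invented for this sketch and are not part of any standard (Python with PyJWT and cryptography):

    # Purely illustrative sketch of one possible transaction-binding proof,
    # loosely modelled on DPoP: the workload signs a short-lived JWT that
    # names the transaction and a hash of the JWT-SVID it accompanies.
    # Nothing here is a standardised format; the claim names are assumptions.
    import base64
    import hashlib
    import time
    import uuid

    import jwt  # PyJWT
    from cryptography.hazmat.primitives.asymmetric import ec

    workload_key = ec.generate_private_key(ec.SECP256R1())  # workload-held key

    def transaction_proof(svid_jwt: str, transaction_id: str) -> str:
        # Hash of the presented credential, analogous to DPoP's "ath" claim.
        svid_hash = base64.urlsafe_b64encode(
            hashlib.sha256(svid_jwt.encode()).digest()
        ).rstrip(b"=").decode()
        now = int(time.time())
        return jwt.encode(
            {
                "jti": str(uuid.uuid4()),  # replay detection
                "txn": transaction_id,     # hypothetical transaction claim
                "svh": svid_hash,          # hypothetical credential-hash claim
                "iat": now,
                "exp": now + 60,           # valid only for this transaction window
            },
            workload_key,
            algorithm="ES256",
        )

    # A recipient that knows the transaction id can reject a replayed SVID
    # presented outside that transaction's context.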


OpenID for Verifiable Credentials: Achieving interoperability, security and scalability

Joseph Heenan (OpenID Foundation)

Digital wallets and verifiable credentials are currently a hot topic in many jurisdictions around the world, with work ongoing in ISO, the EU, Japan, the USA, and many more that leverages the OpenID Foundation's (OIDF) OpenID for Verifiable Credentials standards. OIDF has a history of creating conformance tests and certification programmes for OpenID standards that can be used to ensure that ecosystems made up of many potentially divergent implementations of a standard can scale up quickly.


OIDF is currently working on tests for the OpenID for Verifiable Presentations, OpenID for Verifiable Credential Issuance, and OpenID4VC High Assurance Interoperability Profile (HAIP) specifications to ensure that deployments of these protocols are both interoperable and correctly implement the security properties. Joseph will talk about the approach being taken, demonstrate the progress to date, share the future roadmap, and show how implementors can run the current tests.


The Past, Present, and Possible Future of Proof of Possession in OAuth

Justin Richer (Bespoke Engineering)

Sending an OAuth access token is one of the most important parts of the protocol, and key to its simplicity for developers. OAuth 1 had signed requests, OAuth 2 gave us bearer tokens, and many of us have used MTLS and DPoP. But the history of OAuth token presentation is littered with other attempts, and the future might move things in a different direction altogether. What is out there, and what is coming?