OSW 2024

Agenda Thursday

Talks & Tutorials Thursday

End-to-End Identities for Humans and Machines

Jonas Primbs (University of Tübingen)

Workload identities enable service authentication to improve security, especially in multi-cloud environments. We extend the concept of workload identities to make them interoperable with user identities based on OpenID Connect (OIDC), enabling mutual authentication between workloads and users. In this talk, we introduce our approach and present applications such as end-to-end encryption through untrusted proxies, service-to-client integrity verification, and user-to-service authentication.

The central idea is the concept of an Identity Certification Token (ICT), a JSON Web Token (JWT) that serves as a short-term certificate [1]. An ICT contains the public key of its owner, whose possession of the corresponding private key is verified by the authority that issued the ICT. For users, this authority is an OpenID Provider (OP) that verifies the user’s credentials. For services, this authority is an Attestation Server (AS) that verifies the software integrity of a service.

When a client connects to a backend service via HTTP(S), the client and the service exchange their ICTs and a proof of possession for their private keys. If both are valid, the client and service are mutually authenticated at the application layer. The client can then trust the integrity of the microservice, even if it is accessed through an untrusted proxy, such as an enterprise HTTP proxy, or the load balancer or reverse proxy of an untrusted public cloud or CDN provider. The service can trust the identity of the user even if potentially malicious middleware or other services are installed upstream. Depending on the confidentiality requirements of the further communication, the client and the service can either continue with signed communication or perform a key exchange to continue with signed and encrypted communication.
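The ICT exchange sketched above can be illustrated in a few lines of Python. This is a toy model, not the authors' implementation: function names like `issue_ict` and `verify_peer` are invented, Ed25519 stands in for whatever JWS algorithm is actually used, and the "ICT" here is plain signed JSON rather than a real JWT.

```python
# Toy sketch of the ICT flow: an authority (OP or AS) binds a subject's
# public key into a short-lived signed token; a peer later proves
# possession of the matching private key by signing a fresh nonce.
import json
import os
import time
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey)

def raw(pub: Ed25519PublicKey) -> bytes:
    return pub.public_bytes(serialization.Encoding.Raw,
                            serialization.PublicFormat.Raw)

def issue_ict(issuer_key, subject, subject_pub, ttl=300):
    """Authority signs a short-lived token containing the subject's key."""
    payload = json.dumps({"sub": subject, "pk": raw(subject_pub).hex(),
                          "exp": time.time() + ttl}).encode()
    return payload, issuer_key.sign(payload)

def verify_peer(issuer_pub, ict, ict_sig, nonce, pop_sig):
    """Check the ICT signature, its expiry, and the peer's proof of
    possession of the key bound into the ICT."""
    issuer_pub.verify(ict_sig, ict)              # raises if forged
    claims = json.loads(ict)
    assert claims["exp"] > time.time(), "ICT expired"
    peer_pub = Ed25519PublicKey.from_public_bytes(bytes.fromhex(claims["pk"]))
    peer_pub.verify(pop_sig, nonce)              # proof of possession
    return claims["sub"]

# --- demo: a client authenticates itself to a service ---
op_key = Ed25519PrivateKey.generate()       # OpenID Provider's signing key
client_key = Ed25519PrivateKey.generate()   # client's own key pair
ict, ict_sig = issue_ict(op_key, "alice", client_key.public_key())

nonce = os.urandom(16)                      # challenge from the service
pop = client_key.sign(nonce)
print(verify_peer(op_key.public_key(), ict, ict_sig, nonce, pop))  # alice
```

In the real protocol both sides run this check against each other's ICT, which is what yields the mutual authentication described above.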

This technology promises secure communication between a user’s client and a service, even if the underlying transport layers are considered insecure, e.g., because third parties can intercept or eavesdrop on traffic by providing services such as proxies, load balancers, or middleware. This makes the technology useful for high-security services such as online banking, critical infrastructure control systems, medical data transfers, or for advanced microservice architectures that require cryptographic traceability.

Attendees can expect new insights into the emerging applications of the technology. The talk should encourage collaboration and drive adoption of the approach in the WIMSE architecture.

[1] https://doi.org/10.48550/arXiv.2307.16607 (Preprint, accepted for IEEE Open Journal of Communications Society, 2024)

FedCM 101

Tim Cappalli (Okta), Sam Goto (Google)

Federated Credential Management, also known as FedCM, is a Web Platform API that provides a use-case-specific abstraction for federated identity flows on the web. FedCM exposes browser-mediated dialogs that allow users to choose accounts from identity providers to log in to websites. The API is being incubated in the W3C FedID CG and is actively moving towards the formation of a Working Group, where it can enter the standards track.

This session will be a “101” on how FedCM works, how it interacts with federation protocols like OpenID Connect, and how it helps with ecosystem changes such as third party cookie deprecation. There will also be time to discuss security and privacy considerations, deployment challenges and opportunities that arise from the introduction of a new layer under OAuth.
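To make the "new layer under OAuth" concrete, the sketch below models the identity-provider side of FedCM as plain Python functions: the config file the browser fetches, and the ID assertion endpoint it POSTs to after the user picks an account. Endpoint paths are invented, the token minting is a stub, and the field names follow the FedID CG draft, which may still evolve.

```python
# Sketch of an IdP's FedCM surface (no web framework, pure functions).
import json
import secrets

ACCOUNTS = {"1234": {"name": "Jane Doe", "email": "jane@idp.example"}}

def config_file() -> dict:
    """What the browser fetches from the IdP's FedCM config URL."""
    return {
        "accounts_endpoint": "/fedcm/accounts",
        "id_assertion_endpoint": "/fedcm/assertion",
        "client_metadata_endpoint": "/fedcm/client-metadata",
    }

def id_assertion(form: dict) -> dict:
    """Handle the browser's POST after the user picks an account.
    The browser sends the chosen account_id, the RP's client_id and
    nonce; the IdP answers with a token (e.g. an OIDC ID token)."""
    account = ACCOUNTS[form["account_id"]]
    # A real IdP would mint a signed JWT binding sub, aud and nonce.
    token = json.dumps({"sub": form["account_id"],
                        "aud": form["client_id"],
                        "nonce": form["nonce"],
                        "email": account["email"]})
    return {"token": token}

resp = id_assertion({"account_id": "1234",
                     "client_id": "https://rp.example",
                     "nonce": secrets.token_hex(8)})
print(resp["token"])
```

The browser mediates everything between the RP page and these endpoints, which is exactly where the coexistence and privacy questions for OAuth/OIDC arise.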

We would like to introduce the work, share what we are worried about as the ecosystem picks up FedCM, and gather early guidance from this community around coexistence, interoperability, security, privacy, and user experience.

Supporting OAuth 2.0 Based Security Profiles to Open-source Software - from Implementation to Operation

Takashi Norimatsu (Hitachi, Ltd.)

In this talk, the speaker describes in detail how support for OAuth 2.0-based security profiles was implemented in Keycloak, the open-source identity and access management software. These include the FAPI security profiles (FAPI 1.0 Baseline Final, FAPI 1.0 Advanced Final, FAPI-CIBA, FAPI 2.0 Security Profile Implementer’s Draft 2, FAPI 2.0 Message Signing Implementer’s Draft 1) and open banking security profiles based on FAPI (Open Finance (formerly Open Banking) Brazil Implementer’s Draft 3, Australia Consumer Data Right (CDR), and UK Open Banking).

To support this many OAuth 2.0 security profiles, he investigated the relationships among them and incorporated the results of that investigation into the implementation. He will present these findings and explain how they were incorporated into the implementation.

To achieve the following goals, he implemented a framework called “client policies” for supporting several types of OAuth 2.0-based security profiles. He will describe the framework and explain how to support OAuth 2.0-based security profiles using it:

- Quality: Be non-invasive and avoid affecting the main code path of the software, to prevent regressions.

- Implementation Cost: Support a new security profile at low cost.

- Customizability: Allow users of the software to support their own security profiles in a highly customizable way.
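The goals above suggest a composition of small, pluggable checks rather than changes to the main code path. The sketch below illustrates that condition/executor pattern in Python; the names here are invented for illustration and are not Keycloak's actual SPI.

```python
# Illustrative condition/executor pattern: a policy applies only when its
# conditions match the client, and then runs each profile requirement.
from dataclasses import dataclass, field
from typing import Callable

Condition = Callable[[dict], bool]   # does this policy apply to the request?
Executor = Callable[[dict], None]    # enforce one profile requirement

@dataclass
class ClientPolicy:
    conditions: list = field(default_factory=list)
    executors: list = field(default_factory=list)

    def enforce(self, request: dict) -> None:
        if all(cond(request) for cond in self.conditions):
            for ex in self.executors:
                ex(request)          # an executor raises on violation

def require_pkce_s256(req: dict) -> None:
    """One FAPI-style requirement, expressed as a standalone executor."""
    if req.get("code_challenge_method") != "S256":
        raise ValueError("PKCE with S256 is required")

# A profile applied only to clients tagged "fapi"; the main code path
# just calls policy.enforce(), keeping the feature non-invasive.
policy = ClientPolicy(
    conditions=[lambda req: "fapi" in req.get("client_roles", [])],
    executors=[require_pkce_s256],
)
policy.enforce({"client_roles": ["fapi"], "code_challenge_method": "S256"})
```

Adding a new profile then means composing existing executors with new conditions, which is where the low implementation cost and customizability come from.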

Using the framework, he is working on adding OAuth 2.1 support to the software and will explain how this is done. He will also share the insights gained through this implementation experience, both those that are implementation-dependent and those that are implementation-independent.

Next, he will describe the maintenance, auditing, and accountability of operating the security profiles supported by client policies: the administrator UI, logging, and verification that an applied security profile works as the administrator intends.

Finally, he will demonstrate applying OAuth 2.0-based security profiles (e.g., FAPI 1.0 Baseline, FAPI 1.0 Advanced) to a client application using Keycloak.

Decentralizing OAuth

Andras Vilmos (SafePay, Atos)

The presentation would be about a decentralized version of OAuth. The proposed model is similar to the transition from OpenID Connect to OID4VC.

Key to the solution is the introduction of a secure element, a chip card, on the user side. This chip card acts as both an Identity Provider and an Authorisation Server during the transaction. The process flow is divided into two distinct phases: Pre-transaction and Transaction.

In the Pre-transaction phase, users are authenticated on the server side, and all their access privileges are loaded into a secure application running on their chip card. Chip cards offer highly secure storage, so the protection of the privileges is assured; no one, not even the users themselves, can manipulate the stored privileges.

In the Transaction phase, when users need access to a protected resource, they turn to the secure element. They are authenticated by the secure element, and if approved, their resource request is verified against the stored privileges. If there is a matching stored privilege, access is granted in the form of a unique, timestamped, signed credential (a JWT), which is returned to the user. When this token is presented at a Resource Server (the point of access), its signature, uniqueness, expiration, etc. are checked, and if the token is valid the user receives access.
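The Transaction phase can be sketched as two functions: one for the secure element minting a token against stored privileges, one for the resource server checking it. For brevity an HMAC with a shared demo key stands in for the applet's real signature, the "JWT" is plain JSON, and all names are illustrative.

```python
# Sketch of the Transaction phase: mint a unique, timestamped token for
# a matching privilege; verify signature, expiry and uniqueness on use.
import hashlib
import hmac
import json
import time
import uuid

SECRET = b"demo-key"                # stands in for the applet's signing key
PRIVILEGES = {"alice": {"account:read"}}  # loaded in the Pre-transaction phase
seen_jtis = set()                   # resource-server replay cache

def mint_token(user: str, resource: str, ttl: int = 60) -> str:
    if resource not in PRIVILEGES.get(user, ()):  # check stored privileges
        raise PermissionError("no matching privilege")
    body = json.dumps({"sub": user, "res": resource,
                       "jti": uuid.uuid4().hex, "exp": time.time() + ttl})
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return body + "." + sig

def verify_token(token: str, resource: str) -> bool:
    body, sig = token.rsplit(".", 1)
    ok = hmac.compare_digest(
        sig, hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest())
    claims = json.loads(body)
    fresh = claims["jti"] not in seen_jtis and claims["exp"] > time.time()
    seen_jtis.add(claims["jti"])    # uniqueness: each token is single-use
    return ok and fresh and claims["res"] == resource

t = mint_token("alice", "account:read")
print(verify_token(t, "account:read"))  # True; a replayed token yields False
```

The single-use `jti` check is what makes the credential "unique" in the sense described above; expiry and the signature cover the timestamped and signed properties.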

The ideal architecture for this model would be a mobile client application with an associated secure application (applet) running on the SIM card or embedded secure element of the smartphone. As such, this technology could be a valuable component of a mobile wallet.

The presentation will highlight identified benefits of the new model. Furthermore, the differences between the properties and requirements of decentralized authentication and decentralized authorisation will be explained, and the resulting consequences in terms of the necessary technical solutions will be introduced.

The goal is to involve the expert audience in an active discussion about the potential benefits of the new model and to identify use cases which could best leverage the new technology. Another objective is to receive feedback on how the model and technology could be optimized, and to identify potential weaknesses and improvement opportunities.

The Weaknesses of OAuth in Native Apps

Aaron Parecki (Okta)

The mobile app landscape has been constantly evolving since the publication of “OAuth 2.0 for Mobile and Native Apps” in 2017. How much of the best practice advice from 2017 is still applicable today?

This session will highlight the challenges of securely handling authorization and authentication flows on mobile, as well as cover the new developments in the mobile app ecosystem that can be leveraged for higher security use cases.

While the current best practices advocate for using full HTTPS redirect URIs on mobile apps, there are certain challenges with actually implementing this using the platform-provided APIs.

We’ll dive into the details of why custom URI schemes can provide a trivial path for attackers to obtain access tokens on behalf of legitimate applications with almost no user interaction. We’ll look at how the newest iOS APIs attempt to solve this.

We’ll cover the platform-specific APIs that can be used to prevent app impersonation, and how to use these techniques with the latest “Client Attestation” drafts from the OAuth working group.

Finally, we’ll explore the new in-progress work being discussed in the OAuth working group that enables a mobile app to take full control over the authentication experience itself, as well as why you still might not want to do this!

Why we skimmed out of SCIM for CIAM user management

Yaron Zehavi (Raiffeisen Bank International)

Dealing with multi-sourced, multi-deployment (12 subsidiary banks) customer-facing applications, we needed to enable user enrollment, entitlement changes, and decommissioning.

We considered SCIM but opted instead for a custom event- and API-based integration.

We'd like to share how CIAM user management works in our shop, with clear diagrams and examples, explain what our criteria were, and why we voted SCIM down.

We hope to validate our analysis against the knowledge of the venerable OSW crowd, get feedback on whether we missed something, and perhaps provoke thought about enhancements to SCIM: incorporating patterns such as event streaming and expanding the schema to support CIAM identity use cases.