Certification: Requirements for alg support in RPs/OPs

Issue #495 resolved
Joseph Heenan created an issue

https://bitbucket.org/openid/fapi/pull-requests/327 has added a third, and relatively new, option of Ed25519 alongside PS256/ES256.

On the 20th April 2022 WG call it was felt that this didn’t put an obligation onto the AS to support more than one of the choices (which is the current position in the certification tests), but that perhaps the RP tests (at least for the non-ecosystem-specific profile) could require clients to demonstrate they support all 3.

Currently the certification tests don’t have any requirements around algs: we certify clients or authorisation servers if they support any one of the algorithms, and we don’t obviously show which alg they used to certify. We perhaps hadn’t worried too much about that, as most people/ecosystems are using PS256 and ES256 is well supported in libraries, but adding Ed25519 into the mix makes it more interesting, as that’s not so widely supported.
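
(For illustration only, and not something the tests currently do: assuming a JOSE stack that does support Ed25519, e.g. PyJWT 2.x with the cryptography backend, signing and verifying an EdDSA JWT looks roughly like the sketch below; the key and claims are hypothetical.)

    import jwt  # PyJWT >= 2.x installed with the "crypto" extra (cryptography backend)
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Hypothetical OP signing key; in a JWKS this would be an OKP key with crv "Ed25519".
    signing_key = Ed25519PrivateKey.generate()

    # The JOSE "alg" value is "EdDSA"; the curve (Ed25519) is determined by the key itself.
    token = jwt.encode({"iss": "https://op.example.com", "sub": "1234"},
                       signing_key, algorithm="EdDSA")

    # An RP verifying the token needs matching EdDSA support in its own JOSE library.
    claims = jwt.decode(token, signing_key.public_key(), algorithms=["EdDSA"])
    print(claims)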

So an alternative to requiring RPs to support all 3 would be to explicitly require certification for each supported alg and list which are supported. We might want to move in that direction anyway: if we were to start testing for vulnerabilities like https://neilmadden.blog/2022/04/19/psychic-signatures-in-java/ then we would have to start explicitly testing ES256 where it’s supported.
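
(As a rough illustration of what such an ES256-specific check could involve - not the actual test implementation: the "psychic signatures" bug was ECDSA verifiers accepting a signature with r = s = 0, so a test could present an all-zero signature and require that it is rejected. A minimal local sketch using the Python cryptography library, with a throwaway key:)

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.asymmetric.utils import encode_dss_signature

    # Throwaway P-256 key standing in for whatever key the verifier trusts.
    public_key = ec.generate_private_key(ec.SECP256R1()).public_key()

    # "Psychic signature": r = s = 0 is invalid and must be rejected for any message.
    forged = encode_dss_signature(0, 0)

    try:
        public_key.verify(forged, b"any message at all", ec.ECDSA(hashes.SHA256()))
        print("VULNERABLE: all-zero signature accepted")
    except InvalidSignature:
        print("OK: all-zero signature rejected")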

Comments (13)

  1. Filip Skokan

    I cannot get behind requiring support for all three (for either OPs or RPs).

    Sure, we can have the algorithm be one of the variants during test plan creation, and I propose we make PS256 support required in the non-ecosystem profiles (i.e. make it the default): to certify, you need to support PS256 (both OP and RP). This guarantees interoperability between certified software, analogous to RS256 in OIDC Core certification. Then we can add one extra column for each of the optional algorithms, whose content for a given row would be just ❌, or ✅ if you, in addition to PS256, run the respective algorithm’s plans.

    The benefit is we have “just” 4 profiles listed, rather than 12 if the algorithm was part of the profile. Like so:

    FAPI 2.0 Baseline *P w/ MTLS Client Auth, Certificate-Bound Access Tokens
    FAPI 2.0 Baseline *P w/ MTLS Client Auth, DPoP-Bound Access Tokens
    FAPI 2.0 Baseline *P w/ JWT Client Auth, Certificate-Bound Access Tokens
    FAPI 2.0 Baseline *P w/ JWT Client Auth, DPoP-Bound Access Tokens
    FAPI 2.0 Baseline *P EdDSA Support
    FAPI 2.0 Baseline *P ES256 Support
    

    It’s also worth noting that the “MTLS Client Auth, Certificate-Bound Access Tokens” profile in Baseline does not use any JWTs unless using OpenID (issuing ID Tokens) or JWT Access Tokens (which the suite does not care about). Which makes me think that OpenID Connect may as well be another “Support” column like the algorithms, only its content would be “supported”, “unsupported” or “required”, depending on whether the plans were run with openid, without it, or both. (There’s a little sketch of that “does this combination use any JWTs at all” reasoning at the end of this comment.)

    Ecosystem-specific profiles can (and probably will?) still choose any one algorithm to require, and likewise require or restrict the use of OpenID Connect.
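
    (Purely illustrative - hypothetical names, not the suite’s real variant keys - of the “does this variant combination sign/verify any JWTs at all” reasoning; JWT access tokens are ignored since the suite doesn’t care about them:)

        # Hypothetical helper: does a given variant combination exercise the JWS alg at all?
        def exercises_signing_alg(client_auth: str, sender_constrain: str, openid: bool) -> bool:
            return (
                client_auth == "private_key_jwt"  # client assertions are signed JWTs
                or sender_constrain == "dpop"     # DPoP proofs are signed JWTs
                or openid                         # ID tokens are signed JWTs
            )

        print(exercises_signing_alg("mtls", "mtls", openid=False))            # False - nothing to test
        print(exercises_signing_alg("private_key_jwt", "dpop", openid=True))  # True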

  2. Joseph Heenan reporter

    Thanks Filip - your thoughts are greatly appreciated, and it would definitely be good to have simplified profiles.

    For “FAPI 2.0 Baseline *P EdDSA Support”, would that always run with (and hence require certifying OPs to support) private_key_jwt + dpop? (Because as you say, it’d be meaningless to test that with MTLS client auth & MTLS sender constrain).

  3. Filip Skokan

    I think any variant preset that results in JWTs being issued and/or verified could be used, not requiring a fixed one (this would make certifying for the alg / openid less of a burden).

    Or (similar to the form_post profiles), every profile certified must be re-run using the algorithm / openid support.

  4. Joseph Heenan reporter

    “I think any variant preset that results in JWTs being issued and/or verified could be used, not requiring a fixed one (this would make certifying for the alg / openid less of a burden).”

    I’m not sure I’m keen on this option; there are various things that may be tricky to test in certain scenarios, and I think it’s preferable for it to be clear what was certified for.

    “Or (similar to the form_post profiles), every profile certified must be re-run using the algorithm / openid support.”

    I’m not entirely keen on that option - it involves more effort for everyone, and more checking when we process certifications. I think picking one setup (private_key_jwt, dpop, openid) lets you test all the situations in a single run. But it is sort of awkward that in that case vendors are required to support all 3 of those options to certify for EdDSA.

    But looking at the certification tables, about 50% of vendors support/certify for everything, and the other 50% certify for one or two particular things. It might be that all vendors that would want to certify for EdDSA are the ones that are in the ‘certify for everything’ camp.

    (I think this only really applies to vendors; banks etc will certify to a particular ecosystem profile which I think will define things tightly enough that the situation should be simpler.)

  5. Filip Skokan

    Are you suggesting that in order to certify for eddsa/es256/openid one would need to add support for a profile they may not even target? I think that is an even less favourable option.

  6. Filip Skokan

    I think one should be able to stay only within the bounds of what they certify for to get these “checkboxes”; otherwise they’re not really checkboxes and we might as well fall back on a gazillion profiles 😉

  7. Joseph Heenan reporter

    “Are you suggesting that in order to certify for eddsa/es256/openid one would need to add support for a profile they may not even target? I think that is an even less favourable option.”

    I’m not sure exactly what ‘profile’ means here, but the suggestion was that to certify for eddsa they would need to support dpop, private_key_jwt and openid.

    There’s definitely no clear winner among these approaches; to my mind they all have at least one fairly significant downside.

  8. Filip Skokan

    There’s always the option to not add any dimensions and stay as-is, that is, within the murky zone.

  9. Joseph Heenan reporter

    We had a good discussion about this on today’s call.

    Filip convinced me that my suggestion in the previous comment (insisting vendors certifying for ES256 support openid & private_key_jwt & dpop) was sufficiently bad that we shouldn’t go with it (even though it’d probably work for the vendors that certified for FAPI1, it’s not sensible to constrain ourselves to just those vendors).

    My current preferred solution (which I made an awful job of explaining on the call, and have refined slightly since) is something like:

    Base FAPI certification will be allowed using any permitted alg, and any combination of with/without openid, any client auth method, and any sender-constraining method.

    There will be separate test plans that aim to specifically test PS256, ES256 and EdDSA. These must be run with openid, private_key_jwt & dpop if the OP supports them (and we’ll try to check that from the server metadata and error if it looks wrong - there’s a rough sketch of that kind of metadata check at the end of this comment). These tests can’t be run with no-openid, mtls client auth and mtls sender constrain, as in that case they don’t test anything (and they’d error if the user selected that combination). These tests would gain certifications in new “ES256”, “PS256” and “EdDSA” columns.

    These tests could potentially then also cover known security vulnerabilities, e.g. using test vectors from https://github.com/google/wycheproof (but that’s a separate discussion).

    This also achieves the aim that people looking for a vendor solution that supports (and is certified for) a particular alg plus a particular set of other options can find one - and it appears not to add too much burden onto vendors or the certification team, and keeps the certification page to a manageable & hopefully understandable number of columns.

    The only annoying complication is that if people certify for (say) mtls sender constrain + all algs, and later add a DPoP certification, the certification team will have to make sure they rerun & resubmit any ES256/PS256/EdDSA tests.
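
    (A rough sketch of what that server-metadata check could look like - purely illustrative, not what the suite actually does; the issuer URL is hypothetical and ES256 stands in for whichever alg the plan targets:)

        import requests  # assumption: a plain HTTP fetch of the discovery document

        ISSUER = "https://as.example.com"  # hypothetical issuer under test
        meta = requests.get(f"{ISSUER}/.well-known/openid-configuration", timeout=10).json()

        # private_key_jwt client auth should be advertised if the alg-specific plan needs it
        if "private_key_jwt" not in meta.get("token_endpoint_auth_methods_supported", []):
            print("warning: AS does not advertise private_key_jwt client auth")

        # RFC 9449: presence of this metadata signals DPoP support (and lists the proof algs)
        if not meta.get("dpop_signing_alg_values_supported"):
            print("warning: AS does not advertise DPoP support")

        # The alg under test should appear where the AS signs or accepts JWTs
        for field in ("id_token_signing_alg_values_supported",
                      "token_endpoint_auth_signing_alg_values_supported"):
            if "ES256" not in meta.get(field, []):
                print(f"warning: ES256 missing from {field}")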

  10. Filip Skokan

    I believe Joseph and I talked about this recently and in the end concluded not to make the JWS algorithm part of the profile matrix at all, the same as it wasn’t part of the matrix in FAPI1.
