testability of non-guessability requirements
The CIBA spec contains this phrasing in two places:
contain sufficient entropy (at least 128 bits) or be otherwise protected such as to make brute force guessing computationally infeasible
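For concreteness, a token satisfying the first clause (at least 128 bits of entropy) is trivial to produce from a CSPRNG; the sketch below is illustrative only and not taken from any implementation:

```python
import secrets

# 16 random bytes == 128 bits of entropy, base64url-encoded without
# padding, giving an opaque bearer-style value of 22 URL-safe chars.
token = secrets.token_urlsafe(16)
print(len(token))  # 22
```

The "or be otherwise protected" clause is the part that has no equivalently simple check, which is what the rest of this thread is about.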
That requirement is not easily testable by a conformance suite. We could use wording more like FAPI uses, to quote the FAPI spec:
shall provide opaque non-guessable access tokens with a minimum of 128 bits of entropy where the probability of an attacker guessing the generated token is less than or equal to 2^(-160) as per RFC6749 section 10.10;
It may be that the actual limits here are more than necessary for CIBA core; however, with phrasing like this the conformance suite can at least test the amount of entropy, which I believe helps avoid insecure implementations.
The proposed text from FAPI Profile seems better to me but I would defer to Dave (or Brian) for final judgement.
I actually don't fully understand the FAPI text to be honest. So not too keen on taking it as is.
But to allow for the "conformance suite can at least do a test on the amount of entropy" I'd propose changing the normative language in those two places to a MUST (from a SHOULD and a should). So the client_notification_token would have:
@josephheenan does that get at what you were asking for? Or did I miss the point?
I think it's going in the right direction. From a testability point of view the "or" is still problematic; what I really want is for a test to be able to fail if there's not 128 bits of entropy, and the "or" prevents that.
What is the part after the "or" really trying to address? If it's to allow jws then I'd expect any jws signed with any suitable key to have more than 128 bits of entropy anyway.
Yes, the "or" is to allow for a JWS or AEAD JWE or something along those lines if the implementation wants to encode some state in the thing. I don't really think of those as having entropy per se but I guess in a way the AEAD tag or signature part does kinda.
I'm not sure how the conformance suite tests for entropy exactly - a length check? - but I think the text with the MUSTs would very reasonably allow for a test to be able to fail if there's not 128 bits of entropy.
The conformance suite measures the Shannon entropy ( http://bearcave.com/misl/misl_tech/wavelets/compression/shannon.html ). I'm fairly certain we've had tests run against JWS and they've passed no problem (as you say, any cryptographically strong signature will be high in entropy, and similarly for encrypted objects too).
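For anyone following along, a Shannon-entropy estimate of the kind described is just per-character entropy from observed character frequencies, scaled by token length. This is a minimal sketch of the idea, not the conformance suite's actual code:

```python
import math
from collections import Counter

def shannon_entropy_bits(token: str) -> float:
    """Estimate total Shannon entropy of a token in bits:
    per-character entropy from observed character frequencies,
    multiplied by the token length."""
    counts = Counter(token)
    n = len(token)
    per_char = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return per_char * n

print(shannon_entropy_bits("aaaaaaaa"))  # 0.0 -- no variation at all
```

A test would fail any token whose estimate comes in under 128 bits; repetitive or very short tokens fail, while CSPRNG output, signatures, and ciphertext comfortably pass.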
I think on that basis we could drop the 'or' part?
I maintain that the 'or' part is useful guidance to implementers, letting them know that the thing doesn't necessarily have to be a reference to some state but could be a cryptographically secured self-contained thing. And that the text with the 'or' part and a MUST absolutely still allows for a Shannon entropy test or similar for > 128 bits.
I did the should/SHOULD to MUST in cc887b3
I'm with Brian on this one. If there is any wilful misunderstanding by implementers we can always point them to this discussion!
In my heart of hearts I think this one can be marked as resolved. But I'm going to put the "CIBA Post-Implementer's Draft" milestone on it to get to a clean issues list to move ahead with the Implementer's Draft. But leave the issue open to allow Joseph to argue about it more going forward. If he really wants. Or he could mark it as resolved. Whatever works.
I shall keep a note of the above comment to put next to the test :-)
I've probably not fully explained that my worry is around the "or be otherwise protected such as to make brute force" part. "otherwise protected" generally allows for other mechanisms as well. The text from RFC6749 (which is where some of the awkward text in FAPI comes from) is:
I've previously had it explained to me that this text doesn't mean a token must have 128 bits of entropy and that a shorter token is allowed by this text if there's a different mechanism (rate limiting, blacklisting, etc) that reduces the overall probability of an attacker stumbling onto a working token. (The text also has an interesting effect that if you're issuing a huge number of tokens you need to have more than 128 bits of entropy if you have no other mechanisms in place.)
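To make that last effect concrete (this is my reading of the RFC 6749 10.10 bound, not normative text): with b bits of entropy per token and N valid tokens outstanding, a single random guess succeeds with probability roughly N / 2^b, so bounding the attacker's overall success probability demands more entropy per token as the number of issued tokens grows.

```python
# Rough illustration of the scaling, not a normative calculation.
def guess_probability(entropy_bits: int, tokens_outstanding: int) -> float:
    """Probability that one random guess hits any valid token,
    assuming tokens are drawn uniformly from a 2**entropy_bits space."""
    return tokens_outstanding / 2**entropy_bits

p_one = guess_probability(128, 1)         # 2**-128 with a single token
p_many = guess_probability(128, 2**32)    # rises to 2**-96 with 2^32 tokens
```

So a fixed per-guess bound like 2^(-160) effectively forces extra entropy once the population of live tokens gets large.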
(For the record, I'm fine with post implementer's draft; I think we're now clear on the intention of the text anyway)
Thanks @josephheenan, we can take another look at the specific language down the road and I do think we're clear enough on the intention of the existing text.
I'll have another crack at this in the interests of trying to get it closed off. My worry is to avoid getting into an argument with an implementer who doesn't have 128 bits of entropy, fails the test, and then argues that they're compliant with the standard because they're "protecting such as to make brute force guessing computationally infeasible" - which to me is quite a vague statement and can arguably be met by using a 64-bit token and having the AS severely rate limit guessing attacks, something that's not easily verifiable by testing.
How about this wording:
Pull request #61 has some text that is similar in spirit to the suggestion by @josephheenan