Hi Thomas
> Hi Eliot,
>
> thanks very much for the review. One quick comment on this point:
>
> On Sun, Jun 5, 2022 at 1:57 PM Eliot Lear via Datatracker
> <noreply@xxxxxxxx> wrote:
>
>> The most major problem with the document is this:
>>
>>    7.  Profiles
>>
>>    This EAT specification does not guarantee that implementations of
>>    it will interoperate.  The variability in this specification is
>>    necessary to accommodate the widely varying use cases.  An EAT
>>    profile narrows the specification for a specific use case.  An
>>    ideal EAT profile will guarantee interoperability.
>>
>> This is quite counter-cultural to the IETF.  You start with the
>> smallest set of functionality and then expand outward to cover
>> different use cases that make use of different extensions.  I'm not
>> saying that profiles would not be necessary, but that some additional
>> thought be given to extension mechanisms.
>
> Maybe what is not immediately clear is that EAT is not a complete
> protocol but a framework.
What is provided is a specification of a token. That's how I
reviewed it.
> The EAT framework provides:
>
> * A type system -- the base claims-set & a few aggregation types;
> * Security envelopes based on COSE, JOSE;
> * CBOR and JSON serialisations;
> * A number of pre-defined semantics (the defined "claims") that one
>   can readily reuse.
All good. In fact, that's so well stated that perhaps you should say it in the draft just so.
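To make those pieces concrete, here is a rough sketch of how they fit together (illustrative CDDL only -- the rule names and claim labels are my reconstruction, not text from the draft):

```cddl
; Illustrative sketch, not copied from the EAT draft.

; security envelope: a COSE_Sign1 (CBOR tag 18) whose payload is the
; CBOR-encoded claims-set
eat-token = #6.18([
  protected:   bstr,
  unprotected: {},
  payload:     bstr .cbor claims-set,
  signature:   bstr,
])

; type system: a map of claim labels to values, with an extension
; socket so profiles and other documents can add claims
claims-set = {
  ? 10  => bstr .size (8..64),   ; nonce (pre-defined semantics)
  ? 256 => bstr .size (7..33),   ; ueid  (pre-defined semantics)
  * $$claims-extension,
}
```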
> So, a mechanism to identify specific kinds of EAT-based PDUs needs to
> be there from the outset, otherwise one wouldn't know how to
> instantiate the framework for their use case.  And that's precisely
> the role of the profiles.
I'd suggest that Section 7 is still problematic as specified. Let's start with Section 7.2.1:
   The profile should indicate whether the token format should be CBOR,
   JSON, both or even some other encoding.  If some other encoding, a
   specification for how the CDDL described here is serialized in that
   encoding is necessary.  This should be addressed for the top-level
   token and for any nested tokens.  For example, a profile might
   require all nested tokens to be of the same encoding of the top
   level token.
Can you give an example of when this would not be entirely clear from context? Consider this: YANG serializes into XML and JSON, but we do not specify different YANG modules for the two serializations. You can express the serialization of the CDDL for different formats (as you do), but that's different from profiling.
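To illustrate the YANG analogy in CDDL terms (a sketch of my own, not taken from either draft): a single rule describes the instance regardless of which wire format carries it:

```cddl
; Sketch: one CDDL rule, two serialisations.
result = { "status": int, "detail": text }

; CBOR instance (diagnostic notation):  {"status": 0, "detail": "ok"}
; JSON instance:                        {"status": 0, "detail": "ok"}
; Nothing in the rule itself needs per-serialisation profiling.
```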
> For an example of a profiled EAT that builds on the EAT framework to
> create (demonstrably) interoperable attestation evidence, see the PSA
> token [1].
>
> [1] https://www.ietf.org/archive/id/draft-tschofenig-rats-psa-token-09.html
Yes, this feels like a classic profile.
>> This statement in particular is quite disturbing:
>>
>>    In some cases CDDL may be created that replaces CDDL in this or
>>    other document to express some profile requirements.
>>
>> Not only is this counter-cultural, but it would require an Updates:
>> header on any such profile, and would further just be plain
>> confusing.
>
> I don't think the "Updates" would be required: the CDDL defines a
> type constraint that is applicable to the specific profile; it
> doesn't modify the base type.
But that is precisely what the text I quoted states.
> See for example the way PSA restricts the nonce claim [2].
>
> [2] https://www.ietf.org/archive/id/draft-tschofenig-rats-psa-token-09.html#section-3.1.1
As an aside, I think I should congratulate you for actually generating compliant SVG graphics!
Coming more to the point, why is it that the working group could not settle on much of the content inside that profile document? This profile seems like an out for the working group not having resolved some differences. Are there those who want nonce values other than 32, 48, or 64 bytes? If so, what brings about the difference, and can it be resolved?
Also, some of the contents of the profile you refer to demonstrate the peril: a nonce can be presented in three different ways. Why? Why does it matter that you not use an array when conveying a single nonce? All that does is add additional branches. Worse, if parsing has to occur based on multiple profiles, as will happen, the amount of code needed to do this is likely to balloon.
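To spell out the branching problem (a sketch only; the exact rules in the PSA draft may differ): once two profiles disagree on the shape of the same claim, a consumer that accepts both has to carry every alternative:

```cddl
; Sketch, not copied from any draft.
nonce-value = bstr .size 32 / bstr .size 48 / bstr .size 64

; profile A: a single nonce or an array of nonces
profile-a-nonce = nonce-value / [ 2* nonce-value ]

; profile B: the single form only
profile-b-nonce = nonce-value

; a consumer of both profiles must branch on every alternative,
; and the code grows with each profile it supports
any-nonce = profile-a-nonce / profile-b-nonce
```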
Eliot
>> In short, the profile mechanism is harmful to the very concept of
>> interoperability.
>
> On the contrary, without profiles it would probably be impossible to
> interoperate.
>
> Cheers, thanks
--
last-call mailing list
last-call@xxxxxxxx
https://www.ietf.org/mailman/listinfo/last-call