JCS - IETF-104 Report
In total, 100 minutes of meeting time (including a 1-hour side meeting with 10+ participants) were devoted to JCS at IETF-104.
Here is a list of issues raised during these meetings.
I have taken the liberty of commenting on them here.
For those who are not familiar with JCS
(https://tools.ietf.org/html/draft-rundgren-json-canonicalization-scheme-06),
the core rationale is simply "keeping JSON as JSON even when signed".
1. The need for clear text messages is a weak argument
The recommended use of the current IETF JSON signature solution (JWS) is that you:
- Encode the JSON data to be signed in Base64Url
- Disrupt the natural structure of JSON messages by embedding signed message data in specific signature containers
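The first of these steps can be sketched as follows (a minimal illustration in Python; the message content is hypothetical):

```python
import base64
import json

# Hypothetical payment message to be signed
msg = {"amount": "125.00", "currency": "EUR"}

# JWS Base64Url-encodes the payload (padding stripped), so the signed
# data no longer reads as JSON on the wire
payload = base64.urlsafe_b64encode(
    json.dumps(msg, separators=(",", ":")).encode("utf-8")
).rstrip(b"=").decode("ascii")
print(payload)
```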
None of the Open Banking systems out there have to date chosen this route; they all build on variants using detached signatures and clear text JSON data. That none of them utilize JCS is quite logical, since JCS (correctly) is not perceived as a standard.
2. Canonicalization introduces security vulnerabilities
If a canonicalization scheme is incorrectly implemented (irrespective of at which end), the likely result is that signatures will not validate. Broken signatures, like any other input error including missing or incorrectly formatted data, should in a properly designed application lead to a rejected message/application failure. The core of a JCS implementation is typically only a couple of kilobytes of executable code, making it reasonably easy to verify for correctness.
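The intended failure mode can be sketched as follows. Note that json.dumps with sorted keys only approximates JCS property ordering (JCS sorts on UTF-16 code units) and is used here as a stand-in; the key and message are hypothetical:

```python
import hashlib
import hmac
import json

def canonical(obj):
    # Stand-in for full JCS: sorted keys and minimal separators suffice
    # for simple ASCII-keyed messages like the one below
    return json.dumps(obj, sort_keys=True, separators=(",", ":"),
                      ensure_ascii=False).encode("utf-8")

def verify(key, obj, tag):
    expected = hmac.new(key, canonical(obj), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, tag):
        # Broken signature: reject the message, like any other input error
        raise ValueError("signature mismatch")
    return obj

key = b"shared-secret"                     # hypothetical key
msg = {"b": 1, "a": "x"}                   # hypothetical message
tag = hmac.new(key, canonical(msg), hashlib.sha256).hexdigest()
```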
It has been mentioned that clear text data will tempt developers into trusting (= acting upon) received data without verifying signatures. JCS obviously does not come with a cure for naïve developers; see the JCS Security Considerations.
In fact, the absence of clear text signatures also creates security issues, as shown by the following example from IETF's TEEP (Trusted Execution Environment Provisioning) WG:
https://tools.ietf.org/html/draft-ietf-teep-opentrustprotocol-02
The top element "[Signed][Request|Response]" cannot be fully
trusted to match the content because it doesn't participate in the
signature generation. However, a recipient can always match it with
the value associated with the property "payload". It purely serves
to provide a quick reference for reading and method invocation.
By using JWS with JCS, the need for artificial holder objects and associated matching requirements disappears, while message content is provided in clear.
3. Number serialization is a huge problem
I clearly underestimated this part when I started with JCS back in 2015, but recently fast, open-source, and quite simple algorithms have been developed, making number serialization according to JCS/ES6 in scope for any platform. Extensive test data is publicly available.
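For illustration, the ES6 rule that JCS mandates is "shortest decimal form that round-trips to the same double", which Python's repr() also produces for the digits (exponent-notation thresholds differ between ES6 and Python, so this is a sketch rather than a JCS serializer):

```python
# ES6/JCS requires the shortest decimal representation that parses back
# to the exact same IEEE-754 double
value = 333333333.33333329        # not exactly representable as a double
text = repr(value)                # Python also emits shortest round-trip digits
assert float(text) == value       # lossless round-trip
```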
4. You should have stayed with the ES6 predictive parsing/serialization scheme
That would have been cool, but the sentiment among other JSON tool vendors was that "ECMA got it all wrong", so I was forced to select another, more conventional route. Fortunately, the revised scheme turned out to be very simple to get running on other platforms including Go, Python, C# and Java/Android, while leaving parsers and serializers unchanged. The original concept would, on the other hand, require a total rewrite of the entire JSON landscape. Sometimes "pushback" is just good 😀
5. You need a data model
JCS builds on the same bare-bones data model for primitives as JSON (null, true, false, Number, String), albeit with a couple of constraints:
- JSON Numbers MUST conceptually be treated as IEEE-754 double precision data during parsing/serialization (which also is a generic requirement for being JavaScript compatible)
- JSON Strings MUST (modulo escaping) be treated as immutable during parsing/serialization
This is all that is needed with respect to data models for creating reliable and interoperable "hashable" JSON.
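The first constraint can be illustrated in Python, whose json module otherwise keeps integers exact (parse_int=float here is an illustrative shortcut, not part of any JCS API):

```python
import json

# Treat every JSON Number as an IEEE-754 double during parsing,
# mirroring the JCS/JavaScript number model
doc = json.loads('{"count": 10, "ratio": 0.5}', parse_int=float)
assert isinstance(doc["count"], float) and doc["count"] == 10.0
```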
Existing JSON-based systems use external mappings to emulate missing data types like int32, DateTime, Money, Binary, and similar. That not all JSON applications use the same conventions does not seem to have hampered the popularity and ubiquity of JSON.
Standardizing external mappings is another [possible] IETF activity, not related to JCS.
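As a sketch of such an external mapping (the field names and encodings are hypothetical, not a standard), missing types can be emulated with JSON Strings:

```python
import base64
import datetime
import json

# Hypothetical external mapping: DateTime as ISO 8601, Binary as Base64Url
record = {
    "created": datetime.datetime(2019, 5, 12,
                                 tzinfo=datetime.timezone.utc).isoformat(),
    "digest": base64.urlsafe_b64encode(b"\x01\x02\x03").decode("ascii"),
}
wire = json.dumps(record)
```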
6. I-JSON (which JCS builds on) only says SHOULD for IEEE-754 while JCS says MUST
That is correct, but if you for example send 64-bit integers expressed as JSON Numbers to JavaScript-based systems, applications will typically break every now and then since the inherent precision is only 53 bits. JCS was designed to also be fully JavaScript compatible.
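The 53-bit limit is easy to demonstrate:

```python
# IEEE-754 doubles have a 53-bit significand, so 64-bit integers sent as
# JSON Numbers can silently lose precision in JavaScript-style processing
exact = 2**53              # 9007199254740992: still exactly representable
beyond = 2**53 + 1         # 9007199254740993: not representable
assert float(exact) == float(beyond)   # the two values collapse into one double
```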
7. XML canonicalization was a disaster
JCS is not a full-blown canonicalization scheme like XML's C14N; it is a (fairly rudimentary) serialization method. A proper and fair evaluation should be based on the actual draft rather than bad experiences from the XML era, which BTW also were due to other factors such as Namespaces, Default values, SOAP, and an elaborate WS-* stack which indeed took years to get fully interoperable between vendors.
Version 1.06, Anders Rundgren 2019-05-12