To start with, a “kid” is a key id within a JSON Web Key Set (JWKS). Within the OpenID Connect protocol (which is kind of like an OAuth2 extension), Authentication Services can ensure the integrity of their JWT tokens by signing them. They sign the tokens with a private key, and the signature can then be verified with the matching public certificate; this is somewhat similar to SSL certificates for websites over https. With https, the public certificate is handed to the browser as soon as you navigate to the website. But for JWT tokens, your application has to go “look up” the certificate. And that’s where OpenID Connect comes in.
Standardizing JWT tokens or certificate signing wasn’t OpenID Connect’s primary goal; it was a secondary feature. But that secondary feature was done well, and OAuth2 enthusiasts adopted that part of the protocol while leaving the rest of it alone. OpenID Connect specifies that there should be a “well known” endpoint where any system can look up common configuration information about an Authentication Service, for example:
https://accounts.google.com/.well-known/openid-configuration
One of the standard values is `jwks_uri`, which is the link to the JSON Web Key Set. In this case:
https://www.googleapis.com/oauth2/v3/certs
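For illustration, here’s roughly what that discovery step amounts to. (Python and the requests library are assumptions made for this sketch, not anything the post’s own application uses.)

```python
# Sketch: follow the discovery document to the JWKS and list the key ids.
import requests

DISCOVERY_URL = "https://accounts.google.com/.well-known/openid-configuration"

config = requests.get(DISCOVERY_URL).json()
print(config["jwks_uri"])            # -> https://www.googleapis.com/oauth2/v3/certs

jwks = requests.get(config["jwks_uri"]).json()
for key in jwks["keys"]:
    print(key["kid"], key["alg"])    # each entry pairs a key id with its algorithm
```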
In that key set, the public key material for each entry is in the `n` (modulus) and `e` (exponent) values, and the `kid` is the id used to find which entry verifies a given token. So, that’s what kids are; they’re the ids of the signing keys and their algorithms.
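To make the lookup concrete, here’s a minimal sketch using the PyJWT library (again an assumed choice): the kid is read from the token’s header and matched against the downloaded key set.

```python
# Sketch: read the kid from a token's header and pick the matching key from the JWKS.
import json
import jwt        # PyJWT, with the "cryptography" extra installed for RSA support
import requests

jwks_uri = requests.get(
    "https://accounts.google.com/.well-known/openid-configuration").json()["jwks_uri"]
keys = requests.get(jwks_uri).json()["keys"]

def key_for(token: str):
    kid = jwt.get_unverified_header(token)["kid"]        # which key signed this token?
    jwk = next(k for k in keys if k["kid"] == kid)       # find that entry in the key set
    return jwt.algorithms.RSAAlgorithm.from_jwk(json.dumps(jwk))  # usable public key
```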
So, where does the performance gain come in?
The performance gain for these publicly available certificates is that they can be cached on your application servers. If your application is going to use Google OAuth for authentication, and use JWT tokens to pass the user information around, then you can verify the token signatures using cached certificates. This puts all the authentication overhead on your application server and not in a synchronous callback to an Authentication Service.
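Here’s a minimal sketch of that caching idea, again in Python/PyJWT with a placeholder audience value: the key set is downloaded once on the first request, and every later token is verified locally against the in-memory copy.

```python
# Sketch: verify Google-signed JWTs against a cached copy of the JWKS, so only the
# very first request pays for the round trips to Google.
import json
import jwt        # PyJWT, with the "cryptography" extra installed for RSA support
import requests

_cached_keys = None

def _signing_keys():
    global _cached_keys
    if _cached_keys is None:     # only the first call hits the network
        config = requests.get(
            "https://accounts.google.com/.well-known/openid-configuration").json()
        _cached_keys = requests.get(config["jwks_uri"]).json()["keys"]
    return _cached_keys

def verify(token: str) -> dict:
    kid = jwt.get_unverified_header(token)["kid"]
    jwk = next(k for k in _signing_keys() if k["kid"] == kid)
    public_key = jwt.algorithms.RSAAlgorithm.from_jwk(json.dumps(jwk))
    return jwt.decode(
        token, public_key, algorithms=["RS256"],
        audience="your-google-client-id",        # placeholder for your app's client id
        issuer="https://accounts.google.com")
```

In a real service you’d also want to re-fetch the key set when a token arrives with a kid you don’t have cached, since these signing keys rotate.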
But, there is a small performance penalty in the first call to retrieve the JWKS.
What does the first-call performance penalty look like?
Not much, about 500 ms. But, here’s what it looks like with an actual example.
First call that includes the JWT Token:
- It reaches out to https://accounts.google.com/.well-known/openid-configuration which has configuration information
- The configuration information indicates where to get the “kids”: https://www.googleapis.com/oauth2/v3/certs
- It downloads the JWKS and caches them
- Then it performs validation against the JWT token (my token was expired in all of the screenshots, which is why there are “bugs” indicated)
- Processing Time: 582 ms
- Processing time overhead for JWT: about 500 ms (in the list on the left side, the request just before it was the same request with no JWT token; it took about 99 ms)
(info was larger than one screenshot could capture)
Second call with JWT:
- The caching worked as expected and the calls to Google didn’t occur.
- Processing Time: 102 ms
- So, the 500 ms overhead of the Google calls doesn’t happen when the caching is working.
(info was larger than one screenshot could capture)
Test Setup:
- The first call included the application load time. It also included an Entity Framework first-load penalty when it called the database to verify whether I had permission to view the requested record.
- Processing Time: 4851 ms
- This call did not include the JWT.
- The second call was to baseline the call without a JWT.
- Processing Time: 96 ms
- The third call was to verify the baseline without a JWT.
- Processing Time: 99 ms
Wrap Up …
So, it isn’t much of a performance gain. But, it’s enough to make caching the certificates and keeping all the authentication logic on your application server worthwhile.
(Ohh … and Stackify Prefix is pretty awesome! If you haven’t tried it, you should: https://stackify.com/prefix-download/)