performance issue: createVerifier called for every request and cache not used #358

Closed · SeanReece opened this issue Jan 2, 2025 · 7 comments · Fixed by #360

Comments

SeanReece (Contributor) commented Jan 2, 2025

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

5.1.0

Plugin version

9.0.2

Node.js version

22.12.0

Operating system

macOS

Operating system version (e.g. 20.04, 11.3, 10)

15.1.1

Description

createVerifier seems to be called to create a new fast-jwt verifier instance for every incoming request. Not only is this expensive in itself, it also means the fast-jwt cache is never used, even when it is enabled in the options.

Testing on my local machine (MacBook M1 Pro) with 10,000 requests, verify/createVerifier takes ~2.6s, which is 5-20% of total response time depending on the endpoint.

Note: The endpoint I'm using here to test is quite complex, so simpler endpoints would see a bigger impact.

When I monkey-patch @fastify/jwt to reuse the verifier per key (we only use a single publicKey from a JWKS endpoint), verify/createVerifier only takes ~400ms.

Here's an example of our @fastify/jwt options

server.register(fastifyJwt, {
  decode: { complete: true },
  verify: {
    cache: true, // Performance is the same / better with this set to false as cache is bypassed
    algorithms: ['RS256'],
    allowedAud: verifyAud,
    allowedIss: verifyIss,
    requiredClaims,
  },
  secret: async (request, token) => {
    const {
      header: { kid },
    } = token
    const key = await jwksClient.getSigningKey(kid)
    return key.getPublicKey()
  },
  formatUser,
})
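
For context, here is a minimal standalone sketch of the per-key reuse described above, assuming fast-jwt's createVerifier API; the getVerifier helper and the Map are illustrative, not the plugin's actual internals:

const { createVerifier } = require('fast-jwt')

// Keep one verifier instance per public key and reuse it across requests.
const verifiers = new Map()

function getVerifier (publicKey) {
  let verify = verifiers.get(publicKey)
  if (!verify) {
    verify = createVerifier({
      key: publicKey,
      algorithms: ['RS256'],
      cache: true, // fast-jwt's token cache only helps when the verifier instance itself is reused
    })
    verifiers.set(publicKey, verify)
  }
  return verify
}

// Per request: resolve the signing key via JWKS, then verify with the shared instance.
// const key = await jwksClient.getSigningKey(token.header.kid)
// const payload = getVerifier(key.getPublicKey())(rawToken)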

Link to code that reproduces the bug

No response

Expected Behavior

I expect createVerifier to only be called once per publicKey.

kibertoad (Member) commented:
Thank you for your report! Can you send a fix PR with a test?

SeanReece (Contributor, Author) commented:
@kibertoad Absolutely, just wanted to confirm I wasn't doing something incorrectly before I fired off a PR.

I wanted your opinion on the best way to cache the verifier instances. We could do something like a simple Map, but I think that could lead to memory leaks in cases where keys are rotated frequently.

fast-jwt uses mnemonist for its LRU cache, so we could use that to be consistent, but IMO pulling in mnemonist just for its LRU cache seems like overkill. Or we could go with the tried-and-true lru-cache package. WDYT?
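
For reference, a rough sketch of what the mnemonist route could look like, using mnemonist's LRUCache directly (the names below are illustrative, not a proposed API for the plugin):

const LRUCache = require('mnemonist/lru-cache')
const { createVerifier } = require('fast-jwt')

// Bounded cache of verifier instances, keyed by public key.
const verifierCache = new LRUCache(500)

function getVerifier (publicKey, verifierOptions) {
  let verify = verifierCache.get(publicKey)
  if (!verify) {
    verify = createVerifier({ key: publicKey, ...verifierOptions })
    verifierCache.set(publicKey, verify) // least-recently-used entry is dropped once capacity is hit
  }
  return verify
}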

kibertoad (Member) commented Jan 3, 2025

@SeanReece Bringing in two different cache libraries doesn't feel nice, so staying with the same solution as fast-jwt seems more reasonable.
That said, do we need multiple fast-jwt instances or the TTL? Can't we just store a singleton instance somewhere indefinitely?
Right, you want to support the case of multiple keys. Do these ever change dynamically? If not, a plain in-memory Map would be better than a full-blown cache.

SeanReece (Contributor, Author) commented:
@kibertoad I think you're probably right about public keys not changing dynamically (at least not frequently). Maybe I'm just overthinking the solution here :) I think a simple Map will work (maybe with a max size to avoid any possible memory issues).

kibertoad (Member) commented:
Do you expect anyone to have thousands of keys?

SeanReece (Contributor, Author) commented:
Not at any one time, but imagine some organization has a policy of rotating their signing keys every hour; that would mean we'd accumulate 1,000 keys/verifiers every ~41 days, which is technically a slow memory leak. This is likely a very rare edge case, but I thought I'd handle it anyway. It would be pretty simple, I think. Something like this:

const maxCacheSize = 500
const cache = new Map()
...
let verifier = cache.get(publicKey)
if (!verifier) {
  verifier = createVerifier(verifierOptions) // verifierOptions would include the resolved public key
  cache.set(publicKey, verifier)
  if (cache.size > maxCacheSize) {
    cache.delete(cache.keys().next().value) // Maps iterate in insertion order, so this evicts the oldest entry
  }
}
verifyResult = verifier(token)
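
Worth noting: since a plain Map evicts in insertion order, the sketch above is FIFO rather than true LRU. If LRU behaviour were ever wanted without an extra dependency, a hypothetical tweak is to re-insert on each hit so hot keys move to the back of the Map:

let verifier = cache.get(publicKey)
if (verifier) {
  // Re-inserting moves the key to the end of the Map's iteration order,
  // so cache.keys().next().value always points at the least recently used entry.
  cache.delete(publicKey)
  cache.set(publicKey, verifier)
}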

kibertoad (Member) commented:
Sure, that does make sense.
