Speeding Up Your Website Using Fastify and Redis Cache

WBOY
Release: 2024-08-26 21:46:32


Less than 24 hours ago, I wrote a post about how to speed up your website using Cloudflare cache. However, I've since moved most of the logic to a Fastify middleware using Redis. Here is why and how you can do it yourself.

Cloudflare Cache Issues

I ran into two issues with Cloudflare cache:

  • Page navigation broke after I enabled caching of responses. I raised an issue about this in the Remix forum a while back, but as of this writing it remains unresolved. It is not clear why caching the response breaks page navigation, but it only happens when the response is cached by Cloudflare.
  • I could not get Cloudflare to perform "Serve Stale Content While Revalidating" as described in the original post. It does not appear to be an available feature.

There were a few other issues that I ran into (like not being able to purge the cache using pattern matching), but those were not critical to my use case.

Therefore, I decided to move the logic to a Fastify middleware using Redis.

[!NOTE]
I left Cloudflare cache for image caching. In this case, Cloudflare cache effectively functions as a CDN.
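One payoff of moving the cache into Redis is that the pattern-matching purge Cloudflare lacked becomes a few lines. Here is a sketch, assuming an ioredis-style client that exposes `scan` and `del`; `purgeByPrefix` and the `RedisLike` interface are my own names, not from the original post:

```typescript
// Sketch: delete every key under a prefix using SCAN, which iterates the
// keyspace incrementally instead of blocking the server the way KEYS would.
// RedisLike is a minimal interface matching an ioredis-style client.
interface RedisLike {
  scan(cursor: string, ...args: (string | number)[]): Promise<[string, string[]]>;
  del(...keys: string[]): Promise<number>;
}

const purgeByPrefix = async (redis: RedisLike, prefix: string): Promise<number> => {
  let cursor = '0';
  let deleted = 0;

  do {
    const [nextCursor, keys] = await redis.scan(cursor, 'MATCH', `${prefix}*`, 'COUNT', 100);
    cursor = nextCursor;

    if (keys.length > 0) {
      deleted += await redis.del(...keys);
    }
  } while (cursor !== '0');

  return deleted;
};

// Usage against the middleware's namespace: purgeByPrefix(redis, 'request:')
```

Because every cache entry shares the `request:` prefix (see the key generation below), a single call purges the whole page cache without touching unrelated keys.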

Fastify Middleware

What follows is an annotated version of the middleware that I wrote to cache responses using Fastify.

```typescript
const isCacheableRequest = (request: FastifyRequest): boolean => {
  // Do not attempt to use cache for authenticated visitors.
  if (request.visitor?.userAccount) {
    return false;
  }

  if (request.method !== 'GET') {
    return false;
  }

  // We only want to cache responses under /supplements/.
  if (!request.url.includes('/supplements/')) {
    return false;
  }

  // We provide a mechanism to bypass the cache.
  // This is necessary for implementing the "Serve Stale Content While Revalidating" feature.
  if (request.headers['cache-control'] === 'no-cache') {
    return false;
  }

  return true;
};

const isCacheableResponse = (reply: FastifyReply): boolean => {
  if (reply.statusCode !== 200) {
    return false;
  }

  // We don't want to cache responses that are served from the cache.
  if (reply.getHeader('x-pillser-cache') === 'HIT') {
    return false;
  }

  // We only want to cache responses that are HTML.
  if (!reply.getHeader('content-type')?.toString().includes('text/html')) {
    return false;
  }

  return true;
};

const generateRequestCacheKey = (request: FastifyRequest): string => {
  // We need to namespace the cache key to allow an easy purging of all the cache entries.
  return (
    'request:' +
    generateHash({
      algorithm: 'sha256',
      buffer: stringifyJson({
        method: request.method,
        url: request.url,
        // This is used to cache viewport specific responses.
        viewportWidth: request.viewportWidth,
      }),
      encoding: 'hex',
    })
  );
};

type CachedResponse = {
  body: string;
  headers: Record<string, string>;
  statusCode: number;
};

const refreshRequestCache = async (request: FastifyRequest) => {
  await got({
    headers: {
      'cache-control': 'no-cache',
      'sec-ch-viewport-width': String(request.viewportWidth),
      'user-agent': request.headers['user-agent'],
    },
    method: 'GET',
    url: pathToAbsoluteUrl(request.originalUrl),
  });
};

app.addHook('onRequest', async (request, reply) => {
  if (!isCacheableRequest(request)) {
    return;
  }

  const cachedResponse = await redis.get(generateRequestCacheKey(request));

  if (!cachedResponse) {
    return;
  }

  reply.header('x-pillser-cache', 'HIT');

  const response: CachedResponse = parseJson(cachedResponse);

  reply.status(response.statusCode);
  reply.headers(response.headers);
  reply.send(response.body);
  reply.hijack();

  setImmediate(() => {
    // After the response is sent, we send a request to refresh the cache in the background.
    // This effectively serves stale content while revalidating.
    // Therefore, this cache does not reduce the number of requests to the origin;
    // the goal is to reduce the response time for the user.
    refreshRequestCache(request);
  });
});

const readableToString = (readable: Readable): Promise<string> => {
  const chunks: Uint8Array[] = [];

  return new Promise((resolve, reject) => {
    readable.on('data', (chunk) => chunks.push(Buffer.from(chunk)));
    readable.on('error', (err) => reject(err));
    readable.on('end', () => resolve(Buffer.concat(chunks).toString('utf8')));
  });
};

app.addHook('onSend', async (request, reply, payload) => {
  if (reply.hasHeader('x-pillser-cache')) {
    return payload;
  }

  if (!isCacheableRequest(request) || !isCacheableResponse(reply) || !(payload instanceof Readable)) {
    // Indicate that the response is not cacheable.
    reply.header('x-pillser-cache', 'DYNAMIC');

    return payload;
  }

  const content = await readableToString(payload);

  const headers = omit(reply.getHeaders(), [
    'content-length',
    'set-cookie',
    'x-pillser-cache',
  ]) as Record<string, string>;

  reply.header('x-pillser-cache', 'MISS');

  await redis.setex(
    generateRequestCacheKey(request),
    getDuration('1 day', 'seconds'),
    stringifyJson({
      body: content,
      headers,
      statusCode: reply.statusCode,
    } satisfies CachedResponse),
  );

  return content;
});
```
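The middleware reads `request.viewportWidth`, a custom request decoration the post does not show. One way it could be derived is from the `Sec-CH-Viewport-Width` client hint; the helper name and the fallback width below are my own assumptions:

```typescript
// Hypothetical helper: parse the Sec-CH-Viewport-Width client hint,
// falling back to a common desktop width when the hint is absent or invalid.
// (The original post does not show how request.viewportWidth is populated.)
const DEFAULT_VIEWPORT_WIDTH = 1920;

const parseViewportWidth = (header: string | string[] | undefined): number => {
  const value = Array.isArray(header) ? header[0] : header;
  const width = Number(value);

  return Number.isInteger(width) && width > 0 ? width : DEFAULT_VIEWPORT_WIDTH;
};

// In the app, this would run before the cache hooks, e.g.:
//
// app.decorateRequest('viewportWidth', 0);
// app.addHook('onRequest', async (request) => {
//   request.viewportWidth = parseViewportWidth(request.headers['sec-ch-viewport-width']);
// });

console.log(parseViewportWidth('390')); // → 390
console.log(parseViewportWidth(undefined)); // → 1920
```

Note that the fallback matters for cache hit rates: every request without the hint collapses onto a single cache key rather than fragmenting the cache.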

The comments walk through the code, but here are some key points:

  • Caching Criteria:
    • Requests:
      • Do not cache responses for authenticated users.
      • Only cache GET requests.
      • Only cache responses for URLs that include "/supplements/".
      • Bypass the cache if the request includes a cache-control: no-cache header.
    • Responses:
      • Only cache successful responses (statusCode 200).
      • Do not cache responses already served from the cache (x-pillser-cache: HIT).
      • Only cache responses with a text/html content-type.
  • Cache Key Generation:
    • Use a SHA-256 hash of a JSON representation of the request method, URL, and viewport width.
    • Prefix the cache key with "request:" for easy namespacing and purging.
  • Request Handling:
    • Hook into the onRequest lifecycle to check whether a request has a cached response.
    • Serve the cached response if available, marking it with x-pillser-cache: HIT.
    • Start a background task to refresh the cache after sending a cached response, implementing "Serve Stale Content While Revalidating".
  • Response Handling:
    • Hook into the onSend lifecycle to process and cache responses.
    • Convert readable streams to a string for simpler caching.
    • Exclude specific headers (content-length, set-cookie, x-pillser-cache) from the cache.
    • Mark non-cacheable responses with x-pillser-cache: DYNAMIC.
    • Cache responses with a TTL (time to live) of one day, marking new entries with x-pillser-cache: MISS.
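To illustrate the cache-key scheme on its own, here is a sketch using Node's built-in `crypto` module and `JSON.stringify` in place of the post's `generateHash` and `stringifyJson` helpers, which are not shown:

```typescript
import { createHash } from 'node:crypto';

// Sketch of the cache-key scheme: a "request:" namespace prefix plus a
// SHA-256 hex digest of the request identity (method, URL, viewport width).
type RequestIdentity = {
  method: string;
  url: string;
  viewportWidth: number;
};

const generateRequestCacheKey = (identity: RequestIdentity): string => {
  return 'request:' + createHash('sha256').update(JSON.stringify(identity)).digest('hex');
};

const key = generateRequestCacheKey({
  method: 'GET',
  url: '/supplements/vitamin-b1-3254',
  viewportWidth: 1280,
});

console.log(key.slice(0, 8)); // → "request:"
console.log(key.length); // → 72 ("request:" plus 64 hex characters)
```

The key is deterministic for identical inputs, so repeated requests for the same page at the same viewport width map to the same Redis entry, while any change in viewport width produces a distinct key.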

Results

I ran latency tests from several locations and captured the slowest response time for each URL. The results are below:

| URL | Region | Origin Response Time | Cloudflare Cached Response Time | Fastify Cached Response Time |
|---|---|---|---|---|
| https://pillser.com/vitamins/vitamin-b1 | us-west1 | 240ms | 16ms | 40ms |
| https://pillser.com/vitamins/vitamin-b1 | europe-west3 | 320ms | 10ms | 110ms |
| https://pillser.com/vitamins/vitamin-b1 | australia-southeast1 | 362ms | 16ms | 192ms |
| https://pillser.com/supplements/vitamin-b1-3254 | us-west1 | 280ms | 10ms | 38ms |
| https://pillser.com/supplements/vitamin-b1-3254 | europe-west3 | 340ms | 12ms | 141ms |
| https://pillser.com/supplements/vitamin-b1-3254 | australia-southeast1 | 362ms | 14ms | 183ms |

Compared to Cloudflare cache, the Fastify cache is slower because the cached content is still served from the origin, whereas Cloudflare serves from regional edge locations. However, I found these response times more than sufficient for a good user experience.


source:dev.to