Caching in next.js: Gift or Curse?

In version 13, the next.js team introduced a new approach to application design – the App Router. In version 14 it became stable and the primary router for new applications.

App Router significantly expands the functionality of next.js – partial pre-rendering, templates, parallel and intercepted routes, server components and much more. Yet despite all these improvements, not everyone has decided to switch to it. And there are reasons for that.

I briefly described the advantages and problems of the new router in the article “Next.js App Router. Experience of use. The path to the future or a wrong turn.” Here we will not talk about the new abstractions or their features. The key and most controversial change is caching, and this article explains what the most popular front-end framework, Next.js, caches, how, and why.

What does next.js cache?

The next.js website has excellent documentation of the caching process. First, a quick summary of its main points.

Any request made via fetch in next.js will be memoized and cached. The same happens with pages and with the cache function. How this works under the hood is discussed in the following sections. The overall page build process looks like this:

Caching process in next.js.  Source: next.js documentation

That is: the user visits a page, a request for the route is sent to the server, the server starts rendering the route and fires the necessary data requests; everything is then executed and cached.

In addition to the cache, the scheme also includes memoization. It exists for repeated requests – instead of being sent several times, duplicates subscribe to the result of the first one.

Memoization in next.js.  Source: next.js documentation
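A minimal sketch of that behavior (the helper, file names and endpoint are illustrative): calling the same fetch from several places while rendering one route results in a single network request.

// post.js – a hypothetical data helper
export async function getPost(id) {
  // Identical fetch calls (same URL and options) made while rendering
  // one route are memoized: only the first call hits the network
  const res = await fetch(`https://api.example.com/posts/${id}`)
  return res.json()
}

// page.js
import { getPost } from './post'

export async function generateMetadata({ params }) {
  const post = await getPost(params.id) // actual network request
  return { title: post.title }
}

export default async function Page({ params }) {
  const post = await getPost(params.id) // memoized – reuses the first result
  return <h1>{post.title}</h1>
}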

Caching on the server is handled by the so-called Data Cache. You can evict data from it by calling the functions revalidatePath and revalidateTag. The first updates the cache for a page, the second for a tag specified when the request was made.

Cache revalidation in next.js.  Source: next.js documentation
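A minimal sketch of how a tag is attached to a request (the endpoint is illustrative); evicting it with revalidateTag is shown in the server action example a bit further down.

// app/posts/page.js – a server component tagging its request
export default async function Posts() {
  // The response goes into the Data Cache under the 'posts' tag
  // and is revalidated at most once an hour
  const res = await fetch('https://api.example.com/posts', {
    next: { tags: ['posts'], revalidate: 3600 },
  })
  const posts = await res.json()

  return (
    <ul>
      {posts.map((post) => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  )
}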

Data is also cached on the client side – inside the client router.

Client caching in next.js.  Source: next.js documentation

One thing the documentation does not describe: next.js also caches rewrites and redirects. That is, if the server once redirected the user from the page / to /login, it will keep redirecting there – the result stays in the client router until the client cache is cleared.
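For example, a guard like the sketch below redirects / to /login on the server; per the behavior described above, the client router remembers that result (getSession and its module path are hypothetical placeholders).

// app/page.js – a minimal sketch; getSession is a hypothetical helper
import { redirect } from 'next/navigation'
import { getSession } from './session'

export default async function Home() {
  const session = await getSession()

  if (!session) {
    // The server responds with a redirect to /login; the client router
    // caches this result and keeps redirecting until its cache is cleared
    redirect('/login')
  }

  return <h1>Home</h1>
}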

You can clear the cache on the client using router.refresh or by calling revalidatePath and revalidateTag in server actions:

'use server'

import { revalidateTag } from 'next/cache'

export default async function submit() {
  await addPost() // the application's own data mutation
  revalidateTag('posts') // evict everything cached under the 'posts' tag
}
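And a minimal sketch of the client-side option: a client component calling router.refresh to drop the client router cache for the current route.

// refresh-button.js – a client component
'use client'

import { useRouter } from 'next/navigation'

export default function RefreshButton() {
  const router = useRouter()

  // router.refresh() clears the client router cache for the current route
  // and re-requests it from the server, keeping client-side state intact
  return <button onClick={() => router.refresh()}>Refresh</button>
}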

Why is there caching in next.js?

The fetch in next.js is a wrapper around the native node.js fetch. Inside the wrapper, a connection to the so-called Data Cache is configured so that each request can be processed as shown in the diagrams above. This substitution of a native API is what the community criticizes the next.js team for most often.

Later, next.js added the ability to disable request caching with the option cache: "no-store". But even with this option the request is still memoized. As a result, one of the key APIs for development is no longer fully controlled by the developer.
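A minimal sketch (the endpoint is illustrative): the option below opts the request out of the Data Cache, yet, as described above, an identical call within the same render pass is still served from the memoization layer.

// app/profile/page.js
export default async function Profile() {
  // cache: "no-store" bypasses the Data Cache on the server...
  const res = await fetch('https://api.example.com/me', { cache: 'no-store' })
  const user = await res.json()

  // ...but an identical fetch during the same render is still memoized,
  // so this call does not reach the network a second time
  const again = await fetch('https://api.example.com/me', { cache: 'no-store' })

  return <p>{user.name}</p>
}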

However, there were reasons for this step, and it is unlikely that the original reason was optimization. For optimization it would have been enough to create a new request function – a separate API, of which next.js already has hundreds.

I took a similar path when developing the next-translation package (which I wrote about in a previous article). An interesting problem appeared: too many requests (ones not made via fetch) were being sent to the server. After digging into the causes and reading the next.js source code, it became clear that the application is now built by several independent workers. It is strange that this was never mentioned in recent releases. Each worker runs as a separate process, so it was impossible to set up proper caching for the whole application inside the package.

The next.js team faced the same problem – every integration, every package, every user suddenly began sending several times more requests, and previously configured caching schemes stopped working correctly. Their solution was to rework fetch and hide this behavior under the hood.

How does the Data Cache work?

Downloaded or generated data is saved via a so-called cacheHandler. Out of the box, next.js has two cacheHandler options – FileSystem and Fetch. The cacheHandler is used both for request caching and for page caching.

FileSystem is used by default; it saves data to the file system and additionally memoizes it in memory. FileSystem does its job well but has one drawback – it runs as part of the application. It follows that if the application runs in several replicas, each of them gets its own independent cacheHandler.

This problem is especially noticeable when the application runs in ISR mode. You have to reach every replica and revalidate the cache in each of them, while also making sure they end up loading the same data. And if two replicas work with the same folder, write conflicts may arise in the file system.

This is probably why Fetch can be found in the framework code: it saves the cache to a remote server. However, this cacheHandler is only used when the application is published to Vercel, since it stores data on Vercel's servers.

As a result, the out-of-the-box options do not cover every need – FileSystem is unsuitable when there are several replicas, and Fetch is unsuitable when the application is not deployed to Vercel. An important feature is that next.js lets you write your own cacheHandler. To do this, pass the application configuration the path to a file with a class (CacheHandler) that implements the get, set and revalidateTag methods:

// cache-handler.js
module.exports = class CacheHandler {
  constructor(options) {
    this.options = options
  }

  async get(key) {
    // ...
  }

  async set(key, data, ctx) {
    // ...
  }

  async revalidateTag(tag) {
    // ...
  }
}

And connect it to the application configuration:

module.exports = {
  cacheHandler: require.resolve('./cache-handler.js'),
  cacheMaxMemorySize: 0, // disable default in-memory caching
}
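To make the contract more concrete, here is a minimal in-memory sketch of such a class (close in spirit to the example in the next.js documentation; the tag bookkeeping is deliberately simplified):

// cache-handler.js – a minimal in-memory implementation
const cache = new Map()

module.exports = class CacheHandler {
  constructor(options) {
    this.options = options
  }

  async get(key) {
    // Return the entry stored by set(), or undefined on a cache miss
    return cache.get(key)
  }

  async set(key, data, ctx) {
    // Store the rendered page or fetched data along with its tags
    cache.set(key, {
      value: data,
      lastModified: Date.now(),
      tags: ctx.tags,
    })
  }

  async revalidateTag(tag) {
    // Evict every entry that was saved with the given tag
    for (const [key, entry] of cache) {
      if (entry.tags?.includes(tag)) {
        cache.delete(key)
      }
    }
  }
}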

One such cache handler is cache-handler-redis, which the next.js team referenced in the latest release.

Key points

Next.js caches most of its processes.

Caching happens in several layers – transitions and pages are cached in the client router, requests are memoized, and requests and pages are cached on the server.

Applications quite often run in multiple replicas. Replicas need a shared cache; this is especially acute when the application runs in ISR mode.

The application itself is built by several independent workers (processes) that have no access to each other.

A cacheHandler is responsible for caching. Next.js has two options out of the box – a file-system one and a remote-server one, but the latter is only available inside Vercel.

You can write your own cacheHandler.

Improved caching

Let's return to the next-translation package. To solve the problem of redundant requests, I arrived at an interesting solution – run an additional server and route the requests through it. As a result, all requests come from one place, which means shared caching can be configured there. The principle is similar to the Fetch cacheHandler and to Vercel's approach in general – during the build, data is cached on a Vercel server, and since that server is nearby, it works quickly.

However, caching is too big a responsibility for a translation library. The next task was to rework the caching logic so that it combines the next.js API with libraries and solves the common problems. The result was another library – next-impl-cache-adapter.

Caching management

As already mentioned, for a cache shared between instances (replicas, copies), the cache must live separately from each application instance. next-impl-cache-adapter solves this by introducing a separate service.

This service is a server that runs the required cacheHandler. Each application instance works with the cache through this server. The server does not need to be restarted on every build – outdated data is deleted automatically when a new version of the application starts.

Server code:

// @ts-check
const createServer = require('next-impl-cache-adapter/src/create-server');
const CacheHandler = require('next-impl-cache-in-memory');

// The shared cache server wraps the given cacheHandler
const server = createServer(new CacheHandler({}));
server.listen(4000, () => {
  console.log('Server is running at http://localhost:4000');
});

In this example, next-impl-cache-in-memory is passed to the server – a basic cacheHandler that stores data in memory.

In the application itself, a special adapter for working with the cache is configured:

// cache-handler.js
// @ts-check
const AppAdapter = require('next-impl-cache-adapter');
const CacheHandler = require('next-impl-cache-in-memory');

class CustomCacheHandler extends AppAdapter {
  /** @param {any} options */
  constructor(options) {
    super({
      CacheHandler,                                // cacheHandler wrapped by the adapter
      buildId: process.env.BUILD_ID || 'base_id',  // id of the current build, used to drop outdated entries
      cacheUrl: 'http://localhost:4000',           // address of the shared cache server
      cacheMode: 'remote',                         // 'local' | 'remote' | 'isomorphic'
      options,
    })
  }
}

module.exports = CustomCacheHandler;

The created adapter is connected in the next.js configuration:

// next.config.js

module.exports = {
  cacheHandler: require.resolve('./cache-handler.js'),
  cacheMaxMemorySize: 0, // disable default in-memory caching
}

The package supports three caching modes: local, remote and isomorphic.

local

The standard option. The cache lives next to the application. Convenient in development mode and in environments where the application runs as a single instance.

remote

The entire cache is written to and read from the dedicated remote server. Convenient for applications running in multiple replicas.

isomorphic

The cache lives next to the application but additionally stores data on the remote server. Convenient during the build: it prepares the cache for the moment the application instances start, without wasting resources on loading the cache from the remote server at runtime.

The cacheHandler can be any cacheHandler supported by next.js. Conversely, the cacheHandlers from the package can be plugged directly into next.js.

Conclusions

App Router brought many very useful updates but lost ground in convenience, predictability and versatility – primarily because of caching. After all, caching is a task for which there is no universal solution and cannot be one. The ability to disable caching per request and to write your own cacheHandler solves most problems. However, memoization and caching in the client router remain out of the developer's control.

The next.js team itself is in no hurry to ship solutions for specific tasks. That is largely why, since the stable release of App Router, I have kept working on packages that solve particular next.js problems – and writing articles about them along the way.

Let's make the web not only faster, but also clearer.

Links

next-impl-cache — solutions for setting up caching in next.js.

next-impl-getters — implementation of server getters and contexts in React Server Components without switching to SSR.

next-impl-config – adding configuration support for each possible next.js environment (build, server, client and edge).

next-classnames-minifier — compression of classes to symbols (.a, .b, …, .a1).

next-translation — i18n library, developed taking into account server components and maximum optimization.
