How to stream files from Next.js Route Handlers

January 31, 2024

Well, I've tried hard to write a catchy introduction, explaining why files are so important in a developer's life and that you should definitely read this article.

But the thing is that unless you just got out of a time machine, I am pretty sure you already know what a file is, that files are important for computers, and that files can be big.

So let's skip the babbling and dive directly into the question that bothers you: how to stream files from Next.js Route Handlers.

You will learn:

  • How to serve big files with Next.js without bloating your RAM, using API Routes and Route Handlers
  • What are streams in Node.js and the browser
  • The philosophical differences between API Routes and Route Handlers

This article is a long read, it will taste better with a cup of tea or coffee!

Why use an API endpoint to serve files, rather than the "public" folder?

In Node.js applications, we traditionally use a "public" folder to serve files, like images, fonts, vendor scripts or the latest version of your resume in PDF.

In Express.js, the public folder is set up in a single line of code (documented here):

app.use(express.static('public'))

In Next.js, this is done in 0 lines of code, because it works out of the box.
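
For instance, a resume dropped at "public/resume.pdf" (a made-up path) is immediately reachable at the root of your site:

// Anywhere in a React component:
// the file stored at "public/resume.pdf" is served at "/resume.pdf"
export const ResumeLink = () => (
  <a href="/resume.pdf">Download my resume</a>
)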

However, this works only for public, generic files. Next.js calls them "static assets". Yet some files are neither public, generic, nor static.

Some counter-examples that don't fit the public folder:

  • Personalized files: a generic URL "/user/profile-picture" may serve a different image depending on the currently logged-in user
  • Private files: you may want to check some permissions before serving "extremely_confidential_report.docx" (a sketch follows this list)
  • Generated files: the State of JavaScript administration app can generate CSV exports of the responses to the survey. This means we generate the file on the fly, depending on the selected survey, when we receive an export request.
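
To illustrate the private files scenario, here is a minimal sketch of a permission check in an API endpoint. The getSession helper and its import path are assumptions standing in for whatever authentication layer you use:

// Hypothetical permission check before serving a private file
// "getSession" is an assumed auth helper, NOT a Next.js built-in
import { getSession } from "@/lib/auth"

export const GET = async (req: Request) => {
  const session = await getSession(req)
  if (!session?.user?.canReadReports) {
    return new Response("Forbidden", { status: 403 })
  }
  // ...read and return "extremely_confidential_report.docx" here
  return new Response("(file content would go here)")
}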

For these files, you need to craft an API endpoint that returns the content of the file. Let's do that.

How to serve files in Next.js with Route Handlers, the easy way

The easy way to serve files is to load them in memory and then return the result. Here is the full code that you may copy-paste to create a new route handler.

// File "app/api/serve-file/route.ts" 
// (This file is a Next.js 13+ Route Handler)

// fs/promises gives a nice async/await syntax
import fsPromises from "fs/promises" 

// "Request" type doesn't have to be imported, 
// it's part of the web platform API and available in Node.js too
export const GET = (req: Request) => {
  // I suppose the file exists to simplify the code
  const filePath = "/tmp/some-file.zip"
  const stats = await fsPromises.stat(filePath);
  // read your file
  const fileContent = await fsPromises.readFile(filePath)
  // and serve it by returning a response
  return new Response(
      fileContent, 
      {
        status: 200,
        headers: new Headers({
          // this optional header triggers a download in the browser
          "content-disposition": `attachment; filename=${
            path.basename(filePath)
            }`,
          "content-type": "application/zip",
          "content-length": stats.size + "",
      })
    })
}

Sometimes, instead of Request and Response, you'll see NextResponse and NextRequest. The Next versions are just wrappers around the native Response and Request objects. They provide a few helpers, like the parsed URL in the nextUrl field.
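
As a quick aside, here is a minimal sketch of the nextUrl helper in action (the "id" query parameter is just an example):

import { NextRequest } from "next/server"

export const GET = (req: NextRequest) => {
  // with NextRequest, the parsed URL is available directly
  const id = req.nextUrl.searchParams.get("id")
  // with a plain Request, you would parse the URL yourself:
  // const id = new URL(req.url).searchParams.get("id")
  return Response.json({ id })
}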

Loading the whole file in memory like this is an intuitive solution, totally fine if you operate at a small scale.

But the problem is that you have to load the whole file in memory before returning it.

This is not good if the file is very big. And even if each file is small, a high number of concurrent requests will bloat your RAM too.

// this "data" variable comes at a cost for big files!
// it copies the whole file in the server memory
const data = await fsPromises.readFile(filePath)

The solution to this limitation is to stream the file: it will be read in chunks that are immediately sent to the end user, which reduces the pressure on your server's RAM.

How to stream big files, the old way with API routes (Next.js 9+)

Let's start with the traditional solution, using Next.js API Routes from the "Page Router" (aka the "pages" folder). Here is how you would stream files in an API Route, without copying them in memory.

This is similar to what you could find in a typical Express application.

// File "pages/api/serve-file.ts" 
// <- It's a good ol' Next.js 9+ API route
import { NextApiRequest, NextApiResponse } from 'next'

export default async function serveFile(
  req: NextApiRequest, res: NextApiResponse){
  const filePath = "/tmp/some-file.zip";
  const stats = await fsPromises.stat(filePath);
    res.writeHead(200, {
      "Content-Disposition": 
      `attachment; filename=${path.basename(
        filePath
      )}`,
      "Content-Type": "application/zip",
      "Content-Length": stats.size,
    });
    await new Promise(function (resolve) {
      const nodeStream = fs.createReadStream(filePath);
      nodeStream.pipe(res);
      nodeStream.on("end", resolve);
    });
}

NextApiRequest/NextApiResponse are for API Routes (Next 9+), while NextRequest and NextResponse are for Route Handlers (Next 13+). They are totally different things, and not directly compatible! We'll explain this difference in the next section.
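
To make the distinction concrete, here is a minimal sketch of each signature side by side (in practice they would live in two different files; the paths below are just examples):

// API Route ("pages/api/hello.ts"): Node.js-flavored objects
import { NextApiRequest, NextApiResponse } from "next"
export default function handler(req: NextApiRequest, res: NextApiResponse) {
  res.status(200).json({ hello: "world" })
}

// Route Handler ("app/api/hello/route.ts"): web platform objects
export function GET(req: Request): Response {
  return Response.json({ hello: "world" })
}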

Moving to route handlers: it's not that easy

Now imagine that we want to move to route handlers, the new way of creating API endpoints brought by Next.js App Router.

Sadly, we can't copy-paste the code we wrote for the API Route into a Route Handler.

The problem is that the stream expected by Route Handlers, or more precisely by the Response constructor, is a "web platform stream". It is totally different from the ReadStream structure we use in API Routes, which is defined by the Node.js fs module!

You cannot just pass the result of fs.createReadStream to a Response object. This means we need an additional conversion step.

// create a "ReadStream"  
const nodeStream = fs.createReadStream(filePath);
  // and pass it to a Response, which accepts a "ReadStream" object
  return new Response(
      // ❌ nope, that won't work! 
      // This is not the right type of "stream"!
      nodeStream, 

This is unsettling: a stream is a stream, so why does this fail?

Route Handlers and API Routes are built on different paradigms

The new Route Handlers differ quite a lot from traditional API Routes.

  • Next.js API Routes, which have existed since version 9 and will likely be kept around forever, are based on Node.js. They are cousins of Express; that's why you can easily use Express middlewares and libraries, like Passport for authentication, in Next.js.

  • Route Handlers are newcomers brought by version 13. The main difference with API Routes is that they are not strictly tied to Node.js. They still use Node as the default runtime, but you can optionally use a lighter alternative called the "Edge Runtime" (see the one-line config after this list). More broadly, this approach favors JavaScript data structures that should exist in any runtime, be it Node.js, the Edge runtime, Deno, Bun, or the runtime you are maybe building in your garage 'cause we need more.
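
For reference, here is what opting into the Edge runtime looks like in a Route Handler file, using the runtime segment config (Node.js is the default):

// File "app/api/some-endpoint/route.ts" (the path is just an example)
// Route segment config: run this handler on the Edge runtime
export const runtime = "edge" // defaults to "nodejs" when omitted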

That's why Route Handlers use the generic Response and Request objects, which were originally designed for the browser fetch API.

Also, handlers use a different approach for "responding" to the user's HTTP request. In an API Route, you call res.send, or you pipe a stream to the res object; you don't need to return anything. In a Route Handler, you must return a Response object, and Next.js takes care of turning that into an HTTP response.

Here is how the code looks in each paradigm:

// API route style for basic JSON data
res.send({foo: "bar"})

// API route style for streaming files:
// piping will send the HTTP response directly to the client
const nodeStream = fs.createReadStream(filePath);
nodeStream.pipe(res);

// Route Handler style for streaming: you return a Response object
// and Next takes care of actually sending an HTTP response
// streamFile is the function we want to build in this article!
const webPlatformStream = streamFile(filePath)
return new Response(webPlatformStream) // the return is important

I'd like to mention that you don't have to use Route Handlers at all if you don't want to!

API Routes are fine and will be fine for any foreseeable future. Features brought by the Next 13+ App Router are great, but they are also sharp and hard to use. It's totally safe to wait a year or even more before getting started with them.

Now we understand why we have 2 different approaches: Route Handlers aim to be more generic than API Routes, which are specific to Node.js.

Still, if we do want a route handler, this means we need to learn how to convert Node-specific structures into generic web platform structures, and it requires a bit of work. Let's see how to get things done for streaming files.

From Node.js streams to web platform streams via generators

So, our problem is that we need to convert a Node.js fs.ReadStream into a web platform ReadableStream. We also need to account for the difference in syntax between both worlds.

When I first dug into this issue, there was no documented solution, so I had to craft one. More precisely, I sewed together various pieces of code found on Stack Overflow and Google (remember Lurch from the Addams Family? He's my fav). You can read the gory details on Stack Overflow.

Generators are a generic data structure in JavaScript, so it should be possible to use them as an intermediate language between Node and the web platform.

I will use the terms generator and iterator interchangeably. Formally, you use a generator function to produce an iterator. More details in Mozilla Developer documentation.
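
If generators are new to you, here is a minimal, self-contained sketch of a generator function and the iterator it produces:

// A generator function...
function* count() {
  yield 1
  yield 2
}
// ...produces an iterator when called
const iterator = count()
console.log(iterator.next()) // { value: 1, done: false }
console.log(iterator.next()) // { value: 2, done: false }
console.log(iterator.next()) // { value: undefined, done: true }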

So here is the pseudocode we want to implement:

  1. Convert fs.ReadStream (given by Node.js) to a JavaScript iterator
  2. Convert the JavaScript iterator to a ReadableStream (wanted by Next.js Route Handlers)

If that sounds tricky, this is because it is tricky. We are trying to establish a link between 2 completely different worlds: the Node.js world and the web platform world. I mean, it literally involves generators!

Step 1: from fs.ReadStream to iterator

Here is a generator function that returns an iterator over an fs.ReadStream:

// Syntax taken from 
// https://github.com/MattMorgis/async-stream-generator
// itself taken from 
// https://nextjs.org/docs/app/building-your-application/routing/router-handlers#streaming
// probably itself taken from 
// https://nodejs.org/api/stream.html
// Always quote your sources!
import fs from "fs"

async function* nodeStreamToIterator(stream: fs.ReadStream) {
    for await (const chunk of stream) {
        yield new Uint8Array(chunk);
    }
}

The for await (const chunk of stream) part reads the file one chunk at a time. The await keyword is there because reading a chunk of a file is an asynchronous operation, just like reading the whole file. This syntax is possible because Node.js streams are asynchronous iterators.

You may not be used to iterators, but you can basically think of them as arrays of arbitrary size (even infinite) that you read one value at a time. That's why you "yield" one chunk of data, instead of returning the whole file.

The new Uint8Array call is needed so your data gets the proper encoding. I am serving binary files here; for text files you may want to use TextEncoder instead. See this GitHub ticket for more info.
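
For illustration, here is a hypothetical text-oriented variant. It assumes the stream was created with an encoding option, so it yields strings instead of Buffers; the helper name is made up:

// Hypothetical variant for text files, assuming the stream comes from
// fs.createReadStream(path, { encoding: "utf-8" }) and yields strings
async function* textStreamToIterator(stream: fs.ReadStream) {
    const encoder = new TextEncoder()
    for await (const chunk of stream) {
        // encode each string chunk into a Uint8Array
        yield encoder.encode(chunk as string)
    }
}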

In Node.js, streams are meant to be consumed primarily via events. According to the Node.js documentation, setting a "data" event handler on a stream will switch it into flowing mode. You can also use the pipe method, as demonstrated earlier. This explains why the iterator version is less common in the wild, yet it's better suited for our goal.
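
For comparison, here is the more common, event-based way of consuming the same stream:

// The usual event-based consumption of a Node.js stream
const stream = fs.createReadStream("/tmp/some-file.zip")
// attaching a "data" listener switches the stream to flowing mode
stream.on("data", (chunk) => {
  console.log(`received a chunk of ${chunk.length} bytes`)
})
stream.on("end", () => {
  console.log("done reading the file")
})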

Step 2: from iterator to ReadableStream

Now that we have an iterator, Mozilla documentation shows us how to convert it into a web platform ReadableStream:

// Provided by Mozilla Developer documentation
// https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream#convert_async_iterator_to_stream
// Please say thanks to Mozilla !!
function iteratorToStream(iterator) {
  return new ReadableStream({
    async pull(controller) {
      const { value, done } = await iterator.next();
      if (done) {
        controller.close();
      } else {
        controller.enqueue(value);
      }
    },
  });
}

Plugging the functions together

So, we can convert a Node.js stream into an iterator, and an iterator into a web platform ReadableStream.

We are ready to craft our streamFile function, usable in Route Handlers!

// This finally turns a file path into a ReadableStream!
export function streamFile(path: string): ReadableStream {
    const nodeStream = fs.createReadStream(path);
    const data: ReadableStream = iteratorToStream(
        nodeStreamToIterator(
            nodeStream
        )
    )
    return data
}

You may notice that createReadStream is a Node.js function, so this code works only in Node.js, despite returning a generic ReadableStream. It's not compatible with the Edge runtime.

That's completely normal. Route Handlers are meant to be compatible with different runtimes; however, you still have to write runtime-specific code anytime you want to do a non-trivial operation. If you use Deno, you will have Deno-specific code.
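
As a side note, recent versions of Node.js (17+) ship a built-in converter, stream.Readable.toWeb, that can replace our hand-crafted helper. It was still flagged as experimental at the time of writing, so treat this alternative as a sketch:

// Alternative version relying on Node's built-in converter
import fs from "fs"
import { Readable } from "stream"

export function streamFileBuiltin(path: string): ReadableStream {
    const nodeStream = fs.createReadStream(path)
    // the cast bridges Node's own web-stream types and the global ones
    return Readable.toWeb(nodeStream) as ReadableStream
}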

Ideally, JavaScript would provide a generic function for opening files that would return a ReadableStream out of the box. However, I am not aware of such an API. Ping me on Twitter if you have more info than I do!

Serving the file, using a Route Handler

We've been so far into advanced JavaScript that we almost forgot the initial goal: we just wanted to stream a file in a route handler, to let the end user download something!

Obtaining a ReadableStream was the hardest part; now we just have to pass it to the Response object:

 // File "app/api/serve-file/route.ts"
 // This is the final code, 
 // reusing the "streamFile" helper
 // that we have crafted above
 const stats = await fsPromises.stat(filePath);
 const stream: ReadableStream = streamFile(filePath)
 return new Response(stream, {
   status: 200,
   headers: new Headers({
    "content-disposition": 
    `attachment; filename=${path.basename(
       filePath
     )}`,
    "content-type": "application/zip",
    "content-length": stats.size + "",
 })

You can see a real-life example usage in the State of JavaScript codebase, to download a CSV+JSON zip out of a Mongo database.

Recap and conclusion

Let's try to recap what we have learnt in this article:

  • Serving files via an API endpoint is necessary for custom, private or generated files
  • Using a stream is necessary to avoid bloating your server's RAM: it avoids loading the whole file in memory and instead returns it as chunks
  • Next.js API Routes use Node.js syntax and data structures, so streaming a file is easy
  • Next.js Route Handlers strive to rely on more generic JavaScript built-in structures, namely the Request and Response objects from the fetch API
  • We had to craft a helper to convert the stream produced by fs.createReadStream(filePath), which is specific to Node.js, into a generic ReadableStream
  • Finally, we can pass the converted stream to the Response object. Tada!

Thanks to @rohithandique for the feedback on Next.js GitHub, and to the many people whose code I had to copy-paste and merge to produce this beauty.

Bonus

As a final experiment, I've also tried to convert Request to NextApiRequest, in order to be able to reuse Express/Connect logic in route handlers, which would notably enable using Passport. If you are interested, you can contribute to this GitHub issue on the next-connect project.

If you want to learn more about streams in Next.js, I recommend this article from Mohammad (Minimalist Web Dev). It explains how streams are used by React Server Components to deliver HTML progressively.

Loved this article?

You will probably also like my Next.js course,

"Blazing Fast Next.js with React Server Components"

RSCs, Server Actions, Partial Prerendering, revalidation... everything's in there to fully grasp Next.js server-centric optimizations.