Piping streams in production

The pipe method is one of the most well-known features of streams. It allows us to compose advanced streaming pipelines as a single line of code.

As part of Node core, it can be useful in cases where process uptime isn't important (such as CLI tools).

Unfortunately, however, it lacks a very important feature: error handling.

If one of the streams in a pipeline composed with pipe fails, the pipeline is simply unpiped. It is up to us to detect the error and then destroy the remaining streams so that they do not leak any resources. Missing this step can easily lead to memory leaks.

Let's consider the following example:

const http = require('http')
const fs = require('fs')

const server = http.createServer((req, res) => {
  fs.createReadStream('big.file').pipe(res)
})

server.listen(8080)

A simple, straightforward HTTP server that serves a big file to its users.

Since this server is using pipe to send back the file, there is a good chance that it will leak memory and file descriptors while running.

If the HTTP response closes before the file has been fully streamed to the user (for instance, when the user closes their browser), we leak a file descriptor and the memory used by the file stream. The file stream stays in memory because it is never closed.

We have to handle error and close events, and destroy other streams in the pipeline. This adds a lot of boilerplate, and can be difficult to cover in all cases.

In this recipe, we're going to explore the pump module, which is built specifically to solve this problem.
