How to do it...

Let's create a folder called hello-server, initialize a package.json, and install Express and Jade:

$ mkdir hello-server
$ cd hello-server
$ npm init -y
$ npm install --save express jade
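
After these commands, the generated package.json should list express and jade under dependencies, along the lines of the following (exact version numbers will vary, and other fields generated by npm init are omitted here):

{
  "name": "hello-server",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.16.0",
    "jade": "^1.11.0"
  }
}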

Now we'll create our server.js file:

const express = require('express')
const path = require('path')
const app = express()

app.set('views', path.join(__dirname, 'views'))
app.set('view engine', 'jade')

app.get('/hello', (req, res) => {
  res.render('hello', { title: 'Express' })
})

app.listen(3000)

Next, we'll create the views folder:

$ mkdir views  

Now we create a file in views/hello.jade, with the following content:

doctype html
html
  head
    title= title
    link(rel='stylesheet', href='/stylesheets/style.css')
  body
    h1= title
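
Before we profile, we can optionally sanity-check the setup. In one terminal we start the server with node, and in another we request the route with curl (the response shown here is abbreviated, the key point being that the rendered title appears):

$ node server.js

$ curl http://localhost:3000/hello
<!DOCTYPE html><html><head><title>Express</title>...<h1>Express</h1></body></html>

Once we're satisfied it works, we stop the server with Ctrl + C.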

OK, now we're ready to profile the server and generate a flamegraph.

Instead of starting our server with the node binary, we use the globally installed 0x executable.
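
If the 0x executable isn't already installed globally, we can install it with npm:

$ npm install -g 0x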

We start our server with the following command:

$ 0x server.js 

Now we can use the autocannon benchmarking tool to generate some server activity.

Autocannon
We explored autocannon in the previous recipe, Benchmarking HTTP.
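
Likewise, if autocannon isn't already installed globally, it can be installed with npm:

$ npm install -g autocannon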

In another terminal window we use autocannon to generate load:

$ autocannon -c 100 http://localhost:3000/hello
Running 10s test @ http://localhost:3000/hello
100 connections with 1 pipelining factor

Stat          Avg        Stdev     Max
Latency (ms)  259.62     122.24    1267
Req/Sec       380.37     104.36    448
Bytes/Sec     131.4 kB   35.84 kB  155.65 kB

40k requests in 10s, 1.45 MB read

When the benchmark finishes, we hit Ctrl + C in the server terminal. This will cause 0x to begin converting captured stacks into a flamegraph.

When the flamegraph has been generated a long URL will be printed to the terminal:

$ 0x server.js
file://path/to/profile-86501/flamegraph.html

The 0x tool has created a folder named profile-XXXX, where XXXX is the PID of the server process.
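
We can open this file directly in a browser from the command line; for example, on a Linux system with Chrome installed the following would work (the exact command is platform-specific, for instance open on macOS):

$ google-chrome profile-86501/flamegraph.html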

With the flamegraph.html file open in Google Chrome, we'll be presented with some controls, and a flamegraph resembling the following:

A flamegraph representing our /hello route under load
0x Theme
By default, 0x presents flamegraphs with a black background; the flamegraph displayed here has a white background (for practical purposes). We can hit the Theme button (bottom left) to switch between the black and white 0x themes.
The Optimization Workflow
We now know how to conduct step 2 of the Optimization Workflow outlined in the introduction to this chapter: we launch the application with 0x to generate a flamegraph.

Functions that may be bottlenecks are displayed in darker shades of orange and red.

Hot spots at the bottom of the chart are usually less relevant to application and module developers, since they tend to relate to the inner workings of Node core. So if we ignore those, we can see that most of the hot areas appear within the two macro flames in the middle of the chart. A quick study of these shows that many of the same functions appear within each, which means both stacks represent very similar logical paths.

Graphical Reproducibility
We may find that our particular flamegraph doesn't exactly match the one included here. This is because of the non-deterministic nature of the profiling process. Furthermore, the text on each frame will almost certainly differ, since it's based on the location of files on your system. Overall, however, the general meaning of the flamegraph should be the same.

The right-hand flame has a cluster of hot frames some way up its main stack, at the horizontal center of the other diverging stacks.

A hot function

Let's click near the illustrated frame (or the equivalent frame in our own flamegraph, if it differs slightly). When we click a frame, 0x lets us delve deeper by unfolding its parent and child stacks to fill the screen, like so:

Unfolded stacks

We should be able to see a pattern of darker orange stack frames; in each case the function is the same one, appearing on line 48 of a walk.js file in one of our sub-dependencies.

We have located our bottleneck!

The Optimization Workflow
We've now covered step 3 of the Optimization Workflow: find the bottleneck. We've used the flamegraph structure and color coding to quickly understand where the slowest part of our server is.
What's the cause?
Figured out what the root cause is? Check the There's more section of this recipe to find out!