Node API with Fastify in 2023

I always strive for perfection, even if it’s an unreachable utopia. I justify this irrational behavior by stopping at a certain point, saying “fair enough”, and calling it a day.

I don’t settle for long with libraries and tools, and I like to change them to achieve something that is functional, performant and, most importantly, developer friendly.

Today I feel confident in sharing my findings, and showing you the way I would build a backend service in the environment I’m most comfortable in: Node.

You can take a look at the result here.


The features I will showcase are the following:

  • CRUD backed by Postgres
  • Get/Set interface backed by Redis
  • Job queue backed by Redis
  • Paseto tokens

I decided not to cram them together into a single use case, but to have them all in one API, in their untangled form.

After all, these are just examples!


When building a backend service there are some things that I consider a must:

  • Structured logging
  • Some abstraction layer over node:http
  • Request validation
  • OpenAPI schema generation
  • Database migrations
  • Unit and integration testing
  • Dockerization

So, the project I built has all of these.


JavaScript has a very, very large ecosystem. There are lots of good libraries! The most used ones are usually the way to go, but some of them have become ‘old’, ‘slow’ and perhaps ‘ugly’ by modern standards.

Let’s name a few and provide alternatives:

Pnpm > NPM

Tired of waiting for dependencies to be installed?

Maybe you should take a look at pnpm.

It’s cool, it’s fast and beats npm’s…

the incredibles meme

Fastify > Express

Express, the king of APIs, has been diagnosed with senile dementia. I mean, just look at the website! It is a very mature project, and while v5 is in progress, I see no exciting features in it.

While some years ago I considered it great, now it feels very clunky.

Perhaps because I already took the red pill…

matrix meme

…which is Fastify. Imagine a library that does everything Express does, but faster and with a better DX. They really did a great job!

I’m talking about routing, validation, streams, an alternative approach to middlewares (hooks) and many more thanks to its plugin system.

Seriously, do yourself a favor and use Fastify instead! You will thank me later.


While TypeScript is a much-needed improvement, especially for developers who like to sleep on Friday night instead of debugging a crash in prod, TSC’s compile times are quite slow… am I right, Angular devs?

angular meme

SWC comes to our rescue. It’s a very fast TypeScript compiler (and bundler) written in Rust. While it doesn’t type check, you can use it together with tsc --noEmit to get a significant edge in build times.

It can also resolve absolute imports, something that TSC still won’t do (by design). This is a feature I really wanted to have for a long time!

I know, I know, you can solve absolute imports with libraries like tsc-alias, but to me they feel quite unstable. You don’t want to have bugs like this one in production, trust me!

  throw err;

Error: Cannot find module 'src/api'
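For reference, absolute imports in SWC are configured through jsc.baseUrl and jsc.paths in .swcrc, mirroring the paths mapping you would have in tsconfig.json. A sketch, not the project’s actual config:

```json
{
  "jsc": {
    "parser": { "syntax": "typescript" },
    "target": "es2022",
    "baseUrl": ".",
    "paths": {
      "src/*": ["src/*"]
    }
  },
  "module": { "type": "commonjs" }
}
```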

Kysely > Knex

Let’s just say that if you want (or need) an ORM, then you should stick with something like Prisma and call it a day; it’s just great.

But if you don’t want (or need) an ORM, then you would probably use Knex or the native driver directly.

The type support in both node-postgres and Knex is error prone, as you need to explicitly specify the resulting type of your queries.

import { Pool } from "pg";

type Todo = {
    id: string;
    title: string;
    completed: boolean;
};

async function getAll(db: Pool) {
    const result = await db.query("SELECT * FROM todos;");
    // The cast is unchecked: nothing verifies the rows actually match Todo
    return result.rows as Todo[];
}

But what if you could infer these types from your whole database definition?

Well, you actually can in Knex, with some limitations.

Kysely redefines the limit: everything you can think of is typed, including tables, fields, and even SQL operations!

import { Kysely } from "kysely";

type Todo = {
    id: string;
    title: string;
    completed: boolean;
};

type DB = {
    todos: Todo;
};

// The return type is inferred as Todo[]
// Also, selectFrom() accepts only "todos"
async function getAll(db: Kysely<DB>) {
    return await db.selectFrom("todos").selectAll().execute();
}

And it also manages migrations written in typescript!

import { Kysely, sql } from "kysely";

export async function up(db: Kysely<any>): Promise<void> {
  await db.schema
    .createTable("todos")
    .addColumn("id", "uuid", (col) =>
      col.primaryKey().defaultTo(sql`gen_random_uuid()`)
    )
    .addColumn("title", "varchar(256)", (col) => col.notNull())
    .addColumn("completed", "boolean", (col) =>
      col.notNull().defaultTo(false)
    )
    .execute();
}

export async function down(db: Kysely<any>): Promise<void> {
  await db.schema.dropTable("todos").execute();
}

Typebox > Joi

When building APIs in Express, I used Joi to validate requests. It was OK and worked well.

Unfortunately, this library misses type inference, so I switched to Typebox.

Not only does Typebox support inference, it also provides a JSON schema for the types you define, which makes it work great with Fastify.

If you don’t need JSON schema based validation, you should also take a look at Zod.

Pino > Winston

Node is not a fast runtime, but if I can maximise performance without impacting DX I’ll definitely do it!

While Winston still definitely works, Pino gives you a significant performance boost.

It’s the default logger for Fastify, and that should tell you just about everything.

Vitest > Jest

Jest has been my go to testing framework since I stopped using Mocha, Chai and friends in search of something more complete.

Vitest is faster than Jest. But that’s not the reason I decided to switch.

I also work on SPAs built with Vite, and Vitest plugs in beautifully. I don’t need to write tons of configuration just to make my tests run. I consider the no-globals-first approach a much-needed improvement over Jest. I don’t like libraries exposing globals in Node, they remind me of polyfills…

As a downside, Vitest’s VSCode integration doesn’t seem to work very well, but the test UI combined with watch mode is enough to keep me productive.

The project

Enough talk, should we take a look at the code?

🌳 fastify-node-api
 ├─ 📁 .github
 ├─ 📁 src
 │  ├── 📁 api
 │  ├── 📁 domain
 │  ├── 📁 services
 │  ├── 📁 useCases
 │  ├── 📄 config.ts
 │  └── 📄 index.ts
 ├─ 📁 test
 │  ├── 📁 integration
 │  └── 📁 unit
 ├─ 📄 .eslintrc
 ├─ 📄 .swcrc
 ├─ 📄 Dockerfile
 ├─ 📄 docker-compose.yaml
 ├─ 📄 package.json
 ├─ 📄 tsconfig.json
 └─ 📄 vitest.config.ts

I think the structure is pretty self-explanatory, but let’s go over it briefly.

The source code is split into folders, according to clean code principles:

  • api (all the routes)
  • domain (types and contracts)
  • services (concrete functionality)
  • use cases (the business logic)

Alongside them, you have the application entrypoint and its configuration.

Tests are split into unit and integration ones.

Then there are the various configuration files for the tools we are using.

And finally, Docker stuff.

These are the npm scripts:

"scripts": {
    "preinstall": "npx only-allow pnpm",
    "start": "NODE_ENV=production node dist/index.js",
    "start:dev": "nodemon -w src -e . --exec 'tsc && SWCRC=true node --inspect -r @swc-node/register src/index.ts | pino-pretty -c -t'",
    "build": "swc src -d dist",
    "test": "vitest run",
    "test:ui": "vitest --ui",
    "test:unit": "vitest run --dir test/unit",
    "test:integration": "vitest run --dir test/integration",
    "lint": "tsc && eslint src test"
}

Everything should be straightforward, except for start:dev, where I’m:

  • enabling hot reloading with nodemon
  • type checking with tsc
  • enabling debugging
  • running ts code with swc-node
  • printing logs in a human readable format with pino-pretty

Looks like I’m still talking after all, why don’t you take a look at it yourself?