Evolving the App to a Multi-Tenant Architecture
Multi-tenant Applications#
Multi-tenant applications are apps in which multiple customers or customer groups (tenants) share the same application but are kept separated from one another. Multi-tenancy is very common in SaaS products because it allows an application to be built once and offered to many customers.
Up until now, we have been working with a single-tenant application: all of our data is available to every user. Supporting multiple customers (in our case, schools) adds complexity, but it is needed in a lot of real-world situations.
In this lesson, we'll evolve the application to a multi-tenant architecture. Here's the plan:
1. Each school will get its own database. This will keep its data isolated from other schools.
2. When we create JWTs via Auth0, we'll add metadata to the JWT identifying the user's school.
3. For a given API call coming into the Express backend, we'll parse the JWT to determine the user's school and connect to the proper database (sketched below).
4. We'll create a special "catalog" database containing a table of schools, and we'll use it to help manage the other databases with a new Knex migration script.
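To make step 3 concrete, here is a minimal sketch of what tenant resolution could look like as Express middleware. Everything in it is an assumption for illustration: the req.user property (populated by whichever JWT verification middleware you use), the namespaced claim name, and the simple per-database connection cache. The lesson's actual implementation may differ.

// Sketch only: assumes JWT verification middleware has already decoded the token
// onto req.user, and that a namespaced custom claim holds the school's database name.
const knex = require('knex')

const connections = {} // cache one Knex instance per tenant database

const tenantDb = (req, res, next) => {
  // The claim name below is hypothetical; Auth0 custom claims are namespaced URLs.
  const databaseName = req.user['https://example.com/school_database']
  if (!databaseName) {
    return res.status(401).send('No school associated with this user')
  }
  if (!connections[databaseName]) {
    connections[databaseName] = knex({
      client: 'postgresql',
      connection: {
        database: databaseName,
        host: process.env.PG_HOST,
        user: process.env.PG_USER,
        password: process.env.PG_PASSWORD,
      },
    })
  }
  req.db = connections[databaseName] // route handlers query through req.db
  next()
}

module.exports = tenantDb

Route handlers would then query through req.db rather than a single shared connection.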
The Catalog Database#
Let's get going by creating the new catalog database. From pgAdmin, right-click on Databases and choose Create > Database. Use the name "catalog" and hit Save.
Note that Postgres database names should be lowercase.
Next, we are going to separate our migrations. We need one set of migrations for the "app database" (i.e., our old database) and a new set of migrations for the catalog database. Under the backend project's database directory, create two new directories:
$ mkdir app_migrations
$ mkdir catalog_migrations
Continuing, let's do some re-organization and cleanup. Move the existing migrations from /migrations into /database/app_migrations and delete the old migrations directory and .knexfile.
Now, we need to create a schools table in the new catalog database. Stub out the migration from the command line. This time, we specify a migrations-directory so that we can keep catalog-related migrations in the /database/catalog_migrations/ directory.
From the root of the backend project:
$ knex migrate:make create-schools-table --migrations-directory database/catalog_migrations
The project should look something like this:

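Roughly, the database directory should now contain something like the tree below (migration filenames carry a Knex-generated timestamp prefix, so yours will differ):

database/
  app_migrations/
    (your existing app migrations, moved here)
  catalog_migrations/
    <timestamp>_create-schools-table.js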
Update the new create-schools-table with the following migration code to create a schools table with school_id, school_name, and school_database columns. The school_database column will later be used to determine which database to use for which school (tenant).
exports.up = function (knex) {
  return knex.schema.createTable('schools', function (table) {
    table.increments('school_id')
    table.string('school_name').notNullable()
    table.string('school_database').notNullable()
  })
}

exports.down = function (knex) {
  return knex.schema.dropTable('schools')
}
A Custom Migration Script#
So far, we have only had one database to deal with, and we have gotten by with running the knex migrate:up command from the command line. Moving forward, we'll have one database for every tenant, so we need something more automated. Knex migrations also have a JavaScript API that we can use, so let's write a standalone Node script that handles all the migrations.
Create a file called migrate.js in the root of the backend project. Here's the script:
const knex = require('knex')
require('dotenv').config()

// Build a Knex config for the given database, pulling connection details from env vars
const getConfig = (databaseName) => {
  return {
    client: 'postgresql',
    connection: {
      database: databaseName,
      host: process.env.PG_HOST,
      user: process.env.PG_USER,
      password: process.env.PG_PASSWORD,
    },
  }
}

const migrateDatabases = async () => {
  // Connect to the default "postgres" database and create the catalog database if it's missing
  const defaultKnex = knex(getConfig('postgres'))
  const catalogDb = await defaultKnex
    .select()
    .from('pg_database')
    .where({ datname: 'catalog' })
    .first()
  if (!catalogDb) {
    await defaultKnex.raw(`create database catalog`)
  }

  // Run the catalog migrations (creates/updates the schools table)
  const catalogKnex = knex(getConfig('catalog'))
  await catalogKnex.migrate.latest({
    directory: './database/catalog_migrations',
  })

  // For each school in the catalog, create its database if needed and run the app migrations
  const schools = await catalogKnex.select().from('schools')
  for (let i = 0; i < schools.length; i++) {
    const school = schools[i]
    const db = await catalogKnex
      .select()
      .from('pg_database')
      .where({ datname: school.school_database })
      .first()
    if (!db) {
      await catalogKnex.raw(`create database ${school.school_database}`)
    }
    const appKnex = knex(getConfig(school.school_database))
    await appKnex.migrate.latest({
      directory: './database/app_migrations',
    })
  }
  process.exit(0)
}

;(async () => {
  await migrateDatabases()
})()
In the previous lesson, we used the dotenv package to apply env vars from our .env file. We're doing the same here, but we need to add additional vars to the .env file.
Add the following to .env:
PG_HOST=localhost
PG_PASSWORD=<your pg password>
PG_USER=postgres
Go ahead and give the script a try by running node migrate from the command line:
$ node migrate
FS-related option specified for migration configuration. This resets migrationSource to default FsMigrations
The "FS-related option" message is Knex letting us know that passing a directory option resets the migration source to its default filesystem-based implementation; it's informational and can be ignored here.
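Because the script drives migrations through Knex's programmatic API, rollbacks can be scripted the same way. Here's a minimal, standalone sketch for undoing the latest catalog migration batch; it isn't part of the lesson's code, just an illustration using the same env vars:

// Illustrative only: roll back the most recent catalog migration batch
const knex = require('knex')
require('dotenv').config()

const catalogKnex = knex({
  client: 'postgresql',
  connection: {
    database: 'catalog',
    host: process.env.PG_HOST,
    user: process.env.PG_USER,
    password: process.env.PG_PASSWORD,
  },
})

catalogKnex.migrate
  .rollback({ directory: './database/catalog_migrations' })
  .then(() => catalogKnex.destroy())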
To make things interesting, let's insert a couple of schools into the catalog's schools table. In pgAdmin, right-click on the catalog database and choose Query Tool. From here, you can run the following insert statement:
insert into schools (school_name, school_database)
values
  ('Pinewood Elementary', 'pinewood_elementary'),
  ('Sunnyside High School', 'sunnyside_high_school');
Then run the migrate script again: $ node migrate. This time, new databases for each school should be created and updated with the necessary tables from the app_migrations directory. You can refresh the databases in pgAdmin to see the results:

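If you'd rather verify from code than from pgAdmin, a throwaway script along these lines (purely illustrative, not part of the lesson) lists every non-template database on the server:

// Illustrative check: print all non-template databases on the Postgres server
const knex = require('knex')
require('dotenv').config()

const db = knex({
  client: 'postgresql',
  connection: {
    database: 'postgres',
    host: process.env.PG_HOST,
    user: process.env.PG_USER,
    password: process.env.PG_PASSWORD,
  },
})

db.select('datname')
  .from('pg_database')
  .where({ datistemplate: false })
  .then((rows) => {
    console.log(rows.map((r) => r.datname))
    return db.destroy()
  })

You should see pinewood_elementary and sunnyside_high_school in the output alongside catalog.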