Building a Next.js app with next-auth and Prisma on top of Postgres. Part 1. create-next-app, prisma, postgres docker

Updated on: Wed Jul 14 2021

This post is just a set of notes I made so I can return to them when I start my next project. Most of the information in this post I got from the amazing YouTube tutorial published by prisma.io on their channel. Special thanks to Xiaoru "Leo" Li for that video.

Bootstrap project

The first steps should be self-explanatory. We will create a new Next.js project and install next-auth, its type definitions, and the Prisma CLI. We will need the Prisma CLI to bootstrap a Prisma setup inside our brand new myapp.


npx create-next-app myapp
cd myapp
yarn add next-auth
yarn add -D @prisma/cli @types/next-auth

I personally like to keep all my app source code in a src folder. Next.js handles this perfectly without any hassle, so let's just move all the necessary things there right away:


mkdir src && mv pages src && mv styles src

Adding typescript support

To power up our project with TypeScript support, let's do a couple more things right away:


touch tsconfig.json
yarn add -D typescript

Then rename all our .js files to .ts (or to .tsx if they contain React JSX components) and run:


yarn dev

The Next.js CLI will auto-detect the tsconfig.json we created in the previous step and fill it with the needed configuration. More information is in the Next.js docs.
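The rename step can be sketched as follows. The file names assume the default create-next-app layout (they are an assumption, adjust to your project), and the sketch is demonstrated on a throwaway copy so it is safe to try:

```shell
# Sketch of the rename, shown on a throwaway copy of the default
# create-next-app layout (file names are assumptions, adjust to yours)
mkdir -p demo/src/pages/api
touch demo/src/pages/_app.js demo/src/pages/index.js demo/src/pages/api/hello.js

# pages containing JSX become .tsx, plain modules become .ts
mv demo/src/pages/_app.js demo/src/pages/_app.tsx
mv demo/src/pages/index.js demo/src/pages/index.tsx
mv demo/src/pages/api/hello.js demo/src/pages/api/hello.ts

ls demo/src/pages demo/src/pages/api
```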

To avoid relative path imports like import fun from '../../../../some/package' and be able to write absolute imports like import fun from '@/some/package', we have to extend the compilerOptions in our tsconfig.json a bit:


{
  "compilerOptions": {
    ...
    "baseUrl": ".",
    "paths": {
      "@/*": ["src/*"]
    }
    ...
  },
  ...
}
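To make the mapping concrete, here is a toy resolver showing what the paths entry does at resolution time. The real resolution is done by the TypeScript compiler internally; the function below is purely illustrative:

```typescript
// Toy illustration of the "@/*" -> "src/*" mapping from tsconfig.json.
// tsc performs this rewrite itself; this sketch only shows how a matching
// specifier is rewritten relative to baseUrl (".") before module lookup.
function resolveAlias(specifier: string): string {
  const prefix = "@/";
  return specifier.startsWith(prefix)
    ? "src/" + specifier.slice(prefix.length) // "@/*" maps to "src/*"
    : specifier; // non-aliased specifiers are left untouched
}

console.log(resolveAlias("@/components/Header")); // "src/components/Header"
console.log(resolveAlias("../../some/package"));  // unchanged
```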

Running dev database in docker

Next we will need a Postgres server running somewhere. During development I usually use Docker for this. Setting up Docker itself is out of this post's scope; find information about it separately. A Postgres instance can be started with the following commands.


docker pull postgres:12.3-alpine
docker run -d --rm --name myapp-postgres -e POSTGRES_PASSWORD=123321 -p 8084:5432 postgres:12.3-alpine
# this instruction runs postgres in a docker container,
# names it "myapp-postgres" so we can manage it,
# maps host port 8084 to the container's 5432,
# and detaches, so the process runs in the background

docker kill myapp-postgres
# stops the container

For convenience I like to add these scripts to package.json:


"start-db": "docker run -d --rm --name myapp-postgres -e POSTGRES_PASSWORD=123321 -p 8084:5432 postgres:12.3-alpine",
"stop-db": "docker kill myapp-postgres",
"logs-db": "docker logs -f myapp-postgres"

But we can go a step further and extract our database management into a docker-compose file. This gives us an easier and more flexible way to work with it in the future. The next listing goes into docker-compose.local.yml:


services:
  db:
    image: postgres:12.3-alpine
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_DB=myapp_db
      - POSTGRES_PASSWORD=123321
    volumes:
      - myapp_db-volume:/var/lib/postgresql/data
    ports:
      - 8084:5432

volumes:
  myapp_db-volume:

Here we basically put all the information about ports and local DB user credentials in one place, and also did one important thing: we declared the volume myapp_db-volume and mounted it at /var/lib/postgresql/data, so our database is kept on the filesystem even when the docker container is stopped.

With this in mind we can tweak the scripts section of package.json to use our brand new configuration from docker-compose.local.yml:


"start-db": "docker-compose -f docker-compose.local.yml up -d",
"stop-db": "docker-compose -f docker-compose.local.yml down",
"logs-db": "docker logs -f myapp_db_1"

docker-compose is a CLI tool that runs docker containers according to the configuration provided in docker-compose.yml files. The -f option specifies which configuration file to use when running a container. By default it is docker-compose.yml, but we gave our file a different name to be explicit about which environment this container should run in. The -d option runs the container in the background. In the logs-db command we changed the container name, because the name is now produced by the docker-compose CLI and is basically project name (the directory name by default) + service name + the container's index.

Bootstrap prisma and add schema for next-auth

After that setup our Postgres connection string will be as follows:


postgres://postgres:123321@localhost:8084/myapp_db?schema=public
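To make the parts of this URL explicit, here is a small sketch using Node's built-in WHATWG URL parser; the credentials, port, and database name come from the docker-compose file above:

```typescript
// Dissecting the connection string with Node's built-in URL class
const dbUrl = new URL(
  "postgres://postgres:123321@localhost:8084/myapp_db?schema=public"
);

console.log(dbUrl.username);                   // "postgres"  - POSTGRES_USER
console.log(dbUrl.password);                   // "123321"    - POSTGRES_PASSWORD
console.log(dbUrl.hostname);                   // "localhost"
console.log(dbUrl.port);                       // "8084"      - host port mapped to 5432
console.log(dbUrl.pathname.slice(1));          // "myapp_db"  - POSTGRES_DB
console.log(dbUrl.searchParams.get("schema")); // "public"    - Prisma schema parameter
```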

That URL should go into .env for Prisma to be aware of it, and into .env.local for Next.js.
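As a sketch, the resulting .env could look like this. DATABASE_URL is the variable name the default Prisma datasource expects, and the value assumes the docker-compose setup above:

```text
# .env (read by Prisma)
DATABASE_URL="postgres://postgres:123321@localhost:8084/myapp_db?schema=public"
```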

Next let's init Prisma inside our app. This command will bootstrap a prisma folder with a basic schema.prisma file for us:


npx prisma init

The database schema that next-auth needs to operate can be found in the next-auth docs; for example, here it is for Postgres. The following listing is an implementation of this DB schema as a Prisma schema configuration.


model User {
  id            Int       @default(autoincrement()) @id
  name          String?
  email         String?   @unique
  emailVerified DateTime? @map(name: "email_verified")
  image         String?
  createdAt     DateTime  @default(now()) @map(name: "created_at")
  updatedAt     DateTime  @default(now()) @map(name: "updated_at")

  @@map(name: "users")
}

model Account {
  id                 Int       @default(autoincrement()) @id
  compoundId         String    @unique @map(name: "compound_id")
  userId             Int       @map(name: "user_id")
  providerType       String    @map(name: "provider_type")
  providerId         String    @map(name: "provider_id")
  providerAccountId  String    @map(name: "provider_account_id")
  refreshToken       String?   @map(name: "refresh_token")
  accessToken        String?   @map(name: "access_token")
  accessTokenExpires DateTime? @map(name: "access_token_expires")
  createdAt          DateTime  @default(now()) @map(name: "created_at")
  updatedAt          DateTime  @default(now()) @map(name: "updated_at")

  @@index([providerAccountId], name: "providerAccountId")
  @@index([providerId], name: "providerId")
  @@index([userId], name: "userId")

  @@map(name: "accounts")
}

model Session {
  id           Int      @default(autoincrement()) @id
  userId       Int      @map(name: "user_id")
  expires      DateTime
  sessionToken String   @unique @map(name: "session_token")
  accessToken  String   @unique @map(name: "access_token")
  createdAt    DateTime @default(now()) @map(name: "created_at")
  updatedAt    DateTime @default(now()) @map(name: "updated_at")

  @@map(name: "sessions")
}

model VerificationRequest {
  id         Int      @default(autoincrement()) @id
  identifier String
  token      String   @unique
  expires    DateTime
  createdAt  DateTime @default(now()) @map(name: "created_at")
  updatedAt  DateTime @default(now()) @map(name: "updated_at")

  @@map(name: "verification_requests")
}
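These models sit below the datasource and generator blocks that npx prisma init scaffolds. For reference, the default looks roughly like this (it reads the connection string from the DATABASE_URL environment variable):

```text
datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

generator client {
  provider = "prisma-client-js"
}
```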

This listing goes into the prisma/schema.prisma bootstrapped by npx prisma init. With our Postgres database running, we can create our first schema migration using Prisma:


npx prisma migrate dev --preview-feature

This command will create a migration.sql file with all the SQL instructions needed to fulfil the requirements of our schema. Check it out, it's quite interesting. The command will also create the database for you and run the migration on top of it. Note that if you stop your Postgres docker container without a persistent volume (the plain docker run --rm variant above), you will need to create the database manually next time:


docker exec -it myapp-postgres bash
psql -U postgres

# then in psql console
CREATE DATABASE myapp_db;

Sidenote: you can always run your database locally without Docker, or use a database-as-a-service platform from AWS, DigitalOcean, or somebody else.

Either way, you now have your database running and migrated. You can also try one fancy thing from Prisma called Prisma Studio:


npx prisma studio

It starts a pretty-looking UI for managing the data in your database.

At this point the initial setup is finished, so let's go and actually implement some functionality!