FastAPI with Async SQLAlchemy, SQLModel, and Alembic

Last updated July 11th, 2023

This tutorial looks at how to work with SQLAlchemy asynchronously with SQLModel and FastAPI. We'll also configure Alembic for handling database migrations.

This tutorial assumes you have experience working with FastAPI and Postgres using Docker. Need help getting up to speed with FastAPI, Postgres, and Docker? Start with the following resources:

  1. Developing and Testing an Asynchronous API with FastAPI and Pytest
  2. Test-Driven Development with FastAPI and Docker

Contents

  1. Project Setup
  2. SQLModel
  3. Async SQLModel
  4. Alembic
  5. Conclusion

Project Setup

Start by cloning down the base project from the fastapi-sqlmodel-alembic repo:

$ git clone -b base https://github.com/testdrivenio/fastapi-sqlmodel-alembic
$ cd fastapi-sqlmodel-alembic

From the project root, create the images and spin up the Docker containers:

$ docker-compose up -d --build

Once the build is complete, navigate to http://localhost:8004/ping. You should see:

{
  "ping": "pong!"
}

Take a quick look at the project structure before moving on.

SQLModel

Next, let's add SQLModel, a library for interacting with SQL databases from Python using Python objects. Built on Python type annotations, it's essentially a wrapper on top of pydantic and SQLAlchemy that makes it easy to work with both.

We'll also need Psycopg, a Postgres database adapter for Python.

Add the two dependencies to project/requirements.txt:

fastapi==0.100.0
psycopg2-binary==2.9.6
sqlmodel==0.0.8
uvicorn==0.22.0

Create two new files in "project/app", db.py and models.py.

project/app/models.py:

from sqlmodel import SQLModel, Field


class SongBase(SQLModel):
    name: str
    artist: str


class Song(SongBase, table=True):
    id: int = Field(default=None, nullable=False, primary_key=True)


class SongCreate(SongBase):
    pass

Here, we defined three models:

  1. SongBase is the base model that the others inherit from. It has two properties, name and artist, both of which are strings. This is a data-only model since it lacks table=True, which means that it's only used as a pydantic model.
  2. Song, meanwhile, adds an id property to the base model. It's a table model, so it's a pydantic and SQLAlchemy model. It represents a database table.
  3. SongCreate is a data-only, pydantic model that will be used to create new song instances.
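
To make the distinction concrete, here's a minimal sketch of how these models behave (the song data is just an example):

from app.models import Song, SongCreate

# SongCreate validates input like any pydantic model
payload = SongCreate(name="Helicon 1", artist="Mogwai")

# A table model can be built from validated data with from_orm;
# id stays None until the database assigns it
song = Song.from_orm(payload)

print(song.id)             # None
print(Song.__tablename__)  # "song" -- SQLModel defaults to the lowercased class name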

project/app/db.py:

import os

from sqlmodel import create_engine, SQLModel, Session


DATABASE_URL = os.environ.get("DATABASE_URL")

engine = create_engine(DATABASE_URL, echo=True)


def init_db():
    SQLModel.metadata.create_all(engine)


def get_session():
    with Session(engine) as session:
        yield session

Here, we:

  1. Initialized a new SQLAlchemy engine using create_engine from SQLModel. The major difference between SQLModel's create_engine and SQLAlchemy's version is that the SQLModel version adds type annotations (for editor support) and enables the SQLAlchemy "2.0" style of engines and connections. We also passed in echo=True so we can see the generated SQL queries in the terminal, which is nice to have enabled in development for debugging purposes.
  2. Created a get_session dependency that yields a new SQLAlchemy Session per request (roughly sketched below).
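
Since get_session is a generator, FastAPI handles setup and teardown around each request. Roughly, a sketch of what Depends(get_session) does under the hood:

from app.db import get_session

# Roughly what FastAPI does with Depends(get_session) on each request:
gen = get_session()   # create the generator
session = next(gen)   # enter the `with` block; a live Session is yielded
try:
    ...               # the route handler runs here, using `session`
finally:
    gen.close()       # exit the `with` block, closing the session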

Next, inside project/app/main.py, let's create the tables at startup, using the startup event:

from fastapi import FastAPI

from app.db import init_db
from app.models import Song

app = FastAPI()


@app.on_event("startup")
def on_startup():
    init_db()


@app.get("/ping")
async def pong():
    return {"ping": "pong!"}

It's worth noting that from app.models import Song is required: without it, the Song model is never registered on SQLModel.metadata, so the song table will not be created.
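
A quick way to convince yourself (a hypothetical snippet, run from within the web container):

from sqlmodel import SQLModel

import app.models  # noqa: F401 -- the import's side effect registers the table

print(SQLModel.metadata.tables.keys())  # dict_keys(['song'])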

To test, bring down the old containers and volumes, rebuild the images, and spin up the new containers:

$ docker-compose down -v
$ docker-compose up -d --build

Open the container logs via docker-compose logs web. You should see:

fastapi-sqlmodel-alembic-web-1  | CREATE TABLE song (
fastapi-sqlmodel-alembic-web-1  |   name VARCHAR NOT NULL,
fastapi-sqlmodel-alembic-web-1  |   artist VARCHAR NOT NULL,
fastapi-sqlmodel-alembic-web-1  |   id SERIAL NOT NULL,
fastapi-sqlmodel-alembic-web-1  |   PRIMARY KEY (id)
fastapi-sqlmodel-alembic-web-1  | )

Open psql:

$ docker-compose exec db psql --username=postgres --dbname=foo

psql (15.3 (Debian 15.3-1.pgdg120+1))
Type "help" for help.

foo=# \dt

        List of relations
 Schema | Name | Type  |  Owner
--------+------+-------+----------
 public | song | table | postgres
(1 row)

foo=# \q

With the table up, let's add a few new routes to project/app/main.py:

from fastapi import Depends, FastAPI
from sqlmodel import select
from sqlmodel import Session

from app.db import get_session, init_db
from app.models import Song, SongCreate

app = FastAPI()


@app.on_event("startup")
def on_startup():
    init_db()


@app.get("/ping")
async def pong():
    return {"ping": "pong!"}


@app.get("/songs", response_model=list[Song])
def get_songs(session: Session = Depends(get_session)):
    result = session.execute(select(Song))
    songs = result.scalars().all()
    return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]


@app.post("/songs")
def add_song(song: SongCreate, session: Session = Depends(get_session)):
    song = Song(name=song.name, artist=song.artist)
    session.add(song)
    session.commit()
    session.refresh(song)
    return song
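
As an aside, SQLModel's Session also exposes an exec() method that returns model instances directly, so the GET handler could skip scalars() and the manual reconstruction. A minimal alternative sketch:

@app.get("/songs", response_model=list[Song])
def get_songs(session: Session = Depends(get_session)):
    # session.exec() is SQLModel's typed wrapper around execute();
    # it yields Song instances directly, so no .scalars() call is needed
    songs = session.exec(select(Song)).all()
    return songs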

Add a song:

$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs

{
  "id": 1,
  "name": "Midnight Fit",
  "artist": "Mogwai"
}

In your browser, navigate to http://localhost:8004/songs. You should see:

[
  {
    "id": 1,
    "name": "Midnight Fit",
    "artist": "Mogwai"
  }
]

Async SQLModel

Moving on, let's add async support to SQLModel.

First, bring down the containers and volumes:

$ docker-compose down -v

Update the database URI in docker-compose.yml, adding in +asyncpg, which tells SQLAlchemy to use the asyncpg driver:

environment:
  - DATABASE_URL=postgresql+asyncpg://postgres:postgres@db:5432/foo

Next, replace Psycopg with asyncpg:

asyncpg==0.28.0
fastapi==0.100.0
sqlmodel==0.0.8
uvicorn==0.22.0

Update project/app/db.py to use the async flavors of SQLAlchemy's engine and session:

import os

from sqlmodel import SQLModel, create_engine
from sqlmodel.ext.asyncio.session import AsyncSession

from sqlalchemy.ext.asyncio import AsyncEngine
from sqlalchemy.orm import sessionmaker


DATABASE_URL = os.environ.get("DATABASE_URL")

engine = AsyncEngine(create_engine(DATABASE_URL, echo=True, future=True))

async def init_db():
    async with engine.begin() as conn:
        # await conn.run_sync(SQLModel.metadata.drop_all)
        await conn.run_sync(SQLModel.metadata.create_all)


async def get_session() -> AsyncSession:
    async_session = sessionmaker(
        engine, class_=AsyncSession, expire_on_commit=False
    )
    async with async_session() as session:
        yield session

Notes:

  1. We used SQLAlchemy's AsyncEngine directly since SQLModel does not have a wrapper for it as of writing. AsyncSession, meanwhile, comes from sqlmodel.ext.asyncio.session -- SQLModel's thin wrapper around SQLAlchemy's AsyncSession.
  2. We disabled expire on commit behavior by passing in expire_on_commit=False (see the sketch after this list).
  3. metadata.create_all doesn't execute asynchronously, so we used run_sync to execute it synchronously within the async function.
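
To see why expire_on_commit=False matters here, consider this sketch (the demo function and song data are hypothetical):

from sqlmodel.ext.asyncio.session import AsyncSession

from app.models import Song


async def demo(session: AsyncSession) -> None:
    song = Song(name="Auto Rock", artist="Mogwai")
    session.add(song)
    await session.commit()
    # With the default expire_on_commit=True, `song` would be expired at
    # this point, and the attribute access below would trigger an implicit,
    # synchronous refresh -- which raises MissingGreenlet under asyncio.
    # With expire_on_commit=False, the already-loaded values stay available.
    print(song.name)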

Turn on_startup into an async function in project/app/main.py:

@app.on_event("startup")
async def on_startup():
    await init_db()

That's it. Rebuild the images and spin up the containers:

$ docker-compose up -d --build

Make sure the tables were created.

Finally, update the route handlers in project/app/main.py to use async execution:

from fastapi import Depends, FastAPI
from sqlmodel import select
from sqlmodel.ext.asyncio.session import AsyncSession

from app.db import get_session, init_db
from app.models import Song, SongCreate

app = FastAPI()


@app.on_event("startup")
async def on_startup():
    await init_db()


@app.get("/ping")
async def pong():
    return {"ping": "pong!"}


@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
    result = await session.execute(select(Song))
    songs = result.scalars().all()
    return [Song(name=song.name, artist=song.artist, id=song.id) for song in songs]


@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
    song = Song(name=song.name, artist=song.artist)
    session.add(song)
    await session.commit()
    await session.refresh(song)
    return song

Add a new song and make sure http://localhost:8004/songs works as expected.
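
For example, using the same payload as before:

$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs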

Alembic

Finally, let's add Alembic into the mix to properly handle database schema changes.

Add it to the requirements file:

alembic==1.11.1
asyncpg==0.28.0
fastapi==0.100.0
sqlmodel==0.0.8
uvicorn==0.22.0

Remove the startup event from project/app/main.py since we no longer want the tables created at startup:

@app.on_event("startup")
async def on_startup():
    await init_db()

Again, bring down the existing containers and volumes:

$ docker-compose down -v

Spin the containers back up:

$ docker-compose up -d --build

Take a quick look at Using Asyncio with Alembic while the new images are building.

Once the containers are back up, initialize Alembic with the async template:

$ docker-compose exec web alembic init -t async migrations

This command creates an alembic.ini file in the project root along with a "project/migrations" folder containing env.py, a "versions" folder, and script.py.mako, the Mako template used to generate new migration files. Within the generated "project/migrations" folder, import SQLModel into script.py.mako:

"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
import sqlmodel             # NEW
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade() -> None:
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    ${downgrades if downgrades else "pass"}

Now, when a new migration file is generated, it will include import sqlmodel. This is needed because autogenerated migrations reference SQLModel-specific column types, like sqlmodel.sql.sqltypes.AutoString, which would otherwise raise a NameError when the migration runs.

Next, we need to update the top of project/migrations/env.py like so:

import asyncio
from logging.config import fileConfig

from sqlalchemy import pool
from sqlalchemy.engine import Connection
from sqlalchemy.ext.asyncio import async_engine_from_config
from sqlmodel import SQLModel                       # NEW

from alembic import context

from app.models import Song                         # NEW

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = SQLModel.metadata             # UPDATED

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.

...

Here, we imported SQLModel and our song model. We then set target_metadata to our model's MetaData, SQLModel.metadata. As in main.py, importing Song is what actually registers the song table on SQLModel.metadata, which is what allows autogenerate to diff the models against the database. For more on the target_metadata argument, check out Auto Generating Migrations from the official Alembic docs.

Update sqlalchemy.url in project/alembic.ini:

sqlalchemy.url = postgresql+asyncpg://postgres:postgres@db:5432/foo

To generate the first migration file, run:

$ docker-compose exec web alembic revision --autogenerate -m "init"

If all went well, you should see a new migration file in "project/migrations/versions" that looks something like this:

"""init

Revision ID: 842abcd80d3e
Revises:
Create Date: 2023-07-10 17:10:45.380832

"""
from alembic import op
import sqlalchemy as sa
import sqlmodel


# revision identifiers, used by Alembic.
revision = '842abcd80d3e'
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('song',
    sa.Column('name', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
    sa.Column('artist', sqlmodel.sql.sqltypes.AutoString(), nullable=False),
    sa.Column('id', sa.Integer(), nullable=False),
    sa.PrimaryKeyConstraint('id')
    )
    # ### end Alembic commands ###


def downgrade() -> None:
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_table('song')
    # ### end Alembic commands ###

Apply the migration:

$ docker-compose exec web alembic upgrade head

Ensure you can add a song.
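
You can also verify in psql that Alembic recorded the applied revision in its alembic_version table. You should see something like:

$ docker-compose exec db psql --username=postgres --dbname=foo

foo=# \dt

             List of relations
 Schema |      Name       | Type  |  Owner
--------+-----------------+-------+----------
 public | alembic_version | table | postgres
 public | song            | table | postgres
(2 rows)

foo=# \q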

Let's quickly test a schema change. Update the SongBase model in project/app/models.py:

class SongBase(SQLModel):
    name: str
    artist: str
    year: Optional[int] = None

Don't forget the import:

from typing import Optional

Create a new migration file:

$ docker-compose exec web alembic revision --autogenerate -m "add year"

Update the upgrade and downgrade functions in the autogenerated migration file like so:

def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('song', sa.Column('year', sa.Integer(), nullable=True))
    op.create_index(op.f('ix_song_year'), 'song', ['year'], unique=False)
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_song_year'), table_name='song')
    op.drop_column('song', 'year')
    # ### end Alembic commands ###

Apply the migration:

$ docker-compose exec web alembic upgrade head

Update the route handlers:

@app.get("/songs", response_model=list[Song])
async def get_songs(session: AsyncSession = Depends(get_session)):
    result = await session.execute(select(Song))
    songs = result.scalars().all()
    return [Song(name=song.name, artist=song.artist, year=song.year, id=song.id) for song in songs]


@app.post("/songs")
async def add_song(song: SongCreate, session: AsyncSession = Depends(get_session)):
    song = Song(name=song.name, artist=song.artist, year=song.year)
    session.add(song)
    await session.commit()
    await session.refresh(song)
    return song

Test:

$ curl -d '{"name":"Midnight Fit", "artist":"Mogwai", "year":"2021"}' -H "Content-Type: application/json" -X POST http://localhost:8004/songs
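
You should get back something like the following (note that pydantic coerces the string "2021" to an integer, per the year field's type annotation):

{
  "id": 2,
  "name": "Midnight Fit",
  "artist": "Mogwai",
  "year": 2021
}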

Conclusion

In this tutorial, we covered how to configure SQLAlchemy, SQLModel, and Alembic to work with FastAPI asynchronously.

If you're looking for more challenges, check out all of our FastAPI tutorials and courses.

You can find the source code in the fastapi-sqlmodel-alembic repo. Cheers!

Featured Course

Test-Driven Development with FastAPI and Docker

In this course, you'll learn how to build, test, and deploy a text summarization service with Python, FastAPI, and Docker. The service itself will be exposed via a RESTful API and deployed to Heroku with Docker.
