Background
In a brand-new FastAPI + SQLAlchemy + Alembic + PostgreSQL project, create a user_basic_info table (columns: id, name, password, primary_email) and set up a database schema and migration-management workflow that can serve both the dev and prod environments.
Prerequisites
- SQLAlchemy 2.0.35
- Alembic 1.14.0
- PostgreSQL 17
Details
Project layout
backend/
|-- app/
| |-- core/
| | |-- config.py # configuration management (dotenv)
| | |-- database.py # database connection (SQLAlchemy + asyncpg)
| | |-- __init__.py
| |-- models/
| | |-- user_basic_info.py # data model definition
| | |-- __init__.py
| |-- __init__.py
|-- alembic/
| |-- versions/ # auto-generated migration files
| |-- env.py # Alembic environment config
|-- docs/
|-- test/
|-- .env # environment config (dev)
|-- .env.prod # environment config (prod)
|-- alembic.ini # main Alembic config file
|-- pyproject.toml
|-- Dockerfile
|-- requirements.txt
|-- requirements-dev.txt
|-- README.md
Create the database module
- Database model definition
app/models/user_basic_info.py
from sqlalchemy import String
from sqlalchemy.orm import Mapped, mapped_column
from app.core.database import Base
import uuid


class UserBasicInfo(Base):
    __tablename__ = "user_basic_info"

    id: Mapped[str] = mapped_column(
        String(36), primary_key=True, default=lambda: str(uuid.uuid4())
    )
    name: Mapped[str] = mapped_column(String(100), nullable=False)
    password: Mapped[str] = mapped_column(String(255), nullable=False)
    primary_email: Mapped[str] = mapped_column(String(255), unique=True, nullable=False)
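The article does not show the contents of app/models/__init__.py; one common approach is to have it re-export the model so that importing the package (as Alembic's env.py does later) is enough to register the table on Base.metadata. A minimal sketch, assumed rather than taken from the project:
app/models/__init__.py
# Re-export models so that importing app.models registers every table on Base.metadata.
# The exact contents of this file are an assumption for illustration.
from app.models.user_basic_info import UserBasicInfo

__all__ = ["UserBasicInfo"]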
- Database connection and session management
app/core/database.py
from sqlalchemy.ext.asyncio import create_async_engine, async_sessionmaker
from sqlalchemy.orm import DeclarativeBase
from app.core.config import settings


class Base(DeclarativeBase):
    pass


# Create the async engine
engine = create_async_engine(
    settings.DATABASE_URL,
    echo=settings.DB_ECHO,
    future=True,
)

# Session factory (SQLAlchemy 2.0 has no autocommit mode, so that flag is not passed)
AsyncSessionLocal = async_sessionmaker(
    engine, expire_on_commit=False, autoflush=False
)


# Session dependency for FastAPI routes
async def get_db():
    async with AsyncSessionLocal() as session:
        yield session
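As a quick illustration of how the get_db dependency plugs into a route, here is a minimal sketch; the router module, endpoint path, and response shape are hypothetical and not part of the project above:
# app/api/users.py (hypothetical module, for illustration only)
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession

from app.core.database import get_db
from app.models.user_basic_info import UserBasicInfo

router = APIRouter()


@router.get("/users/{user_id}")
async def read_user(user_id: str, db: AsyncSession = Depends(get_db)):
    # The injected AsyncSession comes from get_db and is closed automatically afterwards
    result = await db.execute(select(UserBasicInfo).where(UserBasicInfo.id == user_id))
    user = result.scalar_one_or_none()
    if user is None:
        raise HTTPException(status_code=404, detail="user not found")
    return {"id": user.id, "name": user.name, "primary_email": user.primary_email}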
- Environment-based configuration
app/core/config.py
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    # Database settings
    DATABASE_URL: str
    DB_ECHO: bool = False

    model_config = SettingsConfigDict(
        env_file=".env", env_file_encoding="utf-8", extra="ignore"
    )


settings = Settings()
- Environment file (same format for dev / prod)
.env
# --- Database ---
DATABASE_URL=postgresql+asyncpg://postgres:yourpassword@localhost:5432/yourdb
DB_ECHO=True
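The .env.prod file follows the same format with production values. One possible way to make the same Settings class serve both environments is to pick the env file from a switch variable; the APP_ENV name below is an assumption for illustration, not something defined elsewhere in this article:
# Hypothetical variant of app/core/config.py: choose the env file via an APP_ENV switch
import os

from pydantic_settings import BaseSettings, SettingsConfigDict

# APP_ENV is an assumed environment variable name; default to the dev file
_ENV_FILE = ".env.prod" if os.getenv("APP_ENV") == "prod" else ".env"


class Settings(BaseSettings):
    DATABASE_URL: str
    DB_ECHO: bool = False

    model_config = SettingsConfigDict(
        env_file=_ENV_FILE, env_file_encoding="utf-8", extra="ignore"
    )


settings = Settings()
With this variant, running APP_ENV=prod alembic upgrade head would apply the same migrations against the production database, since env.py reads settings.DATABASE_URL.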
Create the Alembic configuration
- Run the Alembic init command in the project root
alembic init alembic
Creating directory '/home/myserver/pers/pers-website/backend/alembic' ... done
Creating directory '/home/myserver/pers/pers-website/backend/alembic/versions' ... done
Generating /home/myserver/pers/pers-website/backend/alembic/script.py.mako ... done
Generating /home/myserver/pers/pers-website/backend/alembic/env.py ... done
Generating /home/myserver/pers/pers-website/backend/alembic/README ... done
Generating /home/myserver/pers/pers-website/backend/alembic.ini ... done
Please edit configuration/connection/logging settings in '/home/myserver/pers/pers-website/backend/alembic.ini' before proceeding.
Note: the alembic package must be installed first: pip install alembic
- Make sure the following settings are present in alembic.ini
[alembic]
script_location = alembic
sqlalchemy.url = postgresql+asyncpg://postgres:yourpassword@localhost:5432/yourdb
In practice the database URL is loaded from .env inside env.py (via the settings object), so the sqlalchemy.url key only needs to exist and may be left blank. Also keep the prepend_sys_path = . entry that alembic init generates by default; it puts the project root on sys.path so that env.py can import the app package.
- Edit alembic/env.py and replace its contents with the following (asyncpg-compatible version)
import asyncio
from logging.config import fileConfig
from sqlalchemy import pool
from sqlalchemy.ext.asyncio import async_engine_from_config
from alembic import context
from app.core.config import settings
from app.core.database import Base
from app.models import user_basic_info  # import models so their tables are registered on Base.metadata
# this is the Alembic Config object, which provides access to the values within the .ini file
config = context.config
# Interpret the config file for Python logging.
if config.config_file_name is not None:
fileConfig(config.config_file_name)
# Model metadata used by autogenerate
target_metadata = Base.metadata
def get_url():
    return settings.DATABASE_URL


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode."""
    url = get_url()
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        compare_type=True,
    )
    with context.begin_transaction():
        context.run_migrations()


def do_run_migrations(connection):
    context.configure(connection=connection, target_metadata=target_metadata, compare_type=True)
    with context.begin_transaction():
        context.run_migrations()


async def run_migrations_online():
    """Run migrations in 'online' mode."""
    connectable = async_engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
        url=get_url(),  # overrides sqlalchemy.url from alembic.ini
    )
    async with connectable.connect() as connection:
        await connection.run_sync(do_run_migrations)
    await connectable.dispose()


if context.is_offline_mode():
    run_migrations_offline()
else:
    asyncio.run(run_migrations_online())
Generate and apply the migration
- Generate the migration file from the project root
alembic revision --autogenerate -m "create user_basic_info table"
INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO [alembic.runtime.migration] Will assume transactional DDL.
INFO [alembic.autogenerate.compare] Detected added table 'user_basic_info'
Generating /home/myserver/pers/pers-website/backend/alembic/versions/ebbc6e3a391e_create_user_basic_info_table.py ... done
- Apply the migration
alembic upgrade head
INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO [alembic.runtime.migration] Will assume transactional DDL.
INFO [alembic.runtime.migration] Running upgrade -> ebbc6e3a391e, create user_basic_info table
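- (Optional) Check the current database revision
alembic current
After the upgrade this should report ebbc6e3a391e as the current (head) revision.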
- The generated migration file appears under the alembic/versions directory (a sketch of its contents follows the directory tree below)
alembic
├── env.py
├── __pycache__
│ └── env.cpython-311.pyc
├── README
├── script.py.mako
└── versions
├── ebbc6e3a391e_create_user_basic_info_table.py
└── __pycache__
└── ebbc6e3a391e_create_user_basic_info_table.cpython-311.pyc
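For reference, the autogenerated file looks roughly like the sketch below, reconstructed from the model above; the exact header comments and create date in the real file will differ:
"""create user_basic_info table

Revision ID: ebbc6e3a391e
Revises:
"""
from alembic import op
import sqlalchemy as sa

revision = "ebbc6e3a391e"
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.create_table(
        "user_basic_info",
        sa.Column("id", sa.String(length=36), nullable=False),
        sa.Column("name", sa.String(length=100), nullable=False),
        sa.Column("password", sa.String(length=255), nullable=False),
        sa.Column("primary_email", sa.String(length=255), nullable=False),
        sa.PrimaryKeyConstraint("id"),
        sa.UniqueConstraint("primary_email"),
    )


def downgrade() -> None:
    op.drop_table("user_basic_info")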
Verify the result
- Log in to the database and inspect the table structure, e.g. with \d user_basic_info in psql [PostgreSQL - viewing table structure]
Table "public.user_basic_info"
Column | Type | Collation | Nullable | Default
---------------+------------------------+-----------+----------+---------
id | character varying(36) | | not null |
name | character varying(100) | | not null |
password | character varying(255) | | not null |
primary_email | character varying(255) | | not null |
Indexes:
"user_basic_info_pkey" PRIMARY KEY, btree (id)
"user_basic_info_primary_email_key" UNIQUE CONSTRAINT, btree (primary_email)
That's all for this article. Thanks for reading, and if you run into any problems, feel free to leave a comment below.