Aiosqlitepool
🛡️A resilient, high-performance asynchronous connection pool layer for SQLite, designed for efficient and scalable database operations.
<p> <a> <img src="https://img.shields.io/badge/stability-stable-green.svg" alt="Stable version"> </a> <a href="https://pypi.org/project/aiosqlitepool" target="_blank"> <img src="https://img.shields.io/pypi/v/aiosqlitepool?color=%2334D058&label=pypi%20package" alt="Package version"> </a> <a href="https://pypi.org/project/aiosqlitepool" target="_blank"> <img src="https://img.shields.io/pypi/pyversions/aiosqlitepool.svg?color=%2334D058" alt="Supported Python versions"> </a> <a href="https://github.com/slaily/aiosqlitepool/blob/main/LICENSE" target="_blank"> <img src="https://img.shields.io/badge/License-MIT-yellow.svg" alt="License: MIT"> </a> </p>

aiosqlitepool is a high-performance connection pool for asyncio SQLite applications. By managing a pool of reusable database connections, it eliminates connection overhead and delivers significant performance gains.
Important: aiosqlitepool is not a SQLite database driver.
It's a performance-boosting layer that works with an asyncio driver like aiosqlite, not as a replacement for it.
aiosqlitepool in three points:
- Eliminates connection overhead: It avoids repeated database connection setup (syscalls, memory allocation) and teardown (syscalls, deallocation) by reusing long-lived connections.
- Faster queries via "hot" cache: Long-lived connections keep SQLite's in-memory page cache "hot." This serves frequently requested data directly from memory, speeding up repetitive queries and reducing I/O operations.
- Maximizes concurrent throughput: Allows your application to process significantly more database queries per second under heavy load.
Table of contents
- Installation
- Usage
- Configuration
- How it works
- Do you need a connection pool with SQLite?
- Benchmarks
- Compatibility
- License
Installation
aiosqlitepool requires the aiosqlite driver to be installed as a peer dependency.
Install with your preferred package manager:
pip

```shell
pip install aiosqlite aiosqlitepool
```

uv

```shell
uv add aiosqlite aiosqlitepool
```

Poetry

```shell
poetry add aiosqlite aiosqlitepool
```
Usage
Basic usage
You must provide a connection_factory - an async function that creates and returns a database connection:
```python
import asyncio

import aiosqlite

from aiosqlitepool import SQLiteConnectionPool


async def connection_factory():
    return await aiosqlite.connect("basic.db")


async def main():
    pool = SQLiteConnectionPool(connection_factory)

    async with pool.connection() as conn:
        await conn.execute("CREATE TABLE IF NOT EXISTS users (name TEXT)")
        await conn.execute("INSERT INTO users VALUES (?)", ("Alice",))
        # You must handle transaction management yourself
        await conn.commit()

        cursor = await conn.execute("SELECT name FROM users")
        row = await cursor.fetchone()
        print(f"Found user: {row[0]}")

    await pool.close()


if __name__ == "__main__":
    asyncio.run(main())
```
Note: The pool manages connections, not transactions. You're responsible for calling commit() or rollback() as needed. The pool ensures connections are safely reused but doesn't interfere with your transaction logic.
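The commit/rollback discipline applies to any SQLite connection, pooled or not. This minimal sketch uses the standard library's sqlite3 module (synchronous, for brevity) to show the pattern: commit successful work, roll back on failure so the connection goes back in a clean state:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")

# Successful write: commit explicitly before the connection is reused.
conn.execute("INSERT INTO users VALUES (?)", ("Alice",))
conn.commit()

# Failed write: roll back so no half-finished transaction lingers.
try:
    conn.execute("INSERT INTO users VALUES (?)", ("Bob",))
    raise RuntimeError("simulated failure mid-transaction")
except RuntimeError:
    conn.rollback()

rows = conn.execute("SELECT name FROM users").fetchall()
print(rows)  # [('Alice',)]
```

Only the committed row survives; the rolled-back insert of "Bob" is discarded, which is exactly the state a pooled connection should be in when handed to the next caller.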
Using as Context Manager
The pool can be used as an async context manager for automatic cleanup:
```python
async def main():
    async with SQLiteConnectionPool(connection_factory) as pool:
        async with pool.connection() as conn:
            # Do database work
            pass
    # Pool is automatically closed
```
High-performance SQLite connection configuration
For high-performance applications, configure your connection factory with optimized SQLite pragmas.
```python
import aiosqlite

from aiosqlitepool import SQLiteConnectionPool


async def sqlite_connection() -> aiosqlite.Connection:
    # Connect to your database
    conn = await aiosqlite.connect("your_database.db")
    # Apply high-performance pragmas
    await conn.execute("PRAGMA journal_mode = WAL")
    await conn.execute("PRAGMA synchronous = NORMAL")
    await conn.execute("PRAGMA cache_size = 10000")
    await conn.execute("PRAGMA temp_store = MEMORY")
    await conn.execute("PRAGMA foreign_keys = ON")
    await conn.execute("PRAGMA mmap_size = 268435456")
    return conn


async def main():
    # Initialize the connection pool with your high-performance connection factory
    pool = SQLiteConnectionPool(
        connection_factory=sqlite_connection,
    )

    # Use the pool
    async with pool.connection() as conn:
        # Your database operations here
        # cursor = await conn.execute("SELECT ...")
        # rows = await cursor.fetchall()
        pass

    # Clean up
    await pool.close()
```
- PRAGMA journal_mode = WAL - Writes go to a separate WAL file while reads continue from the main database, so multiple readers can work simultaneously with one writer.
- PRAGMA synchronous = NORMAL - SQLite syncs to disk at critical moments, but not after every write; roughly 2-3x faster writes than FULL synchronization.
- PRAGMA cache_size = 10000 - Keeps 10,000 database pages (~40MB) in memory, so frequently accessed data is served from RAM, not disk.
- PRAGMA temp_store = MEMORY - Stores temporary tables, indexes, and sorting operations in RAM, eliminating disk I/O for temporary operations.
- PRAGMA foreign_keys = ON - Enforces foreign key constraints automatically, preventing data corruption and reducing application-level checks.
- PRAGMA mmap_size = 268435456 - Maps up to 256MB of the database file directly into process memory, reducing system calls and speeding up access to large databases.
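Because pragmas are SQLite settings rather than driver features, each one can be read back to confirm it took effect. This sketch uses the standard library's sqlite3 module and a throwaway temporary file (WAL mode requires a real file; in-memory databases report 'memory' as their journal mode):

```python
import os
import sqlite3
import tempfile

# WAL needs a file-backed database, so create one in a temp directory.
path = os.path.join(tempfile.mkdtemp(), "perf.db")
conn = sqlite3.connect(path)

conn.execute("PRAGMA journal_mode = WAL")
conn.execute("PRAGMA synchronous = NORMAL")
conn.execute("PRAGMA foreign_keys = ON")

# Read each pragma back to verify the setting.
journal = conn.execute("PRAGMA journal_mode").fetchone()[0]
synchronous = conn.execute("PRAGMA synchronous").fetchone()[0]  # 1 means NORMAL
foreign_keys = conn.execute("PRAGMA foreign_keys").fetchone()[0]  # 1 means ON
print(journal, synchronous, foreign_keys)  # wal 1 1
```

The same read-back works through aiosqlite inside your connection factory, which is a cheap way to catch a pragma that silently failed to apply.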
FastAPI integration
This section demonstrates an effective pattern for integrating aiosqlitepool with FastAPI applications.
The pattern addresses three key requirements:
- Lifecycle management: The pool is created during application startup and gracefully closed on shutdown using FastAPI's lifespan context manager
- Global access: The pool is stored in the application's state, making it accessible to all route handlers
- Dependency injection: A reusable dependency function provides clean access to pooled connections with automatic resource management
```python
from typing import Any, AsyncGenerator
from contextlib import asynccontextmanager

import aiosqlite

from aiosqlitepool import SQLiteConnectionPool
from fastapi import (
    Request,
    Depends,
    FastAPI,
    HTTPException,
)


async def sqlite_connection() -> aiosqlite.Connection:
    """A factory for creating new connections."""
    conn = await aiosqlite.connect("app.db")
    # Return rows that support dict() conversion by column name
    conn.row_factory = aiosqlite.Row
    return conn


@asynccontextmanager
async def lifespan(app: FastAPI):
    """
    Manage the connection pool's lifecycle.

    The pool is created when the application starts and gracefully closed when it stops.
    """
    db_pool = SQLiteConnectionPool(connection_factory=sqlite_connection, pool_size=10)
    app.state.db_pool = db_pool
    yield
    await db_pool.close()


app = FastAPI(lifespan=lifespan)


async def get_db_connection(
    request: Request,
) -> AsyncGenerator[aiosqlite.Connection, None]:
    """
    A dependency that provides a connection from the pool.

    It accesses the pool from the application state.
    """
    db_pool = request.app.state.db_pool
    async with db_pool.connection() as conn:
        yield conn


@app.get("/users/{user_id}")
async def get_user(
    user_id: int, db_conn: aiosqlite.Connection = Depends(get_db_connection)
) -> dict[str, Any]:
    cursor = await db_conn.execute("SELECT * FROM users WHERE id = ?", (user_id,))
    user = await cursor.fetchone()
    if not user:
        raise HTTPException(status_code=404, detail="User not found")
    return dict(user)
```
Configuration
SQLiteConnectionPool accepts these parameters:
Required parameters
- connection_factory - An async function that creates and returns a new database connection. This function will be called whenever the pool needs to create a new connection.
Optional parameters
- pool_size (int) - Maximum number of connections to maintain in the pool (default: 5)
- acquisition_timeout (int) - Maximum seconds to wait for an available connection (default: 30)
- idle_timeout (int) - Maximum seconds a connection can remain idle before replacement (default: 86400, i.e. 24 hours)
Recommended configurations
Most web applications work well with these settings:
```python
pool = SQLiteConnectionPool(
    connection_factory,
    pool_size=10,
    acquisition_timeout=30,
)
```
For read-heavy workloads like analytics or reporting:
```python
pool = SQLiteConnectionPool(
    connection_factory,
    pool_size=20,
    acquisition_timeout=15,
)
```
For write-heavy workloads:
```python
pool = SQLiteConnectionPool(
    connection_factory,
    pool_size=5,
    acquisition_timeout=60,
)
```
How it works
The pool automatically:
- Creates connections on-demand up to the pool size limit
- Reuses idle connections to avoid creation overhead
- Performs health checks to detect broken connections
- Rolls back any uncommitted transactions when connections are returned
- Replaces connections that have been idle too long
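As a conceptual sketch only (this is not aiosqlitepool's actual implementation, and TinyPool is a hypothetical name), the behaviors above can be modeled with an asyncio.Queue of idle connections and the stdlib sqlite3 module standing in for a real async driver:

```python
import asyncio
import sqlite3


class TinyPool:
    """Toy illustration of pool mechanics; not aiosqlitepool's real code."""

    def __init__(self, factory, size=5):
        self._factory = factory
        self._size = size
        self._created = 0
        self._idle: asyncio.Queue = asyncio.Queue()

    async def acquire(self):
        # Create connections on demand, up to the pool size limit.
        if self._idle.empty() and self._created < self._size:
            self._created += 1
            return self._factory()
        # Otherwise reuse an idle connection, health-checking it first
        conn = await self._idle.get()
        try:
            conn.execute("SELECT 1")
        except sqlite3.Error:
            return self._factory()  # replace a broken connection
        return conn

    async def release(self, conn):
        conn.rollback()  # discard any uncommitted transaction
        await self._idle.put(conn)


async def main():
    pool = TinyPool(lambda: sqlite3.connect(":memory:"), size=2)
    first = await pool.acquire()
    await pool.release(first)
    second = await pool.acquire()
    return second is first  # the idle connection was reused


reused = asyncio.run(main())
print(reused)  # True
```

The real library adds what this sketch omits: acquisition timeouts, idle-age tracking, and safe concurrent use across many tasks.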
Do you need a connection pool with SQLite?
For server-based databases like PostgreSQL or M
