Announcing ic-sql-migrate, a Rust library that helps you run schema migrations on SQL databases

Hi! @sgaflv has built the excellent wasi2ic tool that allows canisters compiled to WASI to run on ICP. This opens the door to running any third-party tool or library that compiles to WASI, including SQLite.

For more information on how to set that up, see this post:

ic-sql-migrate

https://crates.io/crates/ic-sql-migrate

To support the efforts of @sgaflv and wasm-forge, I built ic-sql-migrate, a lightweight database migration library with support for SQLite and Turso databases.

Features

  • :rocket: Multi-Database Support: Works with SQLite (via ic-rusqlite) and Turso databases
  • :package: Compile-Time Embedding: Migration files are embedded into your canister at compile time
  • :counterclockwise_arrows_button: Automatic Migration: Tracks and applies migrations automatically on canister init and upgrade
  • :locked: Transactional: All migrations run in transactions for safety
  • :building_construction: ICP Native: Designed specifically for Internet Computer canisters
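
The tracking behavior can be sketched in plain Rust (an illustrative model only, not the crate's actual internals: the real library records applied ids in a database table and wraps each run in a transaction):

```rust
use std::collections::HashSet;

/// Minimal stand-in for an embedded migration: an id (the filename)
/// plus the SQL text. Illustrative only.
struct Migration {
    id: &'static str,
    sql: &'static str,
}

/// Apply every migration whose id has not been recorded yet, in order,
/// and return the ids that actually ran. A real implementation would
/// execute `sql` against the database inside a transaction and persist
/// the applied ids in a tracking table.
fn apply_pending(migrations: &[Migration], applied: &mut HashSet<String>) -> Vec<&'static str> {
    let mut ran = Vec::new();
    for m in migrations {
        if !applied.contains(m.id) {
            let _ = m.sql; // execute the SQL here in a real implementation
            applied.insert(m.id.to_string());
            ran.push(m.id);
        }
    }
    ran
}

fn main() {
    let migrations = [
        Migration { id: "000_initial", sql: "CREATE TABLE users (...);" },
        Migration { id: "001_add_users", sql: "ALTER TABLE users ...;" },
    ];

    // Simulate a canister upgrade where 000_initial already ran.
    let mut applied = HashSet::new();
    applied.insert("000_initial".to_string());

    let ran = apply_pending(&migrations, &mut applied);
    assert_eq!(ran, vec!["001_add_users"]); // only the pending one runs
    assert!(apply_pending(&migrations, &mut applied).is_empty()); // nothing left
}
```

This skip-already-applied behavior is what lets a canister call the same migration routine from both init and post_upgrade without re-running old files.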

Quick Start

Prerequisites

IMPORTANT: You must enable exactly one database feature (sqlite or turso) for this library to work. There is no default feature.

In addition to having the Rust toolchain and dfx set up, you need the wasm32-wasip1 compilation target (rustup target add wasm32-wasip1) and the wasi2ic tool, which replaces WebAssembly System Interface (WASI) function calls with corresponding polyfill implementations. This allows Wasm binaries compiled for wasm32-wasip1 to run on the Internet Computer.

cargo install wasi2ic

Installation

Add to both [dependencies] and [build-dependencies] in your Cargo.toml:

# For SQLite support
# Note: You MUST specify either "sqlite" or "turso" feature - there is no default
[dependencies]
ic-sql-migrate = { version = "0.0.4", features = ["sqlite"] }
ic-rusqlite = { version = "0.4.2", features = ["precompiled"], default-features = false }
ic-cdk = "0.18.7"

[build-dependencies]
ic-sql-migrate = "0.0.4"

Or for Turso:

# For Turso support
# Note: You MUST specify either "sqlite" or "turso" feature - there is no default
[dependencies]
ic-sql-migrate = { version = "0.0.4", features = ["turso"] }
turso = "0.1.4"
ic-cdk = "0.18.7"

[build-dependencies]
ic-sql-migrate = "0.0.4"

Important:

  • You MUST choose exactly one database feature (sqlite or turso)
  • The features are mutually exclusive (cannot use both)
  • There is no default feature - the library will not work without selecting one

Deployment Configuration

dfx.json Setup (Required for SQLite)

For SQLite support, you need to configure your dfx.json to compile for the wasm32-wasip1 target and use wasi2ic to process the binary:

{
  "canisters": {
    "your_canister": {
      "candid": "your_canister.did",
      "package": "your_canister",
      "type": "custom",
      "build": [
        "cargo build --target wasm32-wasip1 --release",
        "wasi2ic target/wasm32-wasip1/release/your_canister.wasm target/wasm32-wasip1/release/your_canister-wasi2ic.wasm"
      ],
      "wasm": "target/wasm32-wasip1/release/your_canister-wasi2ic.wasm"
    }
  }
}

This configuration:

  1. Compiles your canister for the wasm32-wasip1 target (required for SQLite)
  2. Uses wasi2ic to convert WASI function calls to IC-compatible polyfills
  3. Points dfx to the processed WASM file for deployment

Note: Turso canisters use the standard wasm32-unknown-unknown target and don’t require wasi2ic processing.
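
For reference, a minimal dfx.json for a Turso canister could look like the sketch below (hypothetical project names; since no wasi2ic step is needed, the standard Rust canister type suffices):

```json
{
  "canisters": {
    "your_canister": {
      "type": "rust",
      "package": "your_canister",
      "candid": "your_canister.did"
    }
  }
}
```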

Basic Usage

1. Create migration files

Create a migrations/ directory with SQL files. Each migration should be:

  • Numbered sequentially (e.g., 000_initial.sql, 001_add_users.sql)
  • Idempotent when possible (use IF NOT EXISTS clauses)
  • Forward-only (this library doesn’t support rollbacks)

-- migrations/000_initial.sql
CREATE TABLE IF NOT EXISTS users (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    name TEXT NOT NULL,
    email TEXT
);
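
Because migrations run in order and only once, a follow-up file just takes the next number. A hypothetical second migration (illustrative only) could be:

```sql
-- migrations/001_add_email_index.sql
CREATE INDEX IF NOT EXISTS idx_users_email ON users (email);
```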

2. Set up build.rs

The list() function scans your migrations directory at compile time and generates code to embed the SQL files into your canister binary. This makes the migrations available as static data in your compiled canister:

fn main() {
    ic_sql_migrate::list(Some("migrations")).unwrap();
}
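
Conceptually, list() writes a generated file that include!() pulls in, roughly like the sketch below (hypothetical shape: the crate's real Migration type and generated code may differ):

```rust
/// Hypothetical stand-in for the crate's Migration type.
struct Migration {
    id: &'static str,
    sql: &'static str,
}

// One entry per file in migrations/, ordered by filename, with each
// file's SQL embedded into the binary as a string literal.
static MIGRATIONS: &[Migration] = &[Migration {
    id: "000_initial",
    sql: "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT NOT NULL, email TEXT);",
}];

fn main() {
    // The canister never reads the filesystem at run time: everything
    // it needs is already embedded in MIGRATIONS.
    assert_eq!(MIGRATIONS.len(), 1);
    assert_eq!(MIGRATIONS[0].id, "000_initial");
}
```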

3. Use in your canister

The include!() macro incorporates the migrations that were discovered by list() in step 2. It creates a static array of Migration objects containing your SQL files’ contents, which you can then pass to the up() function to execute them:

SQLite Example:

use ic_cdk::{init, post_upgrade, pre_upgrade};
use ic_rusqlite::{close_connection, with_connection, Connection};

static MIGRATIONS: &[ic_sql_migrate::Migration] = ic_sql_migrate::include!();

fn run_migrations() {
    with_connection(|mut conn| {
        let conn: &mut Connection = &mut conn;
        ic_sql_migrate::sqlite::up(conn, MIGRATIONS).unwrap();
    });
}

#[init]
fn init() {
    run_migrations();
}

#[pre_upgrade]
fn pre_upgrade() {
    close_connection();
}

#[post_upgrade]
fn post_upgrade() {
    run_migrations();
}

Turso Example:

use ic_cdk::{init, post_upgrade, pre_upgrade};
use turso::Connection;

static MIGRATIONS: &[ic_sql_migrate::Migration] = ic_sql_migrate::include!();

async fn run_migrations() {
    // `get_connection()` stands for your own helper that opens (or returns
    // a cached) Turso connection for the canister.
    let mut conn = get_connection().await;
    ic_sql_migrate::turso::up(&mut conn, MIGRATIONS).await.unwrap();
}

#[init]
async fn init() {
    // Initialize storage
    run_migrations().await;
}

#[post_upgrade]
async fn post_upgrade() {
    // Re-initialize storage
    run_migrations().await;
}

Examples

Complete working examples are provided for both database backends:

:file_folder: SQLite Example - Advanced Database Operations

Showcases high-performance SQLite on ICP with the full Chinook database:

  • Complete Database: Imports the entire Chinook music store database (11 tables, thousands of records)
  • Complex Queries: Demonstrates advanced SQL operations including multi-table JOINs, aggregations, and analytics
  • Read Operations (test1-3):
    • Top customers analysis with purchase history
    • Genre and artist revenue analytics
    • Sales trends and employee performance metrics
  • Write Operations (test4-5):
    • Bulk invoice generation (250+ invoices with complex line items)
    • Massive playlist creation with track analytics and recommendations
  • Performance Tracking: Each operation reports instruction counts, demonstrating SQLite’s efficiency on ICP
  • Stress Testing: Operations designed to process thousands of records and complex transactions

cd examples/sqlite
dfx start --clean
dfx deploy
dfx canister call sqlite run        # Verify migrations
dfx canister call sqlite test1      # Top customers analysis
dfx canister call sqlite test2      # Genre/artist analytics
dfx canister call sqlite test3      # Sales trends
dfx canister call sqlite test4      # Bulk invoice generation
dfx canister call sqlite test5      # Playlist manipulation

The SQLite example proves that complex, production-grade databases can run efficiently on the Internet Computer, with operations processing thousands of records in a single call while tracking instruction usage.

:file_folder: Turso Example - Basic Migration Demo

Shows Turso database usage in an ICP canister:

  • Async migration execution
  • Stable memory persistence with WASI polyfill
  • Simple person table with basic operations
  • Suitable for simpler use cases (Turso’s SQL implementation is still evolving)

cd examples/turso
dfx start --clean
dfx deploy
dfx canister call turso run

Note: The SQLite example demonstrates significantly more advanced capabilities, making it the recommended choice for complex database operations on ICP.

Happy building!

:seedling:
