Snowflake CI/CD with schemachange Using GitHub Actions

1. Introduction

In the previous article, we covered how to set up schemachange locally: configuring connection files, organizing SQL scripts, and manually deploying changes to dev/uat/prod. That manual process works well for small teams or one-off tasks.

But in modern data engineering, teams often prefer fully automated pipelines that are triggered by code changes, auditable, repeatable, and integrated with version control. That's where GitHub and GitHub Actions come in.

In this article, we will show how to integrate schemachange with GitHub Actions so that every time you push or merge SQL changes into your repository, the deployment to Snowflake happens automatically, without manual intervention.

2. Why Automate via GitHub Actions?

  • Version control: All SQL scripts live in Git. Changes are tracked, reviewed via pull requests, and merged.
  • Automated, repeatable deployments: No manual CLI runs. Deployments are triggered automatically (on push, PR merge, manual dispatch).
  • Environment Consistency & Auditability: You avoid human error (scripts run in the wrong order, skipped scripts), and every deployment is logged. Scripts are executed in the same order and with the same dependencies across all environments.
  • CI/CD friendly: Aligns database deployments with application code CI/CD practices and fits in DevOps workflows.

3. Prerequisites

Before setting up GitHub Actions, make sure you have:

  • A GitHub repository to place your schemachange project’s SQL scripts, config files, etc.
  • The necessary Snowflake credentials (account, user, role, warehouse, etc.), stored securely as GitHub Secrets and Variables for each environment.
  • Basic familiarity with Git, branching, pull requests, and GitHub Actions setup.

4. Repository Structure for schemachange Automation

Below is the folder structure to set up in the GitHub repository.

(root)
├── .github
│   └── workflows
│       ├── schemachange_dev.yml
│       ├── schemachange_uat.yml
│       └── schemachange_prod.yml
├── schemachange
│   ├── db_scripts
│   │   ├── V0
│   │   │   ├── V0_1__create_table.sql
│   │   │   └── V0_2__alter_table.sql
│   │   └── V1
│   │       └── V1_1__insert_records.sql
│   └── schemachange-config.yml
└── README.md

The .github/workflows directory contains the CI/CD pipeline definitions for each environment. We will discuss these workflow files in detail in the next sections.

The schemachange folder mirrors the structure we created during the manual setup. However, unlike the manual process, there is no connections.toml file in the repository. In the automated setup, this file is generated dynamically during the GitHub Actions workflow using Secrets and environment variables.
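For reference, the layout above can be scaffolded with a few shell commands. The file names match the tree shown earlier; adjust them to your own project.

```shell
# One-time scaffold of the repository layout (empty placeholder files).
mkdir -p .github/workflows
mkdir -p schemachange/db_scripts/V0 schemachange/db_scripts/V1

# Workflow definitions, one per environment
touch .github/workflows/schemachange_dev.yml \
      .github/workflows/schemachange_uat.yml \
      .github/workflows/schemachange_prod.yml

# Versioned change scripts and the schemachange configuration
touch schemachange/db_scripts/V0/V0_1__create_table.sql \
      schemachange/db_scripts/V0/V0_2__alter_table.sql \
      schemachange/db_scripts/V1/V1_1__insert_records.sql
touch schemachange/schemachange-config.yml README.md
```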

5. Setting Up GitHub Environments, Secrets and Variables

To support multi-environment CI/CD (DEV, UAT, PROD), create three Environments in your GitHub repository by navigating to Settings → Environments → New Environment:

  • dev
  • uat
  • prod

Each environment will contain its own Snowflake-specific Secrets and Variables, such as:

Secret / Variable   Description / Example Value
SF_PASSWORD         Your_password
ENV                 dev / uat / prod
SF_ACCOUNT          xy12345.ap-southeast-1
SF_DATABASE         DEMO_DB_DEV / DEMO_DB_UAT / DEMO_DB_PROD
SF_ROLE             SYSADMIN
SF_USER             SFUSER29
SF_WAREHOUSE        COMPUTE_WH

GitHub Secrets and Variables

Note that SF_PASSWORD must be stored as a Secret, while the remaining entries can be stored as Variables (the workflow references them via secrets.SF_PASSWORD and vars.* respectively).

These secrets and variables are injected securely into the GitHub Actions workflow and are used at runtime to generate the temporary connections.toml file that schemachange requires for deploying changes.
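At runtime, the workflow writes a file shaped like the following. The values shown are placeholders matching the table above, assuming the dev environment:

```toml
# ./schemachange/connection/connections.toml — generated by the workflow, never committed
[dev]
account   = "xy12345.ap-southeast-1"
user      = "SFUSER29"
password  = "<injected from the SF_PASSWORD secret>"
role      = "SYSADMIN"
warehouse = "COMPUTE_WH"
database  = "DEMO_DB_DEV"
```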

6. GitHub Actions Workflow

To create a new GitHub Actions workflow, navigate to Actions → New workflow

This opens an editor where you can add your workflow definition in a file under the .github/workflows/ directory. When you save the file, GitHub automatically creates the required folder structure if it does not already exist.

Create your first workflow file named schemachange_dev.yml, paste the workflow configuration, and save it.

name: schemachange_dev
on:
  push:
    branches: [ "dev" ]
  workflow_dispatch:

jobs:
  deploy_Snowflake_Scripts:
    runs-on: ubuntu-latest
    environment: dev  

    env:
      ENV: ${{ vars.ENV }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install schemachange
        run: pip install schemachange==4.0.1

      - name: Create connections.toml
        run: |
          mkdir -p ./schemachange/connection
          cat <<EOF > ./schemachange/connection/connections.toml
          [${{ vars.ENV }}]
          account = "${{ vars.SF_ACCOUNT }}"
          user = "${{ vars.SF_USER }}"
          password = "${{ secrets.SF_PASSWORD }}"
          role = "${{ vars.SF_ROLE }}"
          warehouse = "${{ vars.SF_WAREHOUSE }}"
          database = "${{ vars.SF_DATABASE }}"
          EOF

      - name: Run schemachange
        run: |
          cd ./schemachange
          schemachange deploy --config-folder . --verbose

      - name: Clean up connections.toml
        if: always()
        run: rm -f ./schemachange/connection/connections.toml
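If you want to sanity-check the "Create connections.toml" step before wiring up GitHub, the same heredoc logic can be run locally. The exported variables below stand in for GitHub Secrets and Variables, and all values are placeholders:

```shell
# Local simulation of the workflow's connections.toml generation and clean-up.
# These exports mimic the Secrets/Variables injected by GitHub Actions.
export ENV=dev
export SF_ACCOUNT="xy12345.ap-southeast-1"
export SF_USER="SFUSER29"
export SF_PASSWORD="dummy_password"
export SF_ROLE="SYSADMIN"
export SF_WAREHOUSE="COMPUTE_WH"
export SF_DATABASE="DEMO_DB_DEV"

mkdir -p ./schemachange/connection
cat <<EOF > ./schemachange/connection/connections.toml
[$ENV]
account = "$SF_ACCOUNT"
user = "$SF_USER"
password = "$SF_PASSWORD"
role = "$SF_ROLE"
warehouse = "$SF_WAREHOUSE"
database = "$SF_DATABASE"
EOF

# Inspect the generated file, then remove it (mirrors the clean-up step)
cat ./schemachange/connection/connections.toml
rm -f ./schemachange/connection/connections.toml
```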

Repeat the same process to create two additional workflow files:

  • schemachange_uat.yml for UAT deployments
  • schemachange_prod.yml for PROD deployments

Each workflow will use environment-specific secrets and variables to deploy changes to the appropriate Snowflake environment.
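The three files differ only in their trigger branch and target environment. A schemachange_prod.yml might start like the fragment below; the branch-to-environment mapping is an assumption here, so use whatever branching model your team follows:

```yaml
name: schemachange_prod
on:
  push:
    branches: [ "main" ]    # prod deploys from the main branch
  workflow_dispatch:        # still allow manual runs

jobs:
  deploy_Snowflake_Scripts:
    runs-on: ubuntu-latest
    environment: prod       # picks up prod-specific Secrets and Variables
    # ...remaining steps identical to schemachange_dev.yml
```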

7. How the Workflow Works

This workflow forms the core of the automation pipeline. Below is a quick breakdown of how each part functions.

Workflow Configuration

  • name: Assigns a readable name to the workflow.
  • on: Defines when the workflow should run.
    • push → triggers automatically when changes are pushed to the dev branch (each environment's workflow listens to its own branch).
    • workflow_dispatch → allows manual execution from the GitHub Actions UI.
  • jobs.deploy_Snowflake_Scripts: The job that performs the actual deployment steps.
  • runs-on: Specifies the GitHub runner (Ubuntu machine).
  • environment: Determines the GitHub Environment (dev, uat, prod).

Workflow Steps

The steps section defines the sequence of actions GitHub performs to deploy Snowflake changes using schemachange:

  • Checkout repository
    Fetches the latest code, including SQL scripts and configuration files.
  • Set up Python
    Installs the required Python version on the GitHub-hosted runner.
  • Install schemachange
    Installs the schemachange package so it can be used during the workflow.
  • Create temporary connections.toml
    Generates the Snowflake connection file at runtime using GitHub Secrets and Variables.
    Credentials never exist in the repository; the file is created temporarily and discarded after the workflow completes.
  • Run schemachange deploy
    Executes schemachange with the generated connection details and configuration.
    This step applies all pending SQL scripts to the target Snowflake environment.
  • Clean up connections.toml
    Deletes the temporary connection file, even if the deployment step fails (if: always()).

8. Updating schemachange-config.yml for GitHub Automation

The schemachange-config.yml file must be updated to work seamlessly within the GitHub Actions pipeline.
Below is a configuration adapted for automated deployments:

config-version: 1

# Root folder for SQL change scripts
root-folder: './db_scripts'

# Folder for Jinja macros/templates (null if not used)
modules-folder: null

# Path to connections.toml created by the workflow
connections-file-path: './connection/connections.toml'

# Connection section in TOML (matches ENV)
connection-name: "{{ env_var('ENV') }}"

# Change history table (uses ENV)
change-history-table: "DEMO_DB_{{ env_var('ENV') }}.SCHEMACHANGE.CHANGE_HISTORY"

# Variables for SQL scripts
vars:
  env: "{{ env_var('ENV') }}"
  database: "DEMO_DB_{{ env_var('ENV') }}"

# Create change history table if not exists
create-change-history-table: true

# Autocommit for DML commands (disabled here)
autocommit: false

# Verbose logging
verbose: true

# Run actual changes
dry-run: false

# Query tag for tracking
query-tag: 'QUERY_TAG'

Key Points to Note

  • ENV is supplied by a GitHub environment Variable, making the configuration dynamic across dev, uat, and prod.
  • The workflow generates connections.toml at runtime, and the connections-file-path points to that temporary file.
  • Database names, change history tables, and variables are dynamically resolved using environment variables, allowing the same configuration file to work across all environments.
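To illustrate how the vars section is consumed, a versioned script such as V0_1__create_table.sql can reference those variables through Jinja placeholders (schemachange renders scripts with Jinja before execution). The table and column names below are hypothetical examples:

```sql
-- V0_1__create_table.sql
-- {{ database }} resolves to DEMO_DB_DEV / DEMO_DB_UAT / DEMO_DB_PROD per environment
CREATE TABLE IF NOT EXISTS {{ database }}.PUBLIC.CUSTOMERS (
    ID      NUMBER,
    NAME    VARCHAR,
    ENV_TAG VARCHAR DEFAULT '{{ env }}'
);
```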

9. End-to-End Deployment Flow

The complete CI/CD flow using schemachange and GitHub Actions works as follows:

  1. Developer commits a new SQL script to the repository.
  2. A Pull Request is created for code review.
  3. Reviewer approves the changes, ensuring controlled and auditable deployments.
  4. GitHub Actions workflow is triggered automatically (or manually for higher environments).
  5. GitHub generates a temporary connections.toml using environment-specific Secrets.
  6. schemachange deploys the changes to Snowflake, executing only pending scripts.
  7. The CHANGE_HISTORY table is updated, logging every executed script for traceability.
  8. Deployments to UAT/PROD require approval, providing an additional governance layer.
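Once a run completes, the audit trail can be inspected directly in Snowflake. The query below assumes the standard schemachange change-history columns (verify the exact column names against your schemachange version):

```sql
-- Most recent deployments first
SELECT VERSION, SCRIPT, STATUS, INSTALLED_BY, INSTALLED_ON
FROM DEMO_DB_DEV.SCHEMACHANGE.CHANGE_HISTORY
ORDER BY INSTALLED_ON DESC;
```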

10. Conclusion

By integrating schemachange with GitHub Actions, we’ve elevated the manual deployment process from Part 1 into a fully automated, reliable CI/CD pipeline. This approach delivers several key benefits:

  • Secure credential handling through GitHub Secrets
  • Controlled and safe deployments with environment-based approvals
  • Consistent DEV → UAT → PROD promotions using standardized workflows
  • Full traceability via the CHANGE_HISTORY table in Snowflake
  • Minimal manual effort, as deployments occur automatically after merging code
