Automating SQL Server Deployments Across Multiple Databases Using Python

When managing multiple SQL Server environments—such as development, staging, and production—running the same deployment script manually against each instance is not only repetitive but also risky. Copy-pasting scripts into SQL Server Management Studio, connecting to each server by hand, or juggling per-environment passwords all introduce unnecessary human error. A better approach is to automate the process using a Python script and a shared SQL file. This lets you define specific connection details for each environment and safely run the same deployment logic across all targets in a consistent, repeatable manner. Let us learn about Automating SQL Server Deployments Across Multiple Databases Using Python.


I recently implemented this exact solution for one of my clients as part of my Comprehensive Database Performance Health Check service. Each environment—dev, staging, and production—had its own credentials and target database, and the Python-based deployment automation helped the team eliminate manual steps and reduce risk. It not only saved time during each release cycle but also gave the team more confidence in the consistency and safety of their deployments.

What You’ll Achieve

  • Automate database deployments to multiple SQL Servers
  • Use per-server credentials for authentication
  • Maintain a single, versioned SQL script for updates
  • Execute idempotent deployment logic safely and repeatably

Let’s look at the pieces that make this work.

Step 1: The SQL Deployment Script

This script checks if a table exists, creates it if needed, and inserts or updates a record with a deployment message. Save the following as deployment_script.sql:

-- Create or update a table
IF NOT EXISTS (
    SELECT * FROM sys.tables WHERE name = 'TestDeploy'
)
BEGIN
    CREATE TABLE TestDeploy (
        ID INT PRIMARY KEY,
        Message NVARCHAR(200)
    );
END
-- Insert or update sample data
MERGE INTO TestDeploy AS target
USING (SELECT 1 AS ID, 'Deployed successfully!' AS Message) AS source
ON target.ID = source.ID
WHEN MATCHED THEN
    UPDATE SET target.Message = source.Message
WHEN NOT MATCHED THEN
    INSERT (ID, Message) VALUES (source.ID, source.Message);

This script is safe to run multiple times and won’t fail if the table or row already exists.
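The MERGE statement above behaves like a simple keyed upsert: update when the key exists, insert when it does not. As a rough in-memory analogy (illustrative Python only, not part of the deployment itself), the re-run safety looks like this:

```python
def upsert(table: dict, row_id: int, message: str) -> dict:
    # WHEN MATCHED -> UPDATE; WHEN NOT MATCHED -> INSERT
    table[row_id] = message
    return table

t = {}
upsert(t, 1, "Deployed successfully!")
upsert(t, 1, "Deployed successfully!")  # re-running leaves the same single row
```

However many times the operation runs, the end state is the same single row, which is exactly the property the SQL script relies on.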

Step 2: The Python Deployment Script

This Python script connects to each SQL Server instance using pyodbc and applies the SQL script. Each server uses its own credentials. Save this as multi.py:

import pyodbc
import time

# Configuration
sql_script_path = "deployment_script.sql"  # Path to your .sql file

sql_servers = [
    {
        "name": "SQLAuthority",
        "server": "localhost",
        "database": "SQLAuthority",
        "username": "sa",
        "password": "password"
    },
    {
        "name": "SQLAuthority2",
        "server": "192.168.1.1",
        "database": "SQLAuthority2",
        "username": "user",
        "password": "pass"
    },
    # Add more databases as needed
]

def run_script_on_server(config, script):
    print(f"\nDeploying to {config['name']}...")

    conn_str = (
        f"DRIVER={{ODBC Driver 17 for SQL Server}};"
        f"SERVER={config['server']};"
        f"DATABASE={config['database']};"
        f"UID={config['username']};"
        f"PWD={config['password']};"
    )

    try:
        with pyodbc.connect(conn_str, autocommit=False) as conn:
            cursor = conn.cursor()
            cursor.execute(script)  # Run full script as a batch
            conn.commit()
        print(f"Deployment successful for {config['name']}")
    except Exception as e:
        print(f"Deployment failed for {config['name']}: {e}")

def load_sql_script(path):
    with open(path, "r", encoding="utf-8") as f:
        return f.read()

if __name__ == "__main__":
    print("Starting deployment using per-server SQL credentials...")

    sql_script = load_sql_script(sql_script_path)

    for config in sql_servers:
        run_script_on_server(config, sql_script)
        time.sleep(1)

    print("\nAll deployments complete.")
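One caveat worth noting: GO is a batch separator understood by client tools like SSMS and sqlcmd, not by SQL Server itself, so pyodbc cannot execute a script containing GO in a single cursor.execute call. The deployment_script.sql above has no GO lines, so it runs as one batch. If your scripts do use GO, a minimal sketch of a splitting helper (a hypothetical addition, not part of the script above) could look like this:

```python
import re

def split_batches(script: str) -> list:
    # GO must appear alone on its own line, which is how SSMS and
    # sqlcmd treat it as a batch separator (case-insensitive).
    batches = re.split(r"^\s*GO\s*$", script, flags=re.IGNORECASE | re.MULTILINE)
    # Drop empty fragments left over after splitting
    return [b.strip() for b in batches if b.strip()]
```

Each returned batch would then be passed to cursor.execute in turn inside run_script_on_server.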

How to Use It

  1. Install the required Python package:
    pip install pyodbc
  2. Ensure the ODBC Driver 17 for SQL Server is installed on your system.
  3. Place both multi.py and deployment_script.sql in the same directory.
  4. Update the sql_servers list with the connection details for each target database.
  5. Run the script:
    python multi.py
  6. The script will read the SQL file once and deploy it to each listed server in sequence.

Why This Setup Works Well

  • Centralized Logic: One SQL file drives all deployments. Easy to version and review.
  • Per-Server Credentials: No assumption of shared passwords or roles.
  • Safe to Re-run: The SQL script is written to be idempotent.
  • Expandable: Easily adapt for logging, notifications, or integration into a CI/CD pipeline.
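As an example of that expandability, per-server results could be written to a log file instead of just printed. A minimal sketch using the standard logging module (the log_result helper and deployment.log filename are assumptions, not part of the original script):

```python
import logging

logging.basicConfig(
    filename="deployment.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

def log_result(server_name: str, success: bool, detail: str = "") -> str:
    # Record one line per server so each release leaves an audit trail
    status = "SUCCESS" if success else "FAILURE"
    message = f"{server_name}: {status} {detail}".strip()
    logging.info(message)
    return message
```

Calling log_result from the try/except blocks in run_script_on_server would capture the same messages the script currently prints.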


Closing Thoughts

This approach provides a clean, reproducible way to push consistent changes across multiple SQL Server databases. Whether you’re rolling out schema updates, seeding data, or applying patches, the setup scales with you.

You can further enhance it by reading server configs from a JSON or YAML file, logging deployment results, or tagging each run with a version or timestamp.
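For instance, moving the sql_servers list into a JSON file keeps credentials out of the script itself. A small sketch, assuming a hypothetical servers.json whose entries mirror the dicts in the sql_servers list:

```python
import json

def load_server_configs(path: str) -> list:
    # Each JSON object carries the same keys as the inline dicts:
    # name, server, database, username, password
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)
```

The main loop would then become `for config in load_server_configs("servers.json"):` with no other changes.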

Use this as a starter template to bring more consistency, safety, and speed to your database operations. Connect with me on Twitter.

Reference: Pinal Dave (https://blog.sqlauthority.com)
