I’ve recently started a new contract with a client whose goal is to get their database structure into Git. This is laudable! 👏 (You have your code in Git, right?!?)
At Omnitech we have a long-running Teams chat about the differences between state-based and migration-based versioning of databases (we are MS SQL-centric thinkers). I’ve compiled the feedback and created two small demos for my own learning to share with you.
Visual Studio SQL Server Data Tools (SSDT) Database Projects
Works with Git and Pipelines for CI/CD
May run into interesting deployment and development issues
A more familiar approach for Developers
Good for a simple, single database maintained by Developers
You can see the full picture of what the database will be after changes are applied (see the sketch below)
Free!
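To make the state-based idea concrete, here is a minimal sketch of the kind of object script a database project holds; the table and column names are invented for illustration:

```sql
-- dbo/Customer.sql (hypothetical) -- declares the desired end state of the table,
-- not the steps to get there. At publish time, SqlPackage diffs this definition
-- against the target database and generates the CREATE/ALTER statements for you.
CREATE TABLE dbo.Customer
(
    CustomerId INT IDENTITY(1, 1) NOT NULL,
    Name       NVARCHAR(200)      NOT NULL,
    CreatedUtc DATETIME2(0)       NOT NULL
        CONSTRAINT DF_Customer_CreatedUtc DEFAULT (SYSUTCDATETIME()),
    CONSTRAINT PK_Customer PRIMARY KEY CLUSTERED (CustomerId)
);
```

Widen Name to NVARCHAR(400) later and you simply edit this file; the ALTER TABLE is generated at deploy time.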
SDK-style SQL Database Projects (General Availability in March 2025)
# install SqlPackage CLI
dotnet tool install -g Microsoft.SqlPackage
# install Microsoft.Build.Sql.Templates
dotnet new install Microsoft.Build.Sql.Templates
dotnet new sqlproj -n SqlSdkExample
# Create the .dacpac
dotnet build
# Deploy
sqlpackage /Action:Publish /SourceFile:yourfile.dacpac /TargetConnectionString:{yourconnectionstring}
# Generate the Script
sqlpackage /Action:Script /SourceFile:bin\Debug\SqlSdkExample.dacpac /TargetConnectionString:"Server=(localdb)\MSSQLLocalDB;Initial Catalog=SDKStyle" /OutputPath:script
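SQL projects also support a pre-deployment and a post-deployment script that get compiled into the .dacpac and run around every publish, which is where data fixes usually live in a state-based workflow. A minimal sketch, assuming a hypothetical dbo.Status reference table and that the file is given the PostDeploy build action:

```sql
-- Script.PostDeployment.sql (hypothetical) -- runs after every publish,
-- so it must be written to be safe to re-run.
IF NOT EXISTS (SELECT 1 FROM dbo.Status WHERE StatusId = 1)
    INSERT INTO dbo.Status (StatusId, Name)
    VALUES (1, N'Active');
```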
Another option is to use Entity Framework Core with Migrations. Here is a good video explaining the process.
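Whichever migration tool you pick, the unit of work is an ordered change script rather than a declared end state. A minimal sketch of a hand-written (DbUp-style) migration; the file name and column are invented:

```sql
-- 0002_AddCustomerEmail.sql (hypothetical) -- one ordered migration.
-- The migration runner records which scripts have already been applied,
-- so each script runs exactly once per target database.
ALTER TABLE dbo.Customer
    ADD Email NVARCHAR(320) NULL;
GO

-- Data motion can ride along in the same migration, which a pure schema diff can't express.
UPDATE dbo.Customer
SET Email = N'unknown@example.com'
WHERE Email IS NULL;
```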
Which will you choose for your project?
# https://learn.microsoft.com/en-us/sql/tools/sql-database-projects/tutorials/create-deploy-sql-project?view=sql-server-ver17&pivots=sq1-visual-studio-sdk
trigger:
- main
pr:
- main
pool:
  vmImage: "ubuntu-latest"
variables:
  artifactName: "SqlSdkExample"
  ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
    isMain: "true"
  ${{ else }}:
    isMain: "false"
  sqlProj: "$(Build.SourcesDirectory)/Pipelines/SQL/SDK/SqlSdkExample/SqlSdkExample.sqlproj"
  sqlPackageVersion: "170.0.94"
  sqlPublishPac: "$(Build.ArtifactStagingDirectory)/${{ variables.artifactName }}/${{ variables.artifactName }}.dacpac"
stages:
- stage: Build
  jobs:
  - job: Build
    displayName: "Build"
    steps:
    - task: UseDotNet@2
      inputs:
        packageType: "sdk"
        version: "8.x"
    - task: NuGetAuthenticate@1
      displayName: "NuGet Authenticate"
    - script: dotnet build ${{ variables.sqlProj }} -o $(Build.ArtifactStagingDirectory)
      displayName: Build
    - task: PublishBuildArtifacts@1
      displayName: "Publish Build Artifacts"
      condition: and(succeeded(), ${{ eq(variables.isMain, 'true') }})
      inputs:
        PathtoPublish: "$(Build.ArtifactStagingDirectory)"
        artifactName: "${{ variables.artifactName }}"
- stage: PublishDev
  condition: and(succeeded(), ${{ eq(variables.isMain, 'true') }})
  dependsOn: Build
  variables:
  - group: Development-OnPremise
  jobs:
  - deployment: Deployment
    workspace:
      clean: all
    environment:
      name: 'Development-Database-OnPremise'
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: ${{ variables.artifactName }}
          - checkout: none
          - task: DownloadPipelineArtifact@2
            inputs:
              artifactName: ${{ variables.artifactName }}
              targetPath: $(Build.ArtifactStagingDirectory)/${{ variables.artifactName }}
          - script: dotnet tool install --global Microsoft.SqlPackage --version ${{ variables.sqlPackageVersion }}
            displayName: "Install specific version of SqlPackage"
          - script: |
              mkdir "$(Build.ArtifactStagingDirectory)/reports/"
            displayName: 'Create artifacts reports folder'
          # These commands could be used, but /Action:Publish creates the report and the script so we don't have to
          # The SqlPackage DeployReport action creates an XML report of the changes that would be made by a publish action.
          # - script: sqlpackage /Action:DeployReport /SourceFile:"${{ variables.sqlPublishPac }}" /TargetConnectionString:"Server=$(DatabaseServer);Initial Catalog=Example" /OutputPath:"$(Build.ArtifactStagingDirectory)\reports\deployReport.xml"
          #   displayName: 'Generate Deploy Report'
          # The SqlPackage DriftReport action creates an XML report of the schema changes that have been made to the registered database since it was last registered.
          # this step didn't pass in the build, and didn't output a file when run from a local CLI either
          # - script: sqlpackage /Action:DriftReport /TargetConnectionString:"Server=$(DatabaseServer);Initial Catalog=Example" /OutputPath:"$(Build.ArtifactStagingDirectory)\reports\driftReport.xml"
          #   displayName: 'Generate Drift Report'
          # The SqlPackage Script action creates a Transact-SQL script of the changes a publish would apply to the target.
          # - script: sqlpackage /Action:Script /SourceFile:"${{ variables.sqlPublishPac }}" /TargetConnectionString:"Server=$(DatabaseServer);Initial Catalog=Example" /OutputPath:"$(Build.ArtifactStagingDirectory)\reports\changeScript.sql"
          #   displayName: 'Generate Script'
          # $(DatabaseServer) is from the Variable Group
          # /Action:Publish creates the report and the script
          - script: sqlpackage /Action:Publish /SourceFile:"${{ variables.sqlPublishPac }}" /TargetConnectionString:"Server=$(DatabaseServer);Initial Catalog=Example" /DeployReportPath:"$(Build.ArtifactStagingDirectory)\reports\deployReport.xml" /DeployScriptPath:"$(Build.ArtifactStagingDirectory)\reports\changeScript.sql"
            displayName: "Publish"
          - task: PublishPipelineArtifact@1
            inputs:
              targetPath: "$(Build.ArtifactStagingDirectory)/reports/"
              artifact: "Diagnostic Files"
              publishLocation: "pipeline"
# sqlpackage /Action:Publish /SourceFile:bin/Debug/MyDatabaseProject.dacpac /TargetConnectionString:{yourconnectionstring}
I asked Microsoft Copilot to compare the two approaches, and it gave me a helpful breakdown.

| Criteria | DbUp (Migration-based) | SSDT/Dacpac (State-based) |
|---|---|---|
| Change Tracking | Explicit, file-based migrations (e.g., 001_AddTable.sql) | Implicit, based on schema diff |
| Version Control | Easy to track individual changes | Harder to trace specific changes |
| Deployment Order | Controlled via script order | Determined by schema comparison |
| Delayed Releases | Easy to cherry-pick and delay specific scripts | Difficult — requires managing schema drift |
| Rollback Strategy | Manual (write your own down scripts) | Not built-in, but can be scripted |
| Tooling Simplicity | Lightweight, flexible | Heavier tooling (SSDT, Visual Studio) |
| Cross-Repo Coordination | Needs naming conventions to avoid conflicts | Centralized schema avoids duplication |
| Linked Server Support | Works well with raw SQL | Often problematic or unsupported |
| SSIS Integration | Can be co-located with SSIS in same repo | Works well if DB project is in same solution |
| Environment Drift Detection | Manual | Built-in drift detection |
| Learning Curve | Lower for SQL developers | Higher due to tooling and concepts |
| Approval Workflow | PR-based, script reviewed directly | PR-based, but schema diff is abstracted |
| Production Safety | High control, but requires discipline | Safer with drift detection and validation |
| Variables/Templating | Supported (e.g., tokens in scripts) | Limited or complex to manage |

| Scenario | Recommended Approach | Why |
|---|---|---|
| Small, isolated DB changes tied to app features | DbUp | Easier to manage per-feature migrations |
| Large, centralized schema with multiple contributors | SSDT/Dacpac | Schema diffing helps avoid conflicts |
| Long-lived feature branches or delayed releases | DbUp | Cherry-picking scripts is easier |
| Projects with linked servers or raw SQL in SSIS | DbUp | More flexible and compatible |
| Projects already using SSDT successfully (e.g., BI team) | SSDT/Dacpac | Leverage existing tooling and experience |
| Need for drift detection or schema validation | SSDT/Dacpac | Built-in validation tools |
| Devs prefer SQL-first workflow | DbUp | More natural for SQL-centric teams |
Check out my Resources Page for referral links that would help me out.