Script and Alter SQL Server Database Objects with PowerShell to Correct Collation Issues

By: Diana Moldovan   |   Updated: 2015-04-14


Problem

Recently one of our customers went through an Infor ERPLN upgrade from version 10.3.1 (FP9) to version 10.4 (FP10) that went terribly wrong. The upgrade took place on a development server that hosts multiple versions of the software side by side and involved only the component called Enterprise Server ("Tools"), which provides the backend of the application and the proprietary software development environment. The application uses SQL Server 2012 SP1 as its database backend. After the upgrade, the administrative portal of the application became inaccessible due to SQL Server collation issues at the column level.

Solution

A quick look at the Windows Application log revealed many database-related error messages about collation conflicts between columns involved in joins: "Cannot resolve the collation conflict between "Latin1_General_BIN2" and "Latin1_General_BIN" in the equal to operation". The SQL Server default collation according to the Infor ERPLN installation requirements is Latin1_General_BIN. Following the upgrade, the collation of several columns had been changed to Latin1_General_100_BIN2. The first step I took was to find out how many tables and columns were affected. The two scripts below return the same result (the tables affected by the collation change), one using the [INFORMATION_SCHEMA].[COLUMNS] view and the other querying the system catalog views:

-- tables affected

SELECT DISTINCT TABLE_NAME
FROM [db_name].[INFORMATION_SCHEMA].[COLUMNS]
WHERE COLLATION_NAME = 'Latin1_General_100_BIN2'

SELECT DISTINCT o.name AS table_name
FROM [db_name].[sys].[objects] o 
JOIN [db_name].[sys].[columns] c ON o.object_id = c.object_id
WHERE o.type = 'U' AND c.collation_name = 'Latin1_General_100_BIN2'

Some outstanding SQL Server professionals believe that the INFORMATION_SCHEMA views should not be used anymore. However, in this case the result sets are identical. If you check the code underlying [INFORMATION_SCHEMA].[COLUMNS] (with sp_helptext), you'll see that the view uses the same join between sys.objects and sys.columns on object_id. With SET STATISTICS IO ON you'll see that for both scripts SQL Server performs the same number of scans and logical reads against sys.sysschobjs and sys.syscolpars.
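If you want to run this comparison without leaving PowerShell, a minimal sketch along these lines should work; 'MyServer' and 'db_name' are placeholders for your own instance and database, and the STATISTICS IO figures arrive as informational messages, which Invoke-Sqlcmd writes to the verbose stream when -Verbose is specified:

# Sketch: compare the I/O of the two queries from PowerShell.
# 'MyServer' and 'db_name' are placeholders - use your own instance and database.
Import-Module SQLPS -DisableNameChecking

$infoSchemaQuery = @"
SET STATISTICS IO ON;
SELECT DISTINCT TABLE_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLLATION_NAME = 'Latin1_General_100_BIN2';
"@

$catalogQuery = @"
SET STATISTICS IO ON;
SELECT DISTINCT o.name AS table_name
FROM sys.objects o JOIN sys.columns c ON o.object_id = c.object_id
WHERE o.type = 'U' AND c.collation_name = 'Latin1_General_100_BIN2';
"@

# -Verbose surfaces the informational messages, including the STATISTICS IO figures
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'db_name' -Query $infoSchemaQuery -Verbose | Out-Null
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'db_name' -Query $catalogQuery -Verbose | Out-Null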

To find out how many columns ended up with changed collation, I've run:

-- columns affected

SELECT COLUMN_NAME
FROM [db_name].[INFORMATION_SCHEMA].[COLUMNS]
WHERE COLLATION_NAME = 'Latin1_General_100_BIN2'

SELECT c.name AS column_name
FROM [db_name].[sys].columns c
WHERE c.collation_name = 'Latin1_General_100_BIN2'

This time the script using [INFORMATION_SCHEMA].[COLUMNS] is more "expensive", since behind the scenes it uses a join between sys.objects and sys.columns.

Regardless of the script used, the result was the same: 4703 columns from 715 tables ended up with a collation different from the default database collation. For each "damaged" column, all I had to do was run:

ALTER TABLE <table name> ALTER COLUMN <column name> <data type> COLLATE Latin1_General_BIN

Well, I was a little bit wrong. Because most of the affected columns were part of index definitions, the result was an error message:

Msg 5074, Level 16, State 1, Line 1
The index '<index name>' is dependent on column '<column name>'.
Msg 4922, Level 16, State 9, Line 1
ALTER TABLE ALTER COLUMN <column name> failed because one or more objects access this column.

Since manually dropping and re-creating indexes and constraints for hundreds of tables and "fixing" thousands of columns was not an option, I came up with a PowerShell script that uses both SMO and Invoke-Sqlcmd.

    try{
    $ErrorActionPreference = "Stop"
  
    [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null 
    Import-Module SQLPS -DisableNameChecking
    
    $loc = "C:\PowershellScripts"
    Set-Location $loc;
    #location of the output files
    $alter_column_file = "alter_column_statements.sql"
    $create_index_file = "create_index.sql"
    $drop_index_file = "drop_index.sql"
    $table_source = "bad_tables.txt"
    #location of the input files - TSQL scripts I'm using to obtain a list of the tables and columns affected
    $bad_tables_tsql = "bad_tables.sql"
    $alter_column_tsql = "alter_column.sql"
    
    #initial content of the output files
    Set-Content $alter_column_file "--alter column"
    Set-Content $create_index_file "--create index"
    Set-Content $drop_index_file "--drop index"
    Set-Content $table_source ""
    
    #setting up the SQL context
    $server = new-object ('Microsoft.SqlServer.Management.Smo.Server') 'MyServer' 
    if($server.Databases.Count -le 0){Write-Host -ForegroundColor Yellow "There are no databases. Check the server name."; Exit}
    $database = $server.Databases["db_name"]
    if($database.Tables.Count -eq 0){Write-host -ForegroundColor Yellow "There are no tables in this db. Check the db name"; Exit}
   
    #scripter object I'll use later to script the indexes for each table
    $scrp = New-Object Microsoft.SqlServer.Management.Smo.Scripter($server)
                
    $sb_create_index = New-Object System.Text.StringBuilder
    $sb_drop_index = New-Object System.Text.StringBuilder
    
    #column list
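    #note: Invoke-Sqlcmd is called without -ServerInstance/-Database here, so it
    #targets the default local instance; add those parameters for a named or remote instance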
    $sqlcheck = Invoke-sqlcmd -InputFile $alter_column_tsql
    if($sqlcheck -eq $null){Write-Host -ForegroundColor Green "There are no columns to alter"; Exit}
    $sqlcheck | Out-File $alter_column_file
    #tables list
    Invoke-Sqlcmd -InputFile $bad_tables_tsql | Out-File -filePath $table_source
    
    #scripting the indexes for the "damaged" tables
    foreach($table in Get-Content $table_source | Select-Object -Skip 3){
        $currenttable = $database.Tables.Item($table.Trim())
        #Write-Host $currenttable
       foreach($index in $currenttable.Indexes){
           #drop indexes and constraints
           $scrp.Options.IncludeIfNotExists=$True
           $scrp.Options.ScriptDrops = $True
           foreach ($sd in $scrp.Script($index)){
                $sb_drop_index.Append($sd + "`r`n GO `r`n") | Out-Null
            }
           #create indexes and constraints again
           $scrp.Options.IncludeIfNotExists=$True
           $scrp.Options.ScriptDrops = $False
           foreach ($sc in $scrp.Script($index)){
                $sb_create_index.Append($sc + "`r`n GO `r`n") | Out-Null
            }
        }
    }
    
    $sb_create_index.ToString() | Add-Content $create_index_file
    $sb_drop_index.ToString() | Add-Content $drop_index_file

}
catch{
    Write-Host -ForegroundColor Magenta $Error[0].Exception
    #walk the chain of inner exceptions, if any
    $inner = $Error[0].Exception.InnerException
    while($inner){
        Write-Host $inner.Message
        $inner = $inner.InnerException
    }
}
 

First I'm using two T-SQL scripts to obtain a list of the "damaged" tables ($bad_tables_tsql) and columns ($alter_column_tsql). I've discussed the scripts earlier; it's OK to use either the INFORMATION_SCHEMA views or the system catalog views, since both versions return the expected result. The second script generates the ALTER TABLE - ALTER COLUMN statements that I'll write to one of the output files. Here are the scripts:

-- tables
SELECT DISTINCT TABLE_NAME
FROM [db_name].[INFORMATION_SCHEMA].[COLUMNS]
WHERE COLLATION_NAME = 'Latin1_General_100_BIN2'

-- columns
-- It's mandatory to specify the column data type.
-- The ERPLN database does not use nullable columns; therefore I've specified NOT NULL in each statement to preserve the column's nullability.
SELECT
'ALTER TABLE ' + TABLE_NAME + ' ALTER COLUMN ' + COLUMN_NAME + ' ' + DATA_TYPE  +'(' + CAST(CHARACTER_MAXIMUM_LENGTH AS varchar(5)) + ')' + ' COLLATE Latin1_General_BIN' + ' NOT NULL'
FROM [db_name].[INFORMATION_SCHEMA].[COLUMNS]
WHERE COLLATION_NAME = 'Latin1_General_100_BIN2'

I'm calling these scripts with Invoke-Sqlcmd, storing the list of affected tables in a text file ($table_source) and the ALTER TABLE statements in a .sql file ($alter_column_file). Below you can see a sample of each file (a small variation that avoids the header rows is sketched after the samples):

-- $table_source - bad_tables.txt
-- notice the header; that's the reason for the Select-Object -Skip 3 condition
TABLE_NAME                                                                                                                                                                  
----------                                                                                                                                                                  
ttlanl100000                                                                                                                                                                
ttlanl103000                                                                                                                                                                
ttlbct350000
.
.
.

--$alter_column_file - alter_column_statements.sql
Column1                                                                                                                                                                     
-------                                                                                                                                                                     
ALTER TABLE tttspt101000 ALTER COLUMN t_user varchar(12) COLLATE Latin1_General_BIN NOT NULL                                                                               
ALTER TABLE tttspt101000 ALTER COLUMN t_pacc varchar(8) COLLATE Latin1_General_BIN NOT NULL                                                                                
ALTER TABLE tttspt101000 ALTER COLUMN t_comp varchar(10) COLLATE Latin1_General_BIN NOT NULL 
.
.
.
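
As a side note, the header and separator rows (and therefore the Select-Object -Skip 3 trick used later) could be avoided altogether by expanding the single column each query returns before writing the files. A possible variation, assuming the same variable names as in the script above and the Column1 name that Invoke-Sqlcmd assigns to the unnamed computed column:

#tables list without the header rows - Select-Object -Skip 3 would no longer be needed
Invoke-Sqlcmd -InputFile $bad_tables_tsql |
    Select-Object -ExpandProperty TABLE_NAME |
    Set-Content $table_source

#ALTER TABLE statements as plain text, ready to be executed as-is
Invoke-Sqlcmd -InputFile $alter_column_tsql |
    Select-Object -ExpandProperty Column1 |
    Set-Content $alter_column_file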

Next I'm iterating through the tables listed in bad_tables.txt and, for each table and each index, I'm generating the "drop" and "create" statements. This version of the ERPLN database uses only primary key constraints; otherwise, besides indexes, I would have had to script other constraints such as defaults and foreign keys, as sketched below.
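For reference, here is roughly how the inner loop could be extended if the database did use foreign keys. This is only a sketch that reuses the $scrp, $currenttable and StringBuilder objects from the script above; default constraints could be handled in a similar way through each column's DefaultConstraint property.

#sketch: also script the foreign keys of the current table, first the drops, then the creates
foreach($fk in $currenttable.ForeignKeys){
    $scrp.Options.ScriptDrops = $True
    foreach ($sd in $scrp.Script($fk)){
        $sb_drop_index.Append($sd + "`r`n GO `r`n") | Out-Null
    }
    $scrp.Options.ScriptDrops = $False
    foreach ($sc in $scrp.Script($fk)){
        $sb_create_index.Append($sc + "`r`n GO `r`n") | Out-Null
    }
}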

I'm using a Scripter object and its Script method to generate the statements for each index. The Scripter object offers a wealth of helpful options - for example, here I'm first setting the ScriptDrops option to "true" to generate the "drop index" statements and afterwards to "false" to generate the "create index" script. I've run the output T-SQL scripts (alter_column_statements.sql, drop_index.sql, create_index.sql) from within SSMS, but they can be run from within PowerShell as well, as shown below.
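A minimal sketch of running them from PowerShell, assuming the header rows written by Out-File have been stripped from alter_column_statements.sql (or that the file was produced with the -ExpandProperty variation shown earlier) and substituting your own server and database names:

#sketch: execute the generated scripts in the right order -
#drop the indexes, alter the columns, then re-create the indexes
#'MyServer' and 'db_name' are placeholders
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'db_name' -InputFile "$loc\drop_index.sql"
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'db_name' -InputFile "$loc\alter_column_statements.sql"
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'db_name' -InputFile "$loc\create_index.sql"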

Finally, please notice a few performance-related tricks:

  • Instead of iterating through all of the database's user tables and performing the operations I need (drop indexes, alter columns, create indexes), I chose to first obtain a list of "damaged" tables via a T-SQL script and afterwards generate the "drop" and "create" statements for the indexes based on this list (bad_tables.txt). I've worked on a database with ~100,000 user tables, all located in the "dbo" schema, and iterating through all of them would have taken hours. Using these intermediate files, the script does its job in ~10 minutes in my development environment (Windows 8.1 Pro x64, i5-3317U (2 cores) 1.7 GHz, 4 GB RAM). I got the table and column lists in a few seconds through the T-SQL scripts, and most of the work consisted of iterating through the 715 tables and generating the "drop" and "create" statements.
  • I'm storing each string of the collection returned by the Script method in a string builder ($sb_create_index, $sb_drop_index), which I finally write to a file on disk, instead of appending to the output file on every iteration (see the sketch after this list).
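
Here is the sketch mentioned in the last bullet, a simplified illustration of the two approaches; $statements stands in for the collection of strings returned by the Script method:

#option 1: append to the file on every iteration - simple, but the file is
#re-opened for every statement, which adds up over hundreds of indexes
foreach($stmt in $statements){
    Add-Content $drop_index_file ($stmt + "`r`n GO")
}

#option 2: buffer everything in a StringBuilder and write once at the end -
#this is what the script above does
$sb = New-Object System.Text.StringBuilder
foreach($stmt in $statements){
    $sb.Append($stmt + "`r`n GO `r`n") | Out-Null
}
$sb.ToString() | Add-Content $drop_index_file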
Next Steps


About the author
Diana Moldovan is a DBA and data centric applications developer with 6 years of experience covering SQL 2000, SQL 2005 and SQL 2008.



