SQL SERVER – How to Change Compatibility of Database to SQL Server 2014

Yesterday I wrote about how to install SQL Server 2014. Right after the blog post went live, I received a question from a developer: he had installed SQL Server 2014 and attached a database file from a previous version of SQL Server, but right after attaching the database he was not able to use the latest Cardinality Estimation features. As soon as I read his email I realized what had happened. When he attached the database, its compatibility level remained that of the earlier version of SQL Server. To use most of the latest features of SQL Server 2014, you have to change the compatibility level of the database to the latest version (i.e. 120).
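If you are not sure what an attached database is currently set to, a quick query against sys.databases shows its compatibility level (the database name below is just an example):

SELECT [name], compatibility_level
FROM sys.databases
WHERE [name] = N'AdventureWorks2012'
GO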

Here are two different ways to change the compatibility level of a database to SQL Server 2014’s version (120).

1) Using Management Studio

For this method, go to the database in Object Explorer, right-click it, and select Properties.

On the Options page of the Database Properties dialog, you can change the compatibility level to 120.

2) Using T-SQL Script

You can execute the following script to change the compatibility level to 120.

USE [master]
GO
ALTER DATABASE [AdventureWorks2012] SET COMPATIBILITY_LEVEL = 120
GO

Well, it is that easy :-)

Reference: Pinal Dave (https://blog.sqlauthority.com)


15 Comments

  • jamessqlanderson
    May 7, 2014 4:45 pm

    Hi Pinal

    Whenever doing this it is a good idea to update the statistics for the indexes. I use the following script:

    DECLARE @SQL NVARCHAR(MAX), @DBName NVARCHAR(MAX)
    DECLARE @DBs TABLE
    (DBName NVARCHAR(100))
    INSERT INTO @DBs
    SELECT [name] FROM sys.databases
    WHERE database_id > 4 AND compatibility_level < 110 -- the comparison operator was lost in the original comment; "< 110" is assumed
    -- AND [name] = '' -- the original also filtered on [name]; fill in a database name here or leave this line out

    WHILE EXISTS(SELECT * FROM @DBs)
    BEGIN
    SELECT TOP 1 @DBName = DBName FROM @DBs

    SET @SQL = 'ALTER DATABASE [' + @DBName + '] SET COMPATIBILITY_LEVEL = 110;
    USE [' + @DBName + ']
    DECLARE @SQL NVARCHAR(MAX) = N'''';

    DECLARE @Tables TABLE
    ([Schema] NVARCHAR(50)
    ,[TableName] NVARCHAR(100))

    INSERT INTO @Tables
    SELECT QUOTENAME(SCHEMA_NAME(schema_id)), QUOTENAME(name)
    FROM sys.tables;

    DECLARE @Schema NVARCHAR(50), @TableName NVARCHAR(100)

    WHILE EXISTS(SELECT * FROM @Tables)
    BEGIN
    SELECT TOP 1 @Schema = [Schema], @TableName = [TableName] FROM @Tables
    SET @SQL = ''UPDATE STATISTICS '' + @Schema + ''.'' + @TableName + '' WITH FULLSCAN;''

    BEGIN TRY
    EXEC sp_executesql @SQL
    PRINT ''Completed: '' + @SQL
    END TRY
    BEGIN CATCH
    DECLARE @ErrMsg NVARCHAR(4000)
    SELECT @ErrMsg = SUBSTRING(ERROR_MESSAGE(), 0, 900)

    SELECT GETDATE(), ''Failed updating stats on '' + @Schema + '' '' + @TableName + ''. Error: '' + @ErrMsg
    END CATCH

    DELETE FROM @Tables WHERE [Schema] = @Schema AND [TableName] = @TableName
    END'
    BEGIN TRY
    EXEC sp_executesql @SQL
    PRINT 'Upgraded ' + @DBName
    END TRY
    BEGIN CATCH
    DECLARE @ErrMsg NVARCHAR(4000), @ErrSeverity INT
    SELECT @ErrMsg = SUBSTRING(ERROR_MESSAGE(), 0, 900)
    END CATCH

    DELETE FROM @DBs WHERE DBName = @DBName
    END

    Reply
  • Under what circumstances would SQL Server automatically change the compatibility level?
    I am currently working on SQL Server 2008 and there is a user database which has a compatibility level of 80. Apparently this database was moved to this server before my time, and the compatibility level was never changed/upgraded. When I execute the following script, it is set successfully to 100; however, the next day it goes back to 80. Any ideas?

    USE [master]
    GO
    ALTER DATABASE [MyDB]
    SET COMPATIBILITY_LEVEL = 100
    GO

    There is only a SQL maintenance plan on this instance, which performs integrity checks, full database backups / transaction log backups, and index rebuild/update statistics.

    Reply
    • SQL Server doesn’t do that automatically. There must be some automated activity doing it. I would suggest checking the ERRORLOG:

      https://blog.sqlauthority.com/2015/03/24/sql-server-where-is-errorlog-various-ways-to-find-its-location/
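
      If you prefer to search the ERRORLOG from T-SQL instead of opening the files, something along these lines works (a sketch using the undocumented xp_readerrorlog procedure; the search string is an assumption):

      -- Search the current error log (file 0, type 1 = SQL Server log) for compatibility level changes
      EXEC master.dbo.xp_readerrorlog 0, 1, N'COMPATIBILITY_LEVEL'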

      Reply
      • Hi Pinal, thanks, but the ERRORLOG did not reveal anything relating to the compatibility level; that is what I searched for while looking through the logs. Is there something in particular that would show up there to indicate that the compatibility level changed for some DB?

      • I take that back. I was able to find a trace in the ERRORLOG as you suggested, but it only has the spid and no login information.

        Line 302340: 2015-04-15 22:37:38.28 spid92 Setting database option COMPATIBILITY_LEVEL to 80 for database

        What table would I put my trace on? And what filter can I enable to see who / what process is executing this query?

  • So how do I check what “automated activity” is modifying this value?

    Reply
    • Is this happening at a particular time in the ERRORLOG, like between 10 PM and 11 PM? The ERRORLOG would not show “who” is doing it.
      You can create the DDL trigger below to get more information into the ERRORLOG:

      IF EXISTS (SELECT * FROM sys.server_triggers
      WHERE name = 'ddl_trig_database')
      DROP TRIGGER ddl_trig_database
      ON ALL SERVER;
      GO
      CREATE TRIGGER ddl_trig_database
      ON ALL SERVER
      FOR ALTER_DATABASE
      AS
      DECLARE @cmd VARCHAR(200)
      DECLARE @who VARCHAR(200)
      DECLARE @fromwhere VARCHAR(200)
      SELECT @cmd = EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)')
      SELECT @who = EVENTDATA().value('(/EVENT_INSTANCE/LoginName)[1]','nvarchar(max)')
      SELECT @fromwhere = EVENTDATA().value('(/EVENT_INSTANCE/ServerName)[1]','nvarchar(max)')

      RAISERROR (N'!!!Database ALTERED!!!
      Command – %s
      By – %s
      From – %s.', -- Message text.
      10,
      1,
      @cmd,
      @who,
      @fromwhere) WITH LOG
      GO

      /*
      DROP TRIGGER ddl_trig_database
      ON ALL SERVER;
      GO
      */

      Reply
  • I have a huge DB with lots of tables, stored procedures and functions. I want to change the compatibility level, but I’m afraid something might stop working. Is there an easy way to know if everything will work as expected? Or what do I have to change in order for it to work properly?

    Reply
  • Hi,
    I need help! I have an .mdf file from SQL Server 2008 R2 and I need to attach it to a SQL Server 2014 instance. I tried emergency mode, single-user mode and DBCC, but at that step I get the message: Cannot open database ‘MyBD’ version 661. Upgrade the database to the latest version.

    Reply
  • So, we’ve got really bad performance in our database after changing the compatibility level to 120. Many queries do not run as fast as they did before the change. We don’t know the reason; maybe it is a bug or something like that. We didn’t find any information about performance problems after changing the compatibility level. Maybe we hit problems with existing queries because of the changes to the cardinality estimator in SQL Server 2014.

    Reply
    • Hi Jaho,

      I understand your point. I have seen this quite a lot in my consulting work.

      You can follow a systematic process to eliminate the slowness, and ultimately you will end up with amazing performance.

      Reply
  • Hi Pinal, but we still don’t know the reason for the poor performance after the change. We have a lot of databases on the server, with a lot of data and database objects. Could it be connected with old query execution plans that were created under the previous compatibility level and need to be recompiled or recreated? Have you got any clue what the reason for this strange behaviour is? After making the change, processes were using all of our server resources.

    Reply
    • Hi Jaho,

    • There can be many different things: indexes, tempdb issues, IO distribution, or resource consumption that has changed because of the upgrade, along with the new cardinality estimator. It is very common and I see this at many different customers. It usually takes a few changes to the SQL Server configuration, and the performance comes back.

      I will not be able to cover each of them here, as it requires an inspection of your system, but do take a deeper look at your system in the areas listed above.
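
      If the regressions turn out to come from the new cardinality estimator, one documented option (a general sketch, not something specific to your system) is to keep the database at compatibility level 120 but run an affected query under the legacy estimator with trace flag 9481, for example:

      -- The table and filter below are illustrative only; QUERYTRACEON requires sysadmin membership
      SELECT soh.SalesOrderID, soh.OrderDate
      FROM Sales.SalesOrderHeader AS soh
      WHERE soh.OrderDate >= '20140101'
      OPTION (QUERYTRACEON 9481)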

      Reply
  • David, do I have to change the master DB compatibility level to 120? We recently upgraded SQL Server from 2008 to 2014. All the other system databases (model, msdb, tempdb) were set to 120 by default, but the master DB is set to 100.
    Thank you

    Reply
