SQL SERVER – FIX – Error: 905, Severity: 21, State: 1 – Database ‘xxx’ cannot be started in this edition of SQL Server because it contains a partition function ‘xxx’.

Recently one of my blog readers contacted me via email and told me that after upgrading from Enterprise Evaluation Edition to Standard Edition, the database was not accessible. To troubleshoot, I asked them to share the SQL Server ERRORLOG. There are various ways to read the ERRORLOG. Here is one method:

  • In Object Explorer, expand a server, expand Management, and then expand SQL Server Logs.
  • Right-click a log and click View SQL Server Log.

Other ways to get Errorlog are explained in detail by Balmukund (b|t) on his blog (Help : Where is SQL Server ErrorLog?)
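
If you prefer T-SQL, a quick way is to search the current ERRORLOG directly with sp_readerrorlog (the filter string below is only an example):

EXEC sp_readerrorlog 0, 1, 'Error: 905'
GO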

I asked him to look in the ERRORLOG for any errors related to the database which was not accessible, and soon they shared the below messages:

2015-02-20 13:47:36.65 spid7s      Error: 905, Severity: 21, State: 1.
2015-02-20 13:47:36.65 spid7s      Database ‘My_Database’ cannot be started in this edition of SQL Server because it contains a partition function ‘myRangePF1’. Only Enterprise edition of SQL Server supports partitioning.
2015-02-20 13:47:36.65 spid7s      Error: 933, Severity: 21, State: 1.
2015-02-20 13:47:36.65 spid7s      Database ‘My_Database’ cannot be started because some of the database functionality is not available in the current edition of SQL Server.

Explanation

From the error we can see that the database uses a partition function, which is an Enterprise-only feature. The database contains one or more partitioned tables or indexes, and Standard edition of SQL Server cannot use partitioning; therefore the database cannot be started correctly. Partitioned tables and indexes are not available in every edition of Microsoft SQL Server. The Books Online topic “Features Supported by the Editions of SQL Server 2012” has the complete edition comparison and feature list.

Solution

I asked him to attach the database MDF and LDF files to another instance which is either Developer, Enterprise, or Enterprise Evaluation edition, and then get rid of the Enterprise-only features. They can easily be found using the DMV sys.dm_db_persisted_sku_features:

USE <DatabaseName> -- Change the name to the database where we need to check
GO
SELECT *
FROM sys.dm_db_persisted_sku_features

Note that the query has to be run in the database in question, not the master database. Here are the possible values on SQL Server 2014:

  • TransparentDataEncryption
  • Compression
  • ChangeCapture
  • ColumnStoreIndex
  • InMemoryOLTP
  • Partitioning

Once the objects referring to those features are removed, a backup can be taken and restored on a “lower” edition of SQL Server.
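
For partitioning specifically, a minimal sketch could look like the following (the table, index, scheme, and function names below are hypothetical; use the objects your own DMV query reports):

-- Find indexes (and therefore tables) that live on a partition scheme
SELECT OBJECT_SCHEMA_NAME(i.[object_id]) + '.' + OBJECT_NAME(i.[object_id]) AS TableName,
       i.name AS IndexName,
       ps.name AS PartitionScheme
FROM sys.indexes i
INNER JOIN sys.partition_schemes ps ON i.data_space_id = ps.data_space_id
GO
-- Rebuild the clustered index onto a regular filegroup, which moves the data
-- off the partition scheme (hypothetical table and index names)
CREATE CLUSTERED INDEX CIX_MyPartitionedTable
ON dbo.MyPartitionedTable (id)
WITH (DROP_EXISTING = ON)
ON [PRIMARY]
GO
-- Once nothing references them, drop the partition scheme and function
DROP PARTITION SCHEME myRangePS1
DROP PARTITION FUNCTION myRangePF1
GO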

Here is another error which can be received while restoring a database that uses data compression:

Msg 909, Level 21, State 1, Line 1
Database ‘SqlAuthority’ cannot be started in this edition of SQL Server because part or all of object ‘PageCompressionTest’ is enabled with data compression or vardecimal storage format. Data compression and vardecimal storage format are only supported on SQL Server Enterprise Edition.

I always advise running the DMV and checking the features whenever there is a downgrade of edition. Even though Evaluation to Standard is technically an upgrade, I would call it a downgrade because a few features of Enterprise edition are not available in Standard edition.

Have you ever encountered any downgrade scenarios and got some errors?

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Finding Top Offenders in SQL Server 2012 – Notes from the Field #068

[Note from Pinal]: This is the 68th episode of the Notes from the Field series. Performance tuning gets easier with SQL Server 2012. For the first time, you can capture a workload and find your top offenders inside of Management Studio without having to write a single line of T-SQL or use any third party tools.

In this episode of the Notes from the Field series, database expert John Sterrett (Group Principal at Linchpin People) explains a very common issue DBAs and Developers face in their career – how to find top offenders in SQL Server. Linchpin People are database coaches and wellness experts for a data driven world. Read the experience of John in his own words.


One of the biggest mistakes with performance tuning is not knowing where to start. Sometimes performance tuning can be like finding a needle in a haystack, because you might not have a great process to help you find your top offenders. In the short five-minute video below you will see how to utilize extended events to aggregate and group completed statements so you can find your top offenders without having to write any T-SQL code.
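
If you do want a scripted starting point, here is a minimal sketch of such an Extended Events session (the session name and file target are assumptions); the aggregation and grouping can then be done in SSMS via Watch Live Data, as shown in the video:

-- Capture completed statements along with their text and database name
CREATE EVENT SESSION [TopOffenders] ON SERVER
ADD EVENT sqlserver.sql_statement_completed
(
    ACTION (sqlserver.sql_text, sqlserver.database_name)
)
ADD TARGET package0.event_file (SET filename = N'TopOffenders.xel')
WITH (EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS)
GO
ALTER EVENT SESSION [TopOffenders] ON SERVER STATE = START
GO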

Are your servers running at optimal speed or are you facing any SQL Server Performance Problems? If you want to get started with the help of experts read more over here: Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Notes and Observations on ReadOnly Databases in SQL Server

In the past couple of weeks, I have written a few blogs that revolve around utilizing “readonly” databases. It was fun working on this special-case scenario, and along the way I stumbled upon an interesting set of questions from my blog readers. These were not clearly called out in my previous blogs, hence I thought of writing down the responses I had given, with examples, in this blog. Some of the questions I have been asked were:

  • Can I create temporary tables or global temporary tables when working with ReadOnly databases?
  • Since the database is in ReadOnly mode, we cannot insert any values into the database. How about creation of tables when the DB is marked as ReadOnly?
  • Can we create stored procedures when working with ReadOnly Databases?
  • I have a large database and I see a lot of free space, can I shrink the DB when it is marked as ReadOnly? Is this allowed?

Some of these questions are interesting and require a small script to validate our understanding. A rule of thumb at this point is: this is a ReadOnly database, so we cannot change anything inside it. Temp tables, however, are created in the context of the tempdb database, and the readonly attribute doesn’t apply to objects created that way.

Next let me build the script to demystify each of the questions asked above:

CREATE DATABASE [ReadOnlyDB]
CONTAINMENT = NONE
ON PRIMARY
( NAME = N'ReadOnlyDB', FILENAME = N'C:\Temp\ReadOnlyDB.mdf' , SIZE = 4024KB , FILEGROWTH = 1024KB )
LOG ON
( NAME = N'ReadOnlyDB_log', FILENAME = N'C:\Temp\ReadOnlyDB_log.ldf' , SIZE = 20480KB , FILEGROWTH = 10%)
GO
USE MASTER
GO
ALTER DATABASE [ReadOnlyDB] SET READ_ONLY
GO

The basic script above is creating our database and then setting the attribute to ReadOnly. Let us start our various tests to validate our understanding.

USE ReadOnlyDB
GO
-- Creating our table
CREATE TABLE tbl_SQLAuth (id INT, Longname CHAR(8000))
GO

As per our understanding, this must raise an error as shown below:

Msg 3906, Level 16, State 1, Line 15

Failed to update database “ReadOnlyDB” because the database is read-only.

Temp Tables: As discussed above, the below queries in the ReadOnlyDB context will not raise any error.

-- Creating our Temp Tables
CREATE TABLE #t1 (i INT)
GO
DROP TABLE #t1
GO
-- Creating our Global Temp Tables
CREATE TABLE ##t1 (i INT)
GO
DROP TABLE ##t1
GO

Stored procedure creation: If we try to create any objects in our ReadOnlyDB, we will get the same error of 3906.

-- Create the stored procedure inside ReadOnlyDB
CREATE PROC prc1
AS
BEGIN
SELECT 1
END
GO

Msg 3906, Level 16, State 1, Procedure prc1, Line 30

Failed to update database “ReadOnlyDB” because the database is read-only.

Shrink Database command: The shrink database question was an interesting one for me, but on second look the behavior is logical. Since shrinking the database would try to alter the header of our MDF file, and that is not allowed in readonly mode, we get the error below.

DBCC SHRINKDATABASE (ReadOnlyDB, TRUNCATEONLY);
GO

Msg 7992, Level 16, State 1, Line 4

Cannot shrink ‘read only’ database ‘ReadOnlyDB’.
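
To make any of these operations work again, the database has to go back to READ_WRITE. Here is a small cleanup sketch for the demo database used above (assuming no other connections are using ReadOnlyDB):

USE master
GO
ALTER DATABASE [ReadOnlyDB] SET READ_WRITE
GO
DROP DATABASE [ReadOnlyDB]
GO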

All these are simple errors, yet the questions behind them come to many of us. This blog post is a dedication to those who took the time to write a line to me even after close to 7+ years of blogging. I get to learn from each one of you out there. SQL Server is an ocean and you make me complete.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Cloud Based Data Integration Made Easy – A Real World Scenario

If you are a DBA, once in a while you will end up with some tasks which are boring and annoying. Trust me, I come across similar scenarios quite often. Here is one such task I came across a few days ago.

A Task, I would like to Automate

Just the other day I was assigned a task where I had to take a CSV file from my network and insert it into a SQL Server instance hosted at a remote location. Well, you may think of it as a pretty easy task, and I agree with you, it is a very easy task. The challenge was not about its difficulty; the challenge was about the entire process and my interest. The part which annoyed me the most was that I had to do this every day at 4 PM.

This means, every day I must be at my desk at 4 PM and take a file from the network and upload it to the remote SQL Server. What about weekends? What about when I have to step away from my desk at 4 PM? What about the situation when I am busy doing something much more important than this task? Well, as I said, more than the task itself, it was the routine associated with it that bothered me. In simple words, this was an ETL task which needed to be automated, but I couldn’t always depend on my machine. I had to find a solution which was cloud based and ran efficiently.

Skyvia at Rescue

I was sitting miffed in the office when I suddenly remembered that last year I blogged about the tool Skyvia. Here is the blog post: Integrate Your Data with Skyvia – Cloud ETL Solution. I quickly referred to my own blog post and realized I should give Skyvia a try.

What is Skyvia?

Skyvia is a powerful solution for cloud data integration (ETL) and backup. It is a complete online solution and does not require any locally installed software except for a web browser. In Skyvia we can create integration packages that define the operations, and then we can run them or schedule them for automatic execution. An integration package is a set of data ETL operations that can be stored for future use, executed, modified, or deleted. Skyvia provides several kinds of packages for different integration scenarios: Import, Export, Replication, and Synchronization packages.

How did I do it?

Well, here are a few of the screenshots of the task which I was assigned.

First, I checked whether the table into which I have to load the data exists or not. As the table was already created, I quickly checked whether it contained data. The table contained no data.
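
For reference, a check like the following is enough (the database name is an assumption; the target table actor is the one used later in this walkthrough):

USE SQLAuthority
GO
-- Confirm the target table exists and check whether it already has rows
IF OBJECT_ID(N'dbo.actor', N'U') IS NOT NULL
    SELECT COUNT(*) AS RowsInTarget FROM dbo.actor
ELSE
    PRINT 'Target table dbo.actor does not exist'
GO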

Next we will open the Skyvia web interface. It is pretty simple and it will list three options on the left bar. We will click Integration there.

In the Integration section, click Create Now under Data Import.

In the data integration screen we will be presented with various options. We will load the CSV file from an FTP server, so we select source as a CSV from FTP and target as SQL Server.

As we are connecting to SQL Server for the very first time, we will create a new connection, which is a pretty straightforward procedure.

Then we will configure an FTP connection.

Next we will configure the CSV options. Here we can set various options, but in our case all the default options were good enough for us to move to the next step.

Right after that we will select the target table. In our case the target table is the actor table.

The next screen will present the mapping, and we will one more time review the various mapping options. We will make sure that all the source and target columns map correctly.

When we click finish it will bring up the following screen.

Click on Save and now we are back on the following screen. Over here we can execute our task and see if it works or not. Click on the RUN button on the right side of the screen.

In my case the task ran successfully and it shows that it inserted 200 rows successfully. The time taken to complete this entire task was 35 seconds, which depends on my network connection to the destination server.

We can execute the same select statement which we had executed earlier and see if the table contains the valid data.

Once we confirm that our task has worked successfully, we can create a daily schedule.

That’s it! I am done.

Now every day at specific time the task will automatically execute and will log history.

Next Action Items

Team Devart has made Skyvia a feature-rich service. One has to experiment with the various options to fully see the vast capability of this amazing product. Here are a few things you can consider doing. Here is the link where you can sign up for Skyvia for totally FREE. Next I will be trying out Skyvia with Salesforce. Skyvia is an all-in-one cloud solution for various Salesforce data integration scenarios. In addition to the standard Salesforce data loader functionality – data import and data export – it offers powerful data replication and synchronization tools and backup functionality with preview and export of backed up data, data search, viewing data changes between backups, and one-click restore.

Sign up for Skyvia for totally FREE.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Quiz with DATEADD Function

How many of us use some sort of date or datetime function when working with SQL Server? I cannot think of a single developer who would ever say they don’t work with datetime conversions in their application. Almost every application needs some manipulation of datetime datatypes. There are a number of pitfalls we can get into without being aware of them. Here is a simple blog, and I would love to hear your views on what the output would be:

Quick Trivia

Make your guess as to what the output of the below queries will be. Guess before executing them in SQL Server Management Studio.

SELECT DATEADD(MONTH, 1, '2015-01-28')
SELECT DATEADD(MONTH, 1, '2015-01-29')
SELECT DATEADD(MONTH, 1, '2015-01-30')
SELECT DATEADD(MONTH, 1, '2015-01-31')

Do you see something strange in the output? Will you be able to explain why we are getting this output? Why are the values so similar? I am sure once you execute the query in SSMS – the answer will be easy.

If the output is still confusing, hang on. Now what would be the values for the below query?

SELECT DATEADD(MONTH, 1, '2016-01-28')
SELECT DATEADD(MONTH, 1, '2016-01-29')
SELECT DATEADD(MONTH, -1, '2015-03-30')
SELECT DATEADD(MONTH, -1, '2015-03-31')

It is important to note that DATEADD can take either a positive or a negative integer as the value to add. I am sure I did trick you with the above query. Now the explanation becomes easy. Write your explanation in the comments and I will try to give a special prize of a one month free subscription to Pluralsight for five correct valid comments to this question.

Part 2

As I wrap up this blog, I would like to show how one of my friends made a small mistake while using the DATEADD function and got unexpected results. While using DATEADD, they accidentally wrote YEAR instead of MONTH.

SELECT DATEADD(YEAR, 8000, '2015-01-31');

If you execute the above query, you will be presented with an error. This is part II of the trivia, and it will be an added bonus if you can explain it.

Msg 517, Level 16, State 1, Line 9
Adding a value to a ‘datetime’ column caused an overflow.

Call to Action

1) Write the answer to the first quiz – why do we get this specific outcome when we execute the queries with DATEADD and MONTH?

2) Write the answer to the second quiz – why do we get this specific error on the screen?

Leave your answer before February 20, 2015 in the comment section to be eligible for the prize of a free Pluralsight subscription.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Installation Error – INSTALLSHAREDDIR parameter is not valid because this directory is compressed or is in a compressed directory

One of my blog readers contacted me for assistance. Here is his email:

Hello Pinal,
I am trying to install SQL Server on my laptop but am getting the below errors:

  • SqlServer.Configuration.Sco.DirectoryAttributesMissmatch: Folder C:\Program Files\Microsoft SQL Server has an unsupported attribute (Compressed) set. Please resolve this issue by removing the unsupported attribute from the folder using folder properties dialog.
  • SqlServer.Configuration.SetupExtension.CompressedDirException: The specified directory, “C:\Program Files\Microsoft SQL Server\”, for the INSTALLSHAREDDIR parameter is not valid because this directory is compressed or is in a compressed directory. Specify a directory that is not compressed.
  • SqlServer.Configuration.SetupExtension.CompressedDirException: The specified directory, “C:\Program Files (x86)\Microsoft SQL Server\”, for the INSTALLSHAREDWOWDIR parameter is not valid because this directory is compressed or is in a compressed directory. Specify a directory that is not compressed.

Can you help me in understanding the error and the fix of the issue?

Since the error messages were talking about compression, I checked MSDN to find out whether SQL Server can be installed in a compressed directory or not, and I found https://msdn.microsoft.com/en-us/library/ms143506.aspx (Hardware and Software Requirements for Installing SQL Server). It says “SQL Server Setup will block installations on read-only, mapped, or compressed drives”.

Having said all this, in this era of faster disks and nearly infinite storage, I am still wondering how many of you out there are using compression on your disks for files stored by applications. If we read the message, the solution is simple. Here are the steps to uncompress a folder:

  • On the desired folder (in our case “C:\Program Files”) right click and go to properties.
  • Click on “Advanced” button.
  • Uncheck the box “Compress contents to save disk space”.
  • Click Apply and then Ok.
  • Let the process finish.

Later I got a message from the reader that he was able to install SQL Server without any further issues. As I always say, most error messages are self-explanatory; all we need is to understand how the system is being used. In this context it is worth knowing that there are a few related KB articles (231347).

Reference: Pinal Dave (http://blog.sqlauthority.com)

Interview Question of the Week #007 – How to Reindex Every Table of the Database?

Some questions are extremely popular and they never get old. Here is one such question which I see very often asked of DBAs early in their career.

Question: How to re-index every table of the database?

Answer: Well, the answer to this question can only be given in the form of a script.

For SQL Server 2014 and later versions

DECLARE @TableName VARCHAR(255)
DECLARE @sql NVARCHAR(500)
DECLARE @fillfactor INT
SET @fillfactor = 80
DECLARE TableCursor CURSOR FOR
SELECT OBJECT_SCHEMA_NAME([object_id]) + '.' + name AS TableName
FROM sys.tables
WHERE is_memory_optimized = 0
OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
SET @sql = 'ALTER INDEX ALL ON ' + @TableName + ' REBUILD WITH (FILLFACTOR = ' + CONVERT(VARCHAR(3), @fillfactor) + ')'
EXEC (@sql)
FETCH NEXT FROM TableCursor INTO @TableName
END
CLOSE TableCursor
DEALLOCATE TableCursor
GO

(Remember that ALTER INDEX will fail on in-memory tables, hence they need to be excluded.)

For SQL Server 2005, 2008 and 2012 versions

DECLARE @TableName VARCHAR(255)
DECLARE @sql NVARCHAR(500)
DECLARE @fillfactor INT
SET @fillfactor = 80
DECLARE TableCursor CURSOR FOR
SELECT OBJECT_SCHEMA_NAME([object_id]) + '.' + name AS TableName
FROM sys.tables
OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
SET @sql = 'ALTER INDEX ALL ON ' + @TableName + ' REBUILD WITH (FILLFACTOR = ' + CONVERT(VARCHAR(3), @fillfactor) + ')'
EXEC (@sql)
FETCH NEXT FROM TableCursor INTO @TableName
END
CLOSE TableCursor
DEALLOCATE TableCursor
GO

For SQL Server 2000 version

DECLARE @MyTable VARCHAR(255)
DECLARE myCursor CURSOR FOR
SELECT table_name
FROM information_schema.tables
WHERE table_type = 'BASE TABLE'
OPEN myCursor
FETCH NEXT FROM myCursor INTO @MyTable
WHILE @@FETCH_STATUS = 0
BEGIN
PRINT 'Reindexing Table:  ' + @MyTable
DBCC DBREINDEX(@MyTable, '', 80)
FETCH NEXT FROM myCursor INTO @MyTable
END
CLOSE myCursor
DEALLOCATE myCursor
EXEC sp_updatestats
GO

Well, there are many different methods and many variations of this script out there; however, the above scripts have always worked for me and I trust them.
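
If you want to verify what the rebuild achieved, or limit a future run to fragmented indexes only, a minimal sketch like the following can be used (the 10 percent threshold is just an assumption):

-- Check fragmentation for all indexes in the current database
SELECT OBJECT_SCHEMA_NAME(ips.[object_id]) + '.' + OBJECT_NAME(ips.[object_id]) AS TableName,
       i.name AS IndexName,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ips
INNER JOIN sys.indexes i
    ON ips.[object_id] = i.[object_id] AND ips.index_id = i.index_id
WHERE ips.avg_fragmentation_in_percent > 10
ORDER BY ips.avg_fragmentation_in_percent DESC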

Here are a few related blog posts one should refer to for further information.

Reference: Pinal Dave (http://blog.sqlauthority.com)