SQL SERVER – The Basics of the Execute Package Task – Notes from the Field #067

[Note from Pinal]: This is a new episode of the Notes from the Field series. SQL Server Integration Services (SSIS) is one of the most essential parts of the entire Business Intelligence (BI) story. It is a platform for data integration and workflow applications.

In this episode of the Notes from the Field series I asked SSIS Expert Andy Leonard a very crucial question – What are the basics of the Execute Package Task and where do we start with it? Andy was kind enough to answer the question and provide plenty of information about how a novice developer can learn SSIS from the beginning and become an expert in the technology.


Learning how to configure and use the SQL Server Integration Services (SSIS) Execute Package Task gives you a great starting point for understanding SSIS package orchestration. I advocate writing small, functional SSIS packages that perform a unit of work. That’s a great idea for many reasons. But it raises the question: “How do I string together the execution of these packages?” Remember: SSIS is a software development platform. With “SQL Server” included in the name, it is easy for people to mistake SSIS for a database tool or accessory, but Control Flow Tasks put that confusion to rest.

SSIS provides several Control Flow tasks. Here is a list that provides a good approximation of which tasks I use most, from most-used to least-used:

  • Execute SQL Task
  • Data Flow Task
  • Script Task
  • Execute Package Task
  • File System Task
  • Execute Process Task

In this article I provide a basic example of configuring the SSIS Execute Package Task, shown in Figure 1:


Figure 1: SSIS Execute Package Task

The Execute Package Task provides one way to implement an SSIS Design Pattern for SSIS package execution known as the Parent-Child pattern. When an SSIS package uses the Execute Package Task to start another SSIS package, the package with the Execute Package Task is called the Parent and the package started by the Execute Package Task is called the Child.

The Execute Package Task changed between SQL Server 2008 R2 Integration Services and SQL Server 2012 Integration Services. The changes support the SSIS 2012 (and 2014) Catalog. There is a new property called ReferenceType which defaults to “Project Reference” in packages executing in Project Deployment Mode (the default mode for building SSIS 2012 and SSIS 2014 SSIS packages), as shown in Figure 2:


Figure 2: Options for the Execute Package Task ReferenceType Property

Project Reference is used to execute an SSIS package – a Child package – in the same SSIS project as the package that contains the Execute Package Task – the Parent package. When Project Reference is selected, the next property in the property grid is PackageNameFromProjectReference, a dropdown containing a list of all the SSIS packages in the SSIS project.

Setting the ReferenceType property to External Reference is a way to execute SSIS packages that are stored in the file system or the msdb database. When External Reference is selected, the next properties in the property grid change to reflect this backwards-compatible functionality, as shown in Figure 3:


Figure 3: Setting the ReferenceType Property to External Reference

The Location and Connection properties are used to specify an OLE DB Connection (to the msdb database) or a File Connection (to the location of the dtsx file). The SQL Server location also requires the name of the SSIS package; the File System option does not, because the SSIS package is the only thing in the file. The External Reference setting is useful for importing SSIS solutions that contain earlier versions of the Execute Package Task, because it behaves exactly like those previous versions.

The remaining properties in the property grid – Password and ExecuteOutOfProcess – are shared between ReferenceType options. Password is used if the SSIS package is password-protected (if the ProtectionLevel property of the SSIS package is set to either EncryptSensitiveWithPassword or EncryptAllWithPassword). ExecuteOutOfProcess is a setting that determines whether the package will be executed as part of the current process or a new process.

When executing SSIS packages in Project Deployment Mode (using the Project Reference ReferenceType), Parameter Bindings are enabled and allow values to be passed from the Parent package to the Child package, as shown in Figure 4:


Figure 4: Binding a Parent Package Parameter to a Child Package Parameter

Parameter Bindings are disabled when the ReferenceType property is set to External Reference.
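As an aside, when the project is deployed to the SSIS Catalog you can achieve a similar parent-to-child parameter hand-off from T-SQL by using the catalog’s stored procedures instead of the Execute Package Task. The sketch below only illustrates that alternative; the folder, project, package, and parameter names are hypothetical:

DECLARE @execution_id BIGINT;
EXEC SSISDB.catalog.create_execution
     @folder_name  = N'MyFolder',       -- hypothetical catalog folder
     @project_name = N'MyProject',      -- hypothetical project
     @package_name = N'Child.dtsx',     -- hypothetical Child package
     @execution_id = @execution_id OUTPUT;
-- @object_type 30 targets a package parameter (20 would target a project parameter)
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id    = @execution_id,
     @object_type     = 30,
     @parameter_name  = N'ChildParameter',          -- hypothetical parameter name
     @parameter_value = N'Value from the caller';
EXEC SSISDB.catalog.start_execution @execution_id;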

Once configuration is complete, click the OK button to close the Execute Package Task Editor. You can test execution by pressing the F5 key or selecting “Start Debugging” from the SSIS dropdown menu. A successfully-executed Execute Package Task will appear as shown in Figure 5:


Figure 5: A Successful Execution!

The Execute Package Task drives a powerful data integration architecture pattern: Parent-Child execution. Using SSIS Precedence Constraints and Sequence Containers with the Execute Package Task, a data integration developer can develop SSIS “driver” packages that call Child SSIS packages in any combination of parallel and serial orders of execution.

If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – The Basics of the SSIS Script Task – Notes from the Field #065

[Note from Pinal]: This is a new episode of the Notes from the Field series. SQL Server Integration Services (SSIS) is one of the most essential parts of the entire Business Intelligence (BI) story. It is a platform for data integration and workflow applications.

In this episode of the Notes from the Field series I asked SSIS Expert Andy Leonard a very crucial question – What are the basics of the SSIS Script Task and where do we start with it? Andy was kind enough to answer the question and provide plenty of information about how a novice developer can learn SSIS from the beginning and become an expert in the technology.


Learning how to configure and use the SQL Server Integration Services (SSIS) Script Task gives you a great starting point for learning how to use SSIS Control Flow tasks. You’ll find that SSIS Control Flow tasks provide powerful development capabilities that may surprise you. Let there be no mistake, SSIS is a software development platform. With “SQL Server” included in the name, it is easy for people to mistake SSIS for a database tool or accessory, but Control Flow Tasks put that confusion to rest.

SSIS provides several Control Flow tasks. Here is a list that provides a good approximation of which tasks I use most, from most-used to least-used:

  • Execute SQL Task
  • Data Flow Task
  • Script Task
  • Execute Package Task
  • File System Task
  • Execute Process Task

This article will focus on a basic example of configuring the SSIS Script Task, shown in Figure 1. People often think the SSIS Script Task is just for generating log messages, but let’s look at some of its more powerful capabilities.

Figure 1: SSIS Script Task

The Script Task supports Microsoft Visual C# and Microsoft Visual Basic languages. You can choose which language you will use prior to clicking the Edit Script button. Once you’ve clicked the Edit Script button, you cannot change the Script Language property.

At the heart of the Script Task is the ability to read a variable value from the SSIS package into the script and then write a message out of the Script Task. Because the Script Task can interact with SSIS variables, you can use .Net code to manipulate and respond to variable values.

For the purpose of our demonstration, we will add code to the Script Task to create a message for SSIS logging.

Messaging is a key component of enterprise architecture. Messages are generated (published, raised, produced, etc.) by Senders and received (subscribed, consumed, requested, etc.) by Listeners. Messaging is a complex topic. Entire books are dedicated to the topic.

Add variables to the ReadOnlyVariables and ReadWriteVariables properties by clicking the ellipsis inside the value textbox for each property. When you click the ellipsis, a Select Variables dialog appears, as you see in Figure 2.

Figure 2: Adding Read Only Variables to the Script Task

SSIS variables use a two-part naming convention: <Namespace>::<VariableName>. I added the SSIS variables System::TaskName and System::PackageName. The selected variables are in the System namespace and contain the name of the task and package, respectively.

Click the Edit Script button to open the script editing window as shown in Figure 3:

Figure 3: The Script Task Script Editor

Add the following code beneath the instruction to Add your code here:

Dim sPackageName As String = Dts.Variables("System::PackageName").Value.ToString()
Dim sMsg As String = "I am " & sPackageName
Dts.Events.FireInformation(1001, sPackageName, sMsg, "", 0, True)

Listing 1: VB.Net Code

The first line of the code in Listing 1 creates a VB.Net variable named sPackageName and initializes the value of this variable, setting it to the value of the SSIS Variable named System::PackageName. This is one way to read SSIS Variable values into a Script Task’s script.

The second line of the code in Listing 1 creates a VB.Net variable named sMsg and initializes the value of this variable with the text, “I am” plus the value contained in the sPackageName VB.Net variable.

The third line of the code shown in Listing 1 calls the FireInformation method on the Dts.Events object. FireInformation accepts six arguments:

  1. InformationCode – an integer you can use to identify custom messages sent from the SSIS package.
  2. SubComponent – a string you can use to populate the source of the message.
  3. Description – a string containing the message you wish to send.
  4. HelpFile – a string containing the location of a Help file (should a help file exist).
  5. HelpContext – an integer pointing to the help context (should a help context exist).
  6. FireAgain – a Boolean value indicating whether the event should subsequently fire.

My VB code includes a reference to the FireAgain argument. Note: This MSDN post states: “Because firing of an event may be expensive, the run-time engine provides a mechanism for suppressing events that you are not interested in. Every event firing method has a FireAgain parameter. If the value of this variable is false, after the method returns, the caller will not fire this event again for the duration of the current execution.” I understand this to mean setting FireAgain to False would suppress future FireInformation events in my Script Task, but testing indicates this is not correct. This is important for you to know because you may opt to use the FireAgain argument to manage when you raise events. I do not use the FireAgain argument for this purpose. I typically manage the decision to raise events in conditional logic surrounding the Event-raising code.

A test execution of the package reveals an Information message recorded on the Progress / Execution Results tab as shown in Figure 4.

Figure 4: the Information Message on the Progress / Execution Results Tab

Information messages generated from Script Tasks will appear in SSIS logs and the SSIS Catalog. With this information, you can monitor the progress of SSIS execution while the package is running, or view the historical status of variable values or package messages after the package execution has completed.
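If the package runs from an SSIS Catalog deployment, one way to read those messages back is to query the catalog views. A minimal sketch, assuming the package was executed from the SSISDB catalog:

-- Most recent Information events logged in the SSIS Catalog
SELECT TOP (50)
       em.message_time,
       em.package_name,
       em.message_source_name,   -- e.g. the Script Task name
       em.message
FROM   SSISDB.catalog.event_messages AS em
WHERE  em.event_name = N'OnInformation'
ORDER BY em.message_time DESC;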

The SSIS Script Task can accomplish much more than generating log messages. This brief introduction and basic example have demonstrated how to get started configuring and using the SSIS Script Task. As you can see, SSIS Script Tasks give you development capabilities you may not have been aware of.

If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – How to Know Backup History of Current Database?

Some blog posts and scripts are evergreen references. One such blog has been Get Database Backup History for a Single Database. A number of you have pinged me for amendments to that script with various ideas. Since that post was written more than four years ago, I thought it would be worthwhile to update it for the current trends and the SQL Server 2014 additions. What are the changes made to the previous script? The script will answer a few questions, like:

  • What were the different backups taken on the current DB? Remove the condition mentioned below and it can be generic enough for all DBs on your server.
  • Which user was involved in taking the backup?
  • What is the type of backup we are talking about?
  • Recovery Model and Database compatibility level of the DB at the time the backup was taken. I remember using this for a user, who used to change the DB compatibility in their script before a backup. It was basically a copy-paste problem from an internet script. This data helped there.
  • Size of backup – Both compressed and uncompressed.
  • If the Backup was password protected.
  • Finally, when the backups were taken.

So let us look at the script next. Feel free to modify the same as you wish.

-- Recent backup history for the current DB
SELECT s.database_name 'Database',
       s.recovery_model 'Recovery Model',
       s.compatibility_level,
       s.user_name 'Backup by Username',
       CASE s.type
            WHEN 'D' THEN 'Full'
            WHEN 'I' THEN 'Diff'
            WHEN 'L' THEN 'Log'
       END 'Backup Type',
       CONVERT(VARCHAR(20), s.backup_finish_date, 13) 'Backup Completed',
       CAST(mf.physical_device_name AS VARCHAR(100)) 'Physical device name',
       DATEDIFF(minute, s.backup_start_date, s.backup_finish_date) 'Duration Min',
       CAST(ROUND(s.backup_size * 1.0 / ( 1024 * 1024 ), 2) AS NUMERIC(10, 2)) 'Size in MB',
       CAST(ROUND(s.compressed_backup_size * 1.0 / ( 1024 * 1024 ), 2) AS NUMERIC(10, 2)) 'Compressed Size in MB',
       CASE WHEN LEFT(mf.physical_device_name, 1) = '{' THEN 'SQL VSS Writer'
            WHEN LEFT(mf.physical_device_name, 3) LIKE '[A-Za-z]:\%' THEN 'SQL Backup'
            WHEN LEFT(mf.physical_device_name, 2) LIKE '\\' THEN 'SQL Backup'
            ELSE mf.physical_device_name
       END 'Backup tool',
       s.is_copy_only,
       s.is_password_protected,
       s.is_force_offline /* for WITH NORECOVERY option */
FROM   msdb.dbo.backupset s
       INNER JOIN msdb.dbo.backupmediafamily mf ON s.media_set_id = mf.media_set_id
WHERE  s.database_name = DB_NAME() -- remove this condition if you want all DBs
       AND s.backup_finish_date > DATEADD(MONTH, -3, GETDATE()) -- get data for the past 3 months
ORDER BY s.backup_finish_date DESC;

A sample output would look like:

More fields can be added to this script, like whether encryption is enabled, collation information, LSN information, whether the backup has a checksum, etc. Do let me know which additional information you use in your environment to know your backups better.
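If you want to extend the script along those lines, here is a sketch of extra columns available in msdb.dbo.backupset that you could add to the SELECT list above. The checksum, collation and LSN columns exist in older versions as well; key_algorithm and encryptor_type were added for backup encryption in SQL Server 2014, so drop them on older builds.

SELECT s.database_name,
       s.has_backup_checksums 'Backup Has Checksum',
       s.collation_name 'Database Collation',
       s.first_lsn,
       s.last_lsn,
       s.checkpoint_lsn,
       s.database_backup_lsn,
       s.key_algorithm 'Encryption Algorithm',   -- SQL Server 2014 and later
       s.encryptor_type 'Encryptor Type'         -- SQL Server 2014 and later
FROM   msdb.dbo.backupset s
WHERE  s.database_name = DB_NAME()
ORDER BY s.backup_finish_date DESC;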

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Can Database Primary File Have Any Other Extension Than MDF

Here is an interesting question I received yesterday. I personally think it is one of the most interesting questions I have received this month.

“Can Database Primary File Have Any Other Extension Than MDF?”

Before you read the answer – think for a moment.

Now read the answer in the next line.

Yes, of course. The MDF extension for the primary data file of the database is just a common convention; essentially, you can use any other file extension as well.

Here is an example where I am able to attach a data file with a PDF extension to the database. Please note that this file was created by SQL Server earlier and was always used as a primary data file.

USE [master]
GO
CREATE DATABASE [tests] ON
( FILENAME = N'D:\data\tests.pdf' ),
( FILENAME = N'D:\data\tests_log.ldf' )
FOR ATTACH
GO

As you can see, files with other extensions are successfully attached to the database. If you want to try this yourself, you can download the example by clicking over here.
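The same idea works at creation time, not only when attaching. A minimal sketch (the database name and paths are assumptions) showing a primary data file created with a non-MDF extension from the start:

CREATE DATABASE [tests2] ON PRIMARY
( NAME = N'tests2_data', FILENAME = N'D:\data\tests2.pdf' )
LOG ON
( NAME = N'tests2_log', FILENAME = N'D:\data\tests2_log.ldf' )
GO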

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – The Basics of the SSIS Data Flow Task – Notes from the Field #057

[Note from Pinal]: This is a new episode of the Notes from the Field series. SQL Server Integration Services (SSIS) is one of the most essential parts of the entire Business Intelligence (BI) story. It is a platform for data integration and workflow applications. As Wikipedia says, it features a fast and flexible data warehousing tool used for data extraction, transformation, and loading (ETL). The tool may also be used to automate maintenance of SQL Server databases and updates to multidimensional cube data.

In this episode of the Notes from the Field series I asked SSIS Expert Andy Leonard a very crucial question – How do we learn the SSIS Data Flow Task? Andy was kind enough to answer the question and provide plenty of information about how a novice developer can learn SSIS from the beginning and become an expert in the technology.


If you know SQL Server, you’re likely aware of SQL Server Integration Services (SSIS). What you might not realize is that SSIS is a development platform that allows you to create and perform some interesting Control Flow Tasks. In the first blog post in this series, I showed how to use the Execute SQL Task. Now, let’s look at the Data Flow Task. When developing solutions with SSIS, I use a handful of Control Flow tasks:

  • Execute SQL Task
  • Data Flow Task
  • Script Task
  • Execute Package Task
  • File System Task
  • Execute Process Task

This list is a good approximation of which tasks I use most, too – from most-used to least-used. In this article I provide a basic example of configuring the SSIS Data Flow Task, shown in Figure 1:


Figure 1: SSIS Data Flow Task

The SSIS Data Flow Task is a very special task. It is the only task to have its own tab in the Integrated Development Environment (IDE) as shown in Figure 2:


Figure 2: The Data Flow Tab

If you click on the tab, you will note a new SSIS Toolbox containing Data Flow-specific components, as shown in Figure 3:


Figure 3: Data Flow SSIS Toolbox

SSIS Data Flows are typically used to move data from one location to another. The data flow accomplishes data movement by first reading data into Data Flow Buffers. Think of a buffer as a region of memory SSIS uses to hold data rows as the rows are processed by the data flow. In Figure 4, I have configured an OLE DB Source Adapter to pump data rows into the data flow:


Figure 4: Configuring an OLE DB Source Adapter

The data is often transformed while being moved from one location to another. The SSIS data flow components that perform transforming operations are called Transformations, and they are joined to other data flow components by Data Flow Paths. An example of a transformation is the Derived Column Transformation, as shown in Figure 5:


Figure 5: Adding a Derived Column Transformation and a Data Flow Path

You can use transformations to perform many operations (e.g., you can manipulate values of columns in rows, you can remove or redirect rows based on column values, etc.) on the data as it flows through the data flow task. For example, the Derived Column Transformation permits you to manipulate (transform) existing data or to combine existing data to create new columns, as shown in Figure 6:


Figure 6: Creating a New Column with the Derived Column Transformation

I created a new column named “UpperCaseName” in the Derived Column Transformation. I used SSIS Expression Language to define the transform – “UPPER([Name])” in this case.

Note: SSIS Expression Language is very powerful and very difficult to learn. For more information, please see Linchpin People’s SSIS Expression Language series.
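If you think in T-SQL, the Derived Column shown above is roughly equivalent to computing the new column in the source query itself. A quick sketch, using a hypothetical source table:

-- dbo.MySourceTable and its Name column are assumptions for illustration
SELECT [Name],
       UPPER([Name]) AS UpperCaseName
FROM   dbo.MySourceTable;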

Now I need to land these rows into a table. I’ll use an OLE DB Destination Adapter – connected from the Derived Column Transformation via data flow path – to accomplish loading our transformed rows into a table, as shown in Figure 7:


Figure 7: Configuring an OLE DB Destination Adapter

Once the OLE DB Destination Adapter is configured, you can execute either the package or the Data Flow Task as shown in Figure 8:


Figure 8: Test Execution Successful!

In this article, I shared an introduction to the SSIS Data Flow Task and some of its functionality. Although I barely scratched the surface of Data Flow capabilities, you should now be able to compose and test your first SSIS Data Flow!

:{>

If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Proof of Concepts with Windows Azure – Notes from the Field #056

[Note from Pinal]: This is the 56th episode of the Notes from the Field series. If you are a DBA or Developer, there is always a situation when you want to prove yourself by building a small proof of concept of your idea. However, most of the time it is way more complicated than we think. Building a proof of concept requires many different resources and skills. Above all, there are chances that what we have built is not up to the mark and we have to rebuild the example one more time. Trust me, tasks that look simple at the beginning are never that simple.

In this episode of the Notes from the Field series database expert John Sterrett (Group Principal at Linchpin People) explains a very common issue DBAs and Developers face in their careers – how to build a proof of concept and how to maximize the power of Azure. Linchpin People are database coaches and wellness experts for a data driven world. Read the experience of John in his own words.


Whether you know it or not, cloud services are here and they are changing the way we provide information technology services. For example, in many information technology shops it can take weeks if not months to get an instance of SQL Server up and running. Here are some minimal action items that must be completed before DBAs even get access to a server to install SQL Server. You have to order a physical server, your procurement team must approve the order, and the server has to be shipped. Once the server is received it must be racked in the data center, cables must be connected, and the data center team needs to document their changes. Then the operations team needs to install and configure Windows. I could keep going, but there are a lot of things that should be done to a server before the DBA team gets its hands on it. What are you going to do if you’re a DBA and you need an instance up in 30 minutes for a proof of concept? It’s becoming more common that the cloud is the answer.

Every time I need a server for a proof of concept I jump to Windows Azure. I can quickly build a Windows Azure virtual machine with SQL Server, provisioned within 30 minutes. In this tip, I am going to walk through the steps to create your first Windows Azure machine.

1. Get a Windows Azure account. If you don’t have one, you can get a free trial. At the time of writing this tip, the free trial includes $220 of credit.
http://azure.microsoft.com/en-us/pricing/free-trial/

Are you an MSDN Subscriber who isn’t leveraging Microsoft’s cloud service? If so, you can sign up for free monthly credit.

2. Log In to Azure Portal
https://manage.windowsazure.com

3. Create New Virtual Machine

On the left-hand side, click on Virtual Machines and then the add button at the bottom left of the screen. This will load the wizard for creating our first virtual machine.

Now that the wizard is loaded, as you can see below, we can select Virtual Machine and create it from the Gallery. In the Gallery we will be able to select one of many images that already include SQL Server baked in.

Looking at the SQL Server images you will see you can access Enterprise Edition, Standard Edition and Web Edition for SQL 2014 down to SQL 2008 R2.

 

Next you can customize your image by release date. This will allow you to have different service packs or CUs. You can also select between two different tiers and various sizes. You will have to create a user name and password, and you will want to keep these credentials as they will be your first account.

 

Next you will be able to select more machine configuration options. You will get to determine where the Azure Virtual Machine is located.  Below you will see I am using my MSDN Subscription.

 

Next, you will get to configure extensions to help automate or secure your virtual machine.

Finally, you will see your server being provisioned. What once used to take weeks or months can now be done in the cloud in minutes.

Are your servers running at optimal speed or are you facing any SQL Server Performance Problems? If you want to get started with the help of experts read more over here: Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – FIX – The database ‘model’ is marked RESTORING and is in a state that does not allow recovery to be run

The best way one can learn SQL Server is by trying out things on their own, and I am no different. I am constantly trying to explore the various options one can use when working with SQL Server. In the same context, when I was playing around with backup and restore commands, I made a mistake and unfortunately restarted SQL Server. After that I was unable to start the SQL Server service. If I start the service, it doesn’t give any error but stops automatically.

Whenever I have any weird problems with SQL, I always look at the ERRORLOG files for that instance. If you don’t know the location of the ERRORLOG, you should refer to Balmukund’s blog (Help : Where is SQL Server ErrorLog?)

This is what I found in the ERRORLOG just before the stop.

2014-10-28 00:20:39.02 spid9s      Starting up database 'model'.
2014-10-28 00:20:40.01 spid9s      The database 'model' is marked RESTORING and is in a state that does not allow recovery to be run.
2014-10-28 00:20:40.04 spid9s      Error: 927, Severity: 14, State: 2.
2014-10-28 00:20:40.04 spid9s      Database 'model' cannot be opened. It is in the middle of a restore.

The error and behavior I am seeing make sense, because to start SQL Server we need the master, model and tempdb databases. You might think that msdb is also a system database and would be needed for the SQL engine? Well, you might have been tricked. msdb is needed for the SQL Server Agent service, not the SQL Server service. So my master is fine, but model has a problem. Every new database is created using model, including tempdb, so the SQL Server service refuses to start. Since the model database is not recovered successfully, SQL Server cannot create the tempdb database, and understandably the instance of SQL Server does not start.

So I called up Balmukund – these are the perks of having a good friend to rely on. He never says “no”, but he also doesn’t give the complete solution to the problem. He gives a hint and asks me to research further. This time also the magical words were – “use trace flag 3608 and restore model with recovery”.

I followed his advice and performed the steps below.

1. Start SQL Server with trace flag 3608 using the net start command.

Net Start MSSQL$SQL2014 /T3608

In my case SQL2014 is the name of the instance. If you have a default instance, then the service name would be MSSQLServer. For a named instance, it is MSSQL$InstanceNameHere.

2. After starting with trace flag 3608, I verified the same from Errorlog as well.

Further, I also found below message in ERRORLOG.

Recovering only master database because traceflag 3608 was specified. This is an informational message only. No user action is required.

3. Connect to the SQL instance using SQLCMD with the command below.

SQLCMD -S .\SQL2014 -E

You can read about the parameters of SQLCMD in Books Online here.

“1>” means we are connected to the SQL instance. Then execute the command below (hit Enter at the end of each line):

RESTORE DATABASE Model WITH RECOVERY
GO

4. Once the command is executed successfully, we will come back to “1>” again. We can type exit to come out of SQLCMD.
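Before typing exit, you can optionally confirm (a quick sketch) that model is no longer marked RESTORING:

SELECT name, state_desc FROM sys.databases WHERE name = N'model'
GO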

5. Now stop SQL Service

Net Stop MSSQL$SQL2014

6. And start again without trace flag.

Net Start MSSQL$SQL2014

Now my SQL instance came up happily and I was unblocked. After some time I got a call from Balmukund asking if SQL had started, and I told him that my next blog post was ready on the same topic. He finally asked, how did that happen? And my answer was – I ran the wrong command. My command was:

BACKUP DATABASE model TO DISK = 'Full.bak'
GO
BACKUP LOG model TO DISK = 'Log.trn' WITH NORECOVERY
GO

My request to my readers is: please DO NOT run the above command on your SQL instance and restart SQL, or else you will need to follow these steps on a production server. Learning never stops when working with SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – FIX: ERROR : Msg 3136, Level 16, State 1 – This differential backup cannot be restored because the database has not been restored to the correct earlier state

During my recent visit to a customer site for a session on backups, they asked me to find the cause of an error while restoring a differential backup. Though this seemed to be a completely admin-related topic and I had gone there for a different session, I took the challenge head-on. These are wonderful ways to explore and learn SQL Server better. The error they showed me was:

Msg 3136, Level 16, State 1, Line 39
This differential backup cannot be restored because the database has not been restored to the correct earlier state.
Msg 3013, Level 16, State 1, Line 39
RESTORE DATABASE is terminating abnormally.

In this blog post I will try to explain the error in detail. In the same context, a long time back I wrote a blog post on: SQL SERVER – Backup Timeline and Understanding of Database Restore Process in Full Recovery Model

There, I explained the details and the correlation of the various backup types, i.e. Full, Differential and Transaction Log backups. I will refrain from rehashing them here.

Recently, one of my friends asked: if we have a differential backup, how can we find the full backup on which that differential backup can be restored? If we go back to basics, a differential backup has all the changes made in the database since the last full backup was taken.

Let us understand this concept using an example:

CREATE DATABASE SQLAuthority
GO
USE SQLAuthority
GO
CREATE TABLE t1 (i INT)
GO
BACKUP DATABASE SQLAuthority TO DISK = 'E:\temp\F1.bak'
GO
INSERT INTO t1 VALUES (1)
GO
BACKUP DATABASE SQLAuthority TO DISK = 'E:\temp\D1.bak' WITH DIFFERENTIAL
GO
INSERT INTO t1 VALUES (2)
GO
BACKUP DATABASE SQLAuthority TO DISK = 'E:\temp\D2.bak' WITH DIFFERENTIAL
GO
INSERT INTO t1 VALUES (3)
GO
BACKUP DATABASE SQLAuthority TO DISK = 'E:\temp\F2.bak'
GO
INSERT INTO t1 VALUES (4)
GO
BACKUP DATABASE SQLAuthority TO DISK = 'E:\temp\D3.bak' WITH DIFFERENTIAL
GO

Once the script has been run, we have the backups below.

Looking at the backup chain, it is clear that D3 is valid for F2. On the other hand D1 and D2 are valid and restorable on top of F1. Let us drop the database and try to restore D3 on top of F1.

USE MASTER
GO
DROP DATABASE SQLAuthority
GO
RESTORE DATABASE SQLAuthority FROM DISK = 'E:\temp\F1.bak' WITH NORECOVERY
GO
RESTORE DATABASE SQLAuthority FROM DISK = 'E:\temp\D3.bak' WITH NORECOVERY
GO

Here is the output.

Processed 296 pages for database 'SQLAuthority', file 'SQLAuthority' on file 1.
Processed 6 pages for database 'SQLAuthority', file 'SQLAuthority_log' on file 1.
RESTORE DATABASE successfully processed 302 pages in 0.213 seconds (11.076 MB/sec).
Msg 3136, Level 16, State 1, Line 43
This differential backup cannot be restored because the database has not been restored to the correct earlier state.
Msg 3013, Level 16, State 1, Line 43
RESTORE DATABASE is terminating abnormally. 

This means that the first restore was successful and the next one raised an error, which means this is not a valid differential backup to restore on top of F1. How would we figure out the correct restore sequence? Well, there are multiple ways.

1. Have a look at the SQL Server ERRORLOG, where we have the successful backup messages. Here is what we saw in the ERRORLOG while running the above backups.

As highlighted above, we can find the full backup LSN in the message for the differential backup.

2. Have a look at Standard Reports to find previous backup events.

SQL SERVER – SSMS: Backup and Restore Events Report

3. Run the query from the post below on the server where the backup was taken.

SQL SERVER – Get Database Backup History for a Single Database
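As a complement to that post, here is a hedged sketch that simply lists the relevant LSN columns straight from msdb. A common rule of thumb (verify it in your own environment) is that a differential backup’s database_backup_lsn points back to the checkpoint_lsn of the full backup it is based on:

SELECT s.database_name,
       s.type,                 -- D = Full, I = Differential, L = Log
       s.backup_finish_date,
       s.first_lsn,
       s.last_lsn,
       s.checkpoint_lsn,
       s.database_backup_lsn   -- for a differential, relates it to its base full backup
FROM   msdb.dbo.backupset s
WHERE  s.database_name = N'SQLAuthority'
ORDER BY s.backup_finish_date DESC;

And once you know which full backup a differential belongs to (D3 belongs to F2 in the example above), the sequence that restores cleanly is:

RESTORE DATABASE SQLAuthority FROM DISK = 'E:\temp\F2.bak' WITH NORECOVERY
GO
RESTORE DATABASE SQLAuthority FROM DISK = 'E:\temp\D3.bak' WITH RECOVERY
GO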

Hopefully this blog post demystifies the error and shows the usefulness of the messages in the ERRORLOG and the logging capability of SQL Server. Do let me know if you have ever encountered these errors in your environment.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – FIX: ERROR : Msg 3023, Level 16, State 2 – Backup, file manipulation operations (such as ALTER DATABASE ADD FILE) and encryption changes on a database must be serialized

Errors are the best way to learn how SQL Server works, and as DBAs we are bound to see many of them from time to time. One of the primary functions of a DBA is creating backups and, most importantly, automating them using jobs and maintenance plans.

Here is a typical scenario a DBA can encounter: one fine day they notice that some backup jobs are failing for no apparent reason. Normal troubleshooting always starts with the error message. Recently, one of my blog readers sent me an email which was worth a look.

I am getting below error. What is the cause and solution?

Msg 3023, Level 16, State 2, Line 1
Backup, file manipulation operations (such as ALTER DATABASE ADD FILE) and encryption changes on a database must be serialized. 
Reissue the statement after the current backup or file manipulation operation is completed.

I pinged him on Twitter and asked for more details. He informed me that they have a job which runs and fails with the error described above. I asked him to get more details about the job and post back. I also asked him to check my good friend Balmukund’s blog for a query to find what is running at the same time the job runs. He didn’t come back to me – which means his issue was probably resolved.

But that left me curious to find the possible causes of the error Msg 3023, Level 16, State 2. Reading the message again, it looks like two parallel backups would cause the error. So I ran two parallel backup commands for a database which was fairly big in size (100 GB). As soon as the two full backups started, I could see that only one backup was making progress (session id 57) and the other (session id 58) was waiting for the first one to finish.

This means the error is not raised immediately and the second backup simply waits. But as soon as I cancelled the waiting query (session 58), I got the below message.

Another possible reason for the error is performing a shrink operation in parallel with a backup operation. (Shrink is NOT something I recommend, but people never listen.)

Here is the text

Msg 3140, Level 16, State 5, Line 1
Could not adjust the space allocation for file 'SQLAuthority'.
Msg 3023, Level 16, State 2, Line 1
Backup, file manipulation operations (such as ALTER DATABASE ADD FILE) and encryption changes on a database must be serialized. 
Reissue the statement after the current backup or file manipulation operation is completed.

Depending on which operation came first, here is the behavior: if a backup is started when an add or remove file operation is in progress, the backup waits for a timeout period and then fails. If a backup is running and one of these operations is attempted, the operation fails immediately.

Solution: Find the conflicting operation and retry your operation after stopping it or letting it finish.
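A hedged sketch for spotting the conflicting session: the command names below (such as 'BACKUP DATABASE' or the DBCC-style command that a shrink shows up as) are what I typically see in sys.dm_exec_requests, so adjust the filter for your environment.

SELECT r.session_id,
       r.command,
       r.status,
       r.percent_complete,
       r.wait_type,
       DB_NAME(r.database_id) AS database_name
FROM   sys.dm_exec_requests r
WHERE  r.command LIKE 'BACKUP%'
   OR  r.command LIKE 'RESTORE%'
   OR  r.command LIKE 'Dbcc%';   -- shrink operations appear as a DBCC-style command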

Learning from error messages is a great way to understand what happens inside SQL Server. Do let me know what you have learned from error messages in your environment in the recent past.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Unable to Restore From Backup After Detach Operation

While doing community work I travel a lot, speak at various conferences and get a chance to meet many new faces. The best part is that I get to hear a variety of issues people face while using SQL Server. This blog post is the outcome of one such interaction with a DBA from one of the organizations I visited.

After the conference a young guy came to me and said – “I found a bug in SQL Server Restore”. I was amazed at his confidence and asked him to tell me more before concluding anything. He said that to restore a database from a backup, you need to have the same database created before the restore. I told him that something was not right with the test he was performing, because that doesn’t sound correct. I gave him my email address and asked him to contact me so we could find out more. I was eagerly waiting for his mail as this was on top of my mind, and I was restless for two days. Finally the mail landed.

He sent an email with the repro steps.

  1. Create new database.
  2. Take a backup of the database.
  3. Detach the database.
  4. Restore from backup taken in step 2. This step would fail.

I followed the same steps:

CREATE DATABASE SQLAuth
GO
BACKUP DATABASE SQLAuth TO DISK = 'C:\Temp\SQLAuth.bak' WITH FORMAT
GO
sp_detach_db 'SQLAuth'
GO
RESTORE DATABASE SQLAuth FROM DISK = 'C:\Temp\SQLAuth.bak'
GO

As soon as I run the last command, the restore, I get the below error:

Msg 3142, Level 16, State 1, Line 7
File "SQLAuth" cannot be restored over the existing "E:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\SQLAuth.mdf". 
Reissue the RESTORE statement using WITH REPLACE to overwrite pre-existing files, or WITH MOVE to identify an alternate location.
Msg 3142, Level 16, State 1, Line 7
File "SQLAuth_log" cannot be restored over the existing "E:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\SQLAuth_log.ldf". 
Reissue the RESTORE statement using WITH REPLACE to overwrite pre-existing files, or WITH MOVE to identify an alternate location.
Msg 3119, Level 16, State 1, Line 7
Problems were identified while planning for the RESTORE statement. Previous messages provide details.

Msg 3013, Level 16, State 1, Line 7
RESTORE DATABASE is terminating abnormally.

The error message is very clear about the cause of the restore failure. Since we detached the database, the mdf and ldf files are still available at the location where the database was created. It’s good that SQL Server does not overwrite the files by itself unless we explicitly tell it to.

If you want to overwrite the files, you can add the WITH REPLACE clause to the command, as shown below.

RESTORE DATABASE SQLAuth FROM DISK = 'C:\Temp\SQLAuth.bak' WITH REPLACE
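Alternatively, you can restore to new physical files with WITH MOVE. A sketch using the logical file names from the error message above (the target paths are assumptions; check the logical names first with RESTORE FILELISTONLY):

RESTORE FILELISTONLY FROM DISK = 'C:\Temp\SQLAuth.bak'
GO
RESTORE DATABASE SQLAuth FROM DISK = 'C:\Temp\SQLAuth.bak'
WITH MOVE 'SQLAuth' TO 'C:\Temp\SQLAuth.mdf',
     MOVE 'SQLAuth_log' TO 'C:\Temp\SQLAuth_log.ldf'
GO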

If we don’t detach the database and instead perform a restore on top of the existing database, like below:

CREATE DATABASE SQLAuth
GO
BACKUP DATABASE SQLAuth TO DISK = 'C:\Temp\SQLAuth.bak' WITH FORMAT
GO
RESTORE DATABASE SQLAuth FROM DISK = 'C:\Temp\SQLAuth.bak'
GO

Then we will get a slightly different message as shown below:

Msg 3159, Level 16, State 1, Line 5
The tail of the log for the database "SQLAuth" has not been backed up. Use BACKUP LOG WITH NORECOVERY to backup the log if it contains work you do not want to lose. Use the WITH REPLACE or WITH STOPAT clause of the RESTORE statement to just overwrite the contents of the log.
Msg 3013, Level 16, State 1, Line 5
RESTORE DATABASE is terminating abnormally.

Again, this is a safety mechanism where a user has to confirm their actions. Recall the situation when you have an existing file in Windows and you paste the same file at the same location – you always get a warning. SQL Server is no different, and I was pleasantly relieved that this was not a bug inside SQL Server. I am glad the DBA sent me this information, because it made me revalidate and play around with backups in SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)