SQL SERVER – Backup and Restore Behavior of ReadOnly Filegroup Databases

Last week I wrote about SQL SERVER – Marking Filegroup as ReadOnly with SQL Server, and it left me with a few key questions. One question someone asked was: how will backups behave? Is there anything I need to know about ReadOnly filegroups? When such a database is restored, will it carry the setting along? So in this blog post, let me walk through a few simple steps from the learning journey I had while working with ReadOnly filegroups.

TSQL Scripts

Let us start out by creating the database first.
CREATE DATABASE [ReadOnlyDB]
CONTAINMENT
= NONE
ON  PRIMARY
( NAME = N'ReadOnlyDB', FILENAME = N'C:\Temp\ReadOnlyDB.mdf' , SIZE = 4024KB , FILEGROWTH = 1024KB )
,
FILEGROUP [ReadOnlyDB_FG]
( NAME = N'ReadOnlyDB_FG', FILENAME = N'C:\Temp\ReadOnlyDB_FG.ndf' , SIZE = 4096KB , FILEGROWTH = 1024KB )
LOG ON
( NAME = N'ReadOnlyDB_log', FILENAME = N'C:\Temp\ReadOnlyDB_log.ldf' , SIZE = 20480KB , FILEGROWTH = 10%)
GO
-- Mark the filegroup read-only
ALTER DATABASE ReadOnlyDB MODIFY FILEGROUP ReadOnlyDB_FG READ_ONLY;
GO

I have marked the filegroup as READ_ONLY. Next, let us take a FULL backup.

BACKUP DATABASE [ReadOnlyDB]
TO  DISK = N'C:\Temp\ReadOnlyDB.bak'
WITH NOFORMAT, INIT,
NAME = N'ReadOnlyDB-Full Database Backup',
SKIP, NOREWIND, NOUNLOAD,  STATS = 10, CHECKSUM
GO
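Before dropping anything, it is also worth peeking inside the backup file to confirm what it contains. Below is a minimal sketch using RESTORE FILELISTONLY; the IsReadOnly column in its output should show 1 for the file that belongs to the read-only filegroup.

-- Inspect the files contained in the backup (sketch)
RESTORE FILELISTONLY
FROM DISK = N'C:\Temp\ReadOnlyDB.bak'
WITH FILE = 1
GO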

Next, we will drop the database and restore it from the backup.

-- Clean up time
USE MASTER
GO
DROP DATABASE ReadOnlyDB
GO
USE [master]
RESTORE DATABASE [ReadOnlyDB] FROM  DISK = N'C:\Temp\ReadOnlyDB.bak' WITH  FILE = 1,  NOUNLOAD,  STATS = 5
GO

In the above command, we restored the database to the same location. Now let us check the read_only attribute on the database files.

USE ReadOnlyDB
-- Check the status
SELECT type_desc, physical_name, is_read_only
FROM sys.database_files
GO

You can see that the file in our secondary filegroup is still marked as read_only.
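If you want to confirm the setting at the filegroup level as well, a quick query against sys.filegroups (a minimal sketch) does it:

USE ReadOnlyDB
GO
-- Check the read-only flag at the filegroup level
SELECT name, is_read_only
FROM sys.filegroups
GO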

There is one important learning from this experiment: database filegroups that are marked as read_only are included in a full backup, and when we restore such a backup, the setting is carried forward into the restored database.

I hope you learned something from this experiment as well. Are you using these concepts in your environments? Do let us know via the comments below.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – ReadOnly Databases with ReadOnly File Attribute

As part of this continued series on ReadOnly databases, an earlier blog received an interesting comment: some readers were getting error 5120 and related errors. This got me curious, and it had been sitting on my pending list of things to write about for a while.

One of the readers mailed me saying they had attached a database and it came up as ReadOnly. They didn't know why this happened, and when they tried to mark the database as ReadWrite, it gave them an error.

Here are the steps to reproduce the situation. I have the database files copied from a file share, and we will try to attach them using the command below:

USE [master]
GO
CREATE DATABASE [ReadOnlyDB] ON
( FILENAME = N'C:\ReadOnlyDB\ReadOnlyDB.mdf'),
( FILENAME = N'C:\ReadOnlyDB\ReadOnlyDB_log.ldf')
FOR ATTACH
GO

Note: In your case, the database files may be in a different folder.

As soon as the command is executed, we can see the following in Object Explorer:
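If you prefer to confirm this from T-SQL instead of Object Explorer, a minimal check against sys.databases looks like this (a sketch):

USE [master]
GO
-- Check whether the attached database came up read-only
SELECT name, is_read_only, state_desc
FROM sys.databases
WHERE name = N'ReadOnlyDB'
GO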

We can see that the database is marked as Read-Only on attach, which was strange. So, based on the earlier blog, let us try to make the database ReadWrite using the following command:

USE [master]
GO
ALTER DATABASE [ReadOnlyDB] SET  READ_WRITE WITH NO_WAIT
GO

Now this raised the following error:

Msg 5120, Level 16, State 101, Line 36
Unable to open the physical file “C:\ReadOnlyDB\ReadOnlyDB.mdf”. Operating system error 5: “5(Access is denied.)”.
Msg 5181, Level 16, State 5, Line 36
Could not restart database “ReadOnlyDB”. Reverting to the previous status.
Msg 5069, Level 16, State 1, Line 36
ALTER DATABASE statement failed.

The clue is hidden in the message itself: it says "Operating system error". So let us detach the database so we can change the file attributes at the OS level.

USE [master]
GO
EXEC MASTER.dbo.sp_detach_db @dbname = N'ReadOnlyDB'
GO

Next, look at the file system attributes of the data and log files and check for the Read-only attribute. In our example, the mdf file had this attribute enabled.

Go ahead and uncheck the "Read-only" attribute on the file. That is it. Now attach the database again after the change.

USE [master]
GO
CREATE DATABASE [ReadOnlyDB] ON
( FILENAME = N'C:\ReadOnlyDB\ReadOnlyDB.mdf'),
( FILENAME = N'C:\ReadOnlyDB\ReadOnlyDB_log.ldf')
FOR ATTACH
GO

This got us back to normal as far as our ReadOnly database is concerned. I hope you learned something new; I am learning something new almost every single day. Do let me know if you have run into similar lessons while working with ReadOnly databases in the past.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Database Stuck in “In Recovery” Mode After Restart

Read this blog and I am sure it will remind you of some instances in your own environments. This is one of the common issues I have observed while working with SQL Server over a long time. In this blog we will discuss a few details about the issue and the possible actions you might take.

Whenever SQL Server is restarted, all databases undergo the "recovery" process. This is the stage where the database is brought back online in a consistent state. There are three sub-phases within the process: Analysis, Roll forward (redo) and Rollback (undo). The names are fairly self-explanatory, but let me explain for those who are interested in the details:

  • Analysis: This is the phase where SQL Server reads through the LDF file and builds the in-memory structures needed to know how much work is required in the next two phases.
  • Roll forward (redo): At the time of shutdown, there might be transactions which were committed but not yet written to the MDF file via a checkpoint; these changes are replayed from the log and written to the data file.
  • Rollback (undo): Any transactions that were uncommitted at the time of shutdown have to be rolled back to bring the database to a consistent state.

When would we see a database in the "In Recovery" state?

  • Restart of SQL Server.
  • Taking a database offline and bringing it back online.
  • Restore of a database from backup.

All of the above trigger the "recovery" process of the database, and every database must go through the three phases explained earlier.
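If you want to see from T-SQL which databases are currently in recovery and roughly how far along they are, here is a minimal sketch; sys.dm_exec_requests reports a percent_complete value for the internal DB STARTUP command that performs recovery.

-- Databases currently not ONLINE (e.g., RECOVERING / RECOVERY_PENDING)
SELECT name, state_desc
FROM sys.databases
WHERE state_desc <> 'ONLINE';
GO
-- Approximate progress of recovery (sketch)
SELECT session_id, DB_NAME(database_id) AS database_name,
       command, percent_complete, estimated_completion_time
FROM sys.dm_exec_requests
WHERE command IN ('DB STARTUP', 'RESTORE DATABASE');
GO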

What should we do?

The very first thing I always check is the ERRORLOG. In the ERRORLOG, we should see the first message for the database (TestMe is the name of the database):

Starting up database ‘TestMe’.

This means the files are opened and recovery has started. After some time, you should see phase 1.

Recovery of database ‘TestMe’ (28) is 0% complete (approximately 37 seconds remain). Phase 1 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 3% complete (approximately 36 seconds remain). Phase 1 of 3. This is an informational message only. No user action is required.

Once phase 1 is complete, it moves on to phases 2 and 3, as shown below.

Recovery of database ‘TestMe’ (28) is 3% complete (approximately 36 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 0% complete (approximately 142 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 7% complete (approximately 19 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 15% complete (approximately 26 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 21% complete (approximately 25 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 27% complete (approximately 20 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 34% complete (approximately 19 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 41% complete (approximately 16 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 48% complete (approximately 14 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 55% complete (approximately 12 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 62% complete (approximately 10 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 69% complete (approximately 9 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 75% complete (approximately 7 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 82% complete (approximately 5 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 88% complete (approximately 3 seconds remain). Phase 2 of 3. This is an informational message only. No user action is required.
Recovery of database ‘TestMe’ (28) is 95% complete (approximately 1 seconds remain). Phase 3 of 3. This is an informational message only. No user action is required.

And once it completes, you should see something similar to this:

3807 transactions rolled forward in database ‘TestMe’ (28). This is an informational message only. No user action is required.
0 transactions rolled back in database ‘TestMe’ (28). This is an informational message only. No user action is required.
Recovery is writing a checkpoint in database ‘TestMe’ (28). This is an informational message only. No user action is required.
Recovery completed for database TestMe (database ID 28) in 30 second(s) (analysis 1289 ms, redo 29343 ms, undo 72 ms.) This is an informational message only. No user action is required

The analysis, redo and undo timings in the final message correspond to the three phases I explained earlier.
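To pull just these recovery messages out of the ERRORLOG without scrolling through everything, you can search it from T-SQL. Below is a minimal sketch using xp_readerrorlog (undocumented but commonly used; 0 = current log, 1 = SQL Server error log) – replace TestMe with your database name.

-- Search the current ERRORLOG for recovery messages of a database (sketch)
EXEC xp_readerrorlog 0, 1, N'Recovery of database', N'TestMe';
GO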

What are the possible causes?

  • Huge transaction log file.
  • SQL Server restarted during a long-running transaction.
  • Huge number of VLFs (a quick way to count them is shown after this list).
  • You might be hitting a bug that has been fixed in a later SQL Server build; the relevant KB articles are referenced below.
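To check whether an oversized VLF count may be a factor, you can count the VLFs in the log. A minimal sketch follows; sys.dm_db_log_info is available from SQL Server 2016 SP2 onward, while on older versions DBCC LOGINFO returns one row per VLF.

-- Count VLFs for a database (SQL Server 2016 SP2 and later)
SELECT COUNT(*) AS vlf_count
FROM sys.dm_db_log_info(DB_ID('TestMe'));
GO
-- On older versions: DBCC LOGINFO ('TestMe'); -- one row per VLF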

List of known issues

If you are running SQL Server 2005, 2008, 2008 R2 or 2012, please make sure you have applied the fixes given below.

http://support.microsoft.com/kb/2455009 (FIX: Slow performance when you recover a database if there are many VLFs inside the transaction log in SQL Server 2005, in SQL Server 2008 or in SQL Server 2008 R2)

http://support.microsoft.com/kb/2524743 (FIX: Recovery takes longer than expected for a database in a SQL Server 2008 or in a SQL Server 2008 R2 environment)

These fixes help speed up the phases of recovery. I hope this blog gives you a direction for looking at this SQL Server behavior.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – ReadOnly Databases and Notes Using Snapshots

This seems to be the third week that I am writing about ReadOnly databases; this blog is a continuation of last week's notes. After I published those notes, one of my blog readers asked whether it is possible to get the ReadOnly error without actually marking the database as ReadOnly. This got me thinking and I couldn't sleep that day. A quick call to a friend for ideas gave a simple answer – use Snapshots.

I couldn't contain my excitement, and that prompted me to write up this special-case scenario. As we execute the script, we will learn about the behavior step by step.

Let us start by creating our database which will be used for testing.

CREATE DATABASE [ReadOnlyDB]
CONTAINMENT
= NONE
ON  PRIMARY
( NAME = N'ReadOnlyDB', FILENAME = N'C:\Temp\ReadOnlyDB.mdf' , SIZE = 4024KB , FILEGROWTH = 1024KB )
LOG ON
( NAME = N'ReadOnlyDB_log', FILENAME = N'C:\Temp\ReadOnlyDB_log.ldf' , SIZE = 20480KB , FILEGROWTH = 10%)
GO
USE ReadOnlyDB
GO
-- Creating our table
CREATE TABLE tbl_SQLAuth (id INT, Longname CHAR(8000))
GO

Next let us go ahead and create a snapshot database out of our ReadOnlyDB.

-- Create a snapshot on our DB
CREATE DATABASE ReadOnlyDB_SS ON
( NAME = ReadOnlyDB, FILENAME =
'C:\Temp\ReadOnlyDB_data_1800.ss' )
AS SNAPSHOT OF ReadOnlyDB;
GO
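Once the snapshot exists, a quick look at sys.databases (a minimal sketch) confirms it is indeed a snapshot of our source database; source_database_id is non-NULL only for database snapshots:

-- Confirm the snapshot in the catalog
SELECT name, source_database_id
FROM sys.databases
WHERE name IN (N'ReadOnlyDB', N'ReadOnlyDB_SS');
GO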

Now that our snapshot database (ReadOnlyDB_SS) is created, it is important to know that snapshot databases are always created as ReadOnly. The behavior is similar to explicitly creating a ReadOnly database. To test this, let us try to insert some data into our snapshot DB.

USE ReadOnlyDB_SS
GO
INSERT INTO tbl_SQLAuth VALUES (1, 'SQLAuth');
GO

Now we will be presented with the same familiar error (3906):

Msg 3906, Level 16, State 1, Line 25

Failed to update database “ReadOnlyDB_SS” because the database is read-only.

Even though we have not explicitly marked the snapshot database as ReadOnly, we get this error. I was curious to know what would happen if we tried to mark the snapshot database as ReadOnly explicitly. The error we get is self-explanatory:

-- Let us set the Snapshot DB as READ_ONLY
USE MASTER
GO
ALTER DATABASE [ReadOnlyDB_SS] SET READ_ONLY
GO

Msg 5093, Level 16, State 1, Line 31

The operation cannot be performed on a database snapshot.

Msg 5069, Level 16, State 1, Line 31

ALTER DATABASE statement failed.

If that was easy, what happens if we try to set the snapshot database to READ_WRITE? The error is again self-explanatory:

USE [master]
GO
ALTER DATABASE [ReadOnlyDB_SS] SET  READ_WRITE WITH NO_WAIT
GO

Msg 5093, Level 16, State 1, Line 37

The operation cannot be performed on a database snapshot.

Msg 5069, Level 16, State 1, Line 37

ALTER DATABASE statement failed.

That gives us two learnings: we cannot mark a snapshot database as READ_ONLY, nor can we mark it as READ_WRITE.

With these learnings in place, let us do the cleanup. Make sure to drop the snapshot database before dropping the main database. Our cleanup script for this blog looks like this:

-- Clean up time
USE MASTER
GO
DROP DATABASE ReadOnlyDB_SS
GO
DROP DATABASE ReadOnlyDB
GO

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Configure, Monitor and Restore Backups from Mobile & Web browser

So tell me if this is a fantasy or a reality: You are on a beach, sipping on your cocktail and looking at the sea when you get a frantic call from your boss, “One of the junior DBAs has just deleted all of the data from a critical database”. You say “Hold on”, switch to a browser on your smartphone, go to your dashboard on SqlBak.com and click a Restore button on your latest transaction log backup. In a few seconds you get back to your boss and say: “I have fixed it”. Your boss is stunned thinking you are a magician and makes a note to give you a raise and a bonus while you continue enjoying the beach…

Well, the beach and the pay raise are a fantasy, but restoring your database in a couple of clicks through just a browser on your smartphone is a service you can use today – it is called SqlBak.

This is how it looks – you select a backup job, click Restore, and the latest backup will be restored to your SQL Server:

Let's see how to make it work

First – the boring stuff: go through a 10-second signup with your Facebook, Twitter or Google account. Then (and here's the secret to the magic of SqlBak working through a browser) download and install a small service program to run on your SQL Server computer – the SqlBak Client.

To connect this program to your SqlBak account, enter your "Secret Key" from the SqlBak Dashboard into the SqlBak Client (it will prompt you for it):

The service program (SqlBak Client) has practically no interface, and after connecting it to your SqlBak account you can forget about it (it will also auto-update). All of the configuration, monitoring and restore interfaces are online at SqlBak.com, not in the program directly.

Now the fun part begins.

Go to the SqlBak Dashboard and click the "Add New Job" button. Select the computer where you previously installed and connected the SqlBak Client, and authenticate with the SQL Server – exactly like you would in SSMS.

Press Continue and you get to the Job Settings page. Select the databases you want to back up:

Then press "Add Backup Destinations" to select where you want to store the backups – you have the option of sending them to a local/network folder, FTP, Amazon S3, Dropbox, Google Drive, MS OneDrive or Azure Storage:

I’ve selected to send backups to Amazon S3 and a Network Folder:

Let's schedule the full backup to run every 6 hours and the differential backup every hour:

By default it takes standard backup files and compresses them to zip – let's keep it like that:

You would probably want to receive a confirmation email at least on any failure – so fill out that field:

Now save the job and press Run Now to see if everything runs correctly:

You would see a log like this:

Now we can forget about it – it will email us if anything is wrong. If we come back to it in a few days, here is the kind of backup history we would see:

For each successful backup you have Download and Restore buttons. Let's click the Restore button to restore one of the differential backups. Select the destination to restore from (Amazon S3 or a network folder in our case):

You will see a restore progress bar:

In a few minutes your database is restored.

Absolutely brilliant! I know of no other product that would allow me to do the same from just a browser. It took quite a few pictures to describe the process, but in reality it takes just a few minutes and is pretty self-explanatory.

In addition to backups, this service monitors that your SQL Server is alive and emails you if it goes offline – very useful.

Companies that want to offer SQL Server backups as a private-label service can configure it so that it is branded to their company.

Some companies with strict security requirements may be hesitant to store SQL Server login credentials with SqlBak. I contacted the developers and they told me they are working on a version that gives users the choice of entering login credentials on the client – that way the credentials never leave the SQL Server machine.

Conclusion: I have seen many SQL backup programs before, but I am seriously impressed with the ease of using SqlBak – the only SQL backup software where you can configure, monitor and restore backups right from the browser.

Enter the promo code Pinal2015 on the Sign Up page and you will receive a $20 credit to your account.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Finding Top Offenders in SQL Server 2012 – Notes from the Field #068

[Note from Pinal]: This is the 68th episode of the Notes from the Field series. Performance tuning gets easier with SQL Server 2012. For the first time, you can capture a workload and find your top offenders inside Management Studio without having to write a single line of T-SQL or use any third-party tools.

In this episode of the Notes from the Field series, database expert John Sterrett (Group Principal at Linchpin People) explains a very common challenge DBAs and Developers face in their careers – how to find the top offenders in SQL Server. Linchpin People are database coaches and wellness experts for a data-driven world. Read John's experience in his own words.


One of the biggest mistakes with performance tuning is not knowing where to start. Sometimes performance tuning can be like finding a needle in a haystack, because you might not have a good process to help you find your top offenders. In the short five-minute video below you will see how to utilize Extended Events to aggregate and group completed statements so you can find your top offenders without having to write any T-SQL code.
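The video does everything through the SSMS GUI, but if you prefer to script the capture, here is a minimal sketch of an Extended Events session that collects completed statements to a file target (the session name and file path are placeholders); you can then aggregate the results by query text or duration. Stop the session with ALTER EVENT SESSION ... STATE = STOP when you are done capturing.

-- Minimal Extended Events session capturing completed statements (sketch)
CREATE EVENT SESSION [TopOffenders] ON SERVER
ADD EVENT sqlserver.sql_statement_completed
    (ACTION (sqlserver.sql_text, sqlserver.database_name))
ADD TARGET package0.event_file
    (SET filename = N'C:\Temp\TopOffenders.xel')
WITH (EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS);
GO
ALTER EVENT SESSION [TopOffenders] ON SERVER STATE = START;
GO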

Are your servers running at optimal speed or are you facing any SQL Server Performance Problems? If you want to get started with the help of experts read more over here: Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – The Basics of the Execute Package Task – Notes from the Field #067

[Note from Pinal]: This is a new episode of the Notes from the Field series. SQL Server Integration Services (SSIS) is one of the most essential parts of the entire Business Intelligence (BI) story. It is a platform for data integration and workflow applications.

In this episode of the Notes from the Field series I asked SSIS expert Andy Leonard a very crucial question – what are the basics of the Execute Package Task and where do we start with it? Andy was very kind to answer the question and provide plenty of information about how a novice developer can learn SSIS from the beginning and become an expert in the technology.


Learning how to configure and use the SQL Server Integration Services (SSIS) Execute Package Task gives you a great starting point for understanding SSIS package orchestration. I advocate writing small, functional SSIS packages that perform a unit of work. That's a great idea for many reasons, but it raises the question: "How do I string together the execution of these packages?" Remember: SSIS is a software development platform. With "SQL Server" included in the name, it is easy for people to mistake SSIS for a database tool or accessory, but Control Flow tasks put that confusion to rest.

SSIS provides several Control Flow tasks. Here is a list that provides a good approximation of which tasks I use most, from most-used to least-used:

In this article I provide a basic example of configuring the SSIS Execute Package Task, shown in Figure 1:


Figure 1: SSIS Execute Package Task

The Execute Package Task provides one way to implement an SSIS Design Pattern for SSIS package execution known as the Parent-Child pattern. When an SSIS package uses the Execute Package Task to start another SSIS package, the package with the Execute Package Task is called the Parent and the package started by the Execute Package Task is called the Child.

The Execute Package Task changed between SQL Server 2008 R2 Integration Services and SQL Server 2012 Integration Services. The changes support the SSIS 2012 (and 2014) Catalog. There is a new property called ReferenceType which defaults to “Project Reference” in packages executing in Project Deployment Mode (the default mode for building SSIS 2012 and SSIS 2014 SSIS packages), as shown in Figure 2:


Figure 2: Options for the Execute Package Task ReferenceType Property

Project Reference is used to execute an SSIS package – a Child package – in the same SSIS project as the package that contains the Execute Package Task – the Parent package. When Project Reference is selected, the next property in the property grid is PackageNameFromProjectReference, a dropdown containing a list of all the SSIS packages in the SSIS project.

Setting the ReferenceType property to External Reference is a way to execute SSIS packages that are stored in the file system or the msdb database. When External Reference is selected the next properties in the property grid change to reflect this backwards-compatible functionality, as shown in Figure 3:


Figure 3: Setting the ReferenceType Property to External Reference

The Location and Connection properties are used to specify an OLE DB Connection (to the msdb database) or a File Connection (to the location of the dtsx file). SQL Server locations also require the name of the SSIS package; the File System option does not because the SSIS package is the only thing in the file. The External Reference ReferenceType setting is useful for importing SSIS solutions that contain earlier versions of the Execute Package Task because External Reference behaves exactly like previous versions of the Execute Package Task.

The remaining properties in the property grid – Password and ExecuteOutOfProcess – are shared between ReferenceType options. Password is used if the SSIS package is password-protected (if the ProtectionLevel property of the SSIS package is set to either EncryptSensitiveWithPassword or EncryptAllWithPassword). ExecuteOutOfProcess is a setting that determines whether the package will be executed as part of the current process or a new process.

When executing SSIS packages in Project Deployment Mode (using the Project Reference ReferenceType), Parameter Bindings are enabled and allow values to be passed from the Parent package to the Child package, as shown in Figure 4:


Figure 4: Binding a Parent Package Parameter to a Child Package Parameter

Parameter Bindings are disabled when the ReferenceType property is set to External Reference.
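As a related aside: packages deployed to the SSIS Catalog can also be started and parameterized directly from T-SQL through the SSISDB catalog procedures, which is useful when orchestration happens outside a Parent package. The following is a minimal sketch; the folder, project, package and parameter names are placeholders.

-- Start a catalog-deployed package from T-SQL (sketch; names are placeholders)
DECLARE @execution_id BIGINT;
EXEC SSISDB.catalog.create_execution
     @folder_name  = N'MyFolder',
     @project_name = N'MyProject',
     @package_name = N'Child.dtsx',
     @execution_id = @execution_id OUTPUT;
-- Pass a value to a package parameter (object_type 30 = package parameter)
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id    = @execution_id,
     @object_type     = 30,
     @parameter_name  = N'ChildParameter',
     @parameter_value = N'SomeValue';
EXEC SSISDB.catalog.start_execution @execution_id = @execution_id;
GO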

Once configuration is complete, click the OK button to close the Execute Package Task Editor. You can test execution by pressing the F5 key or selecting “Start Debugging” from the SSIS dropdown menu. A successfully-executed Execute Package Task will appear as shown in Figure 5:


Figure 5: A Successful Execution!

The Execute Package Task drives a powerful data integration architecture pattern: Parent-Child execution. Using SSIS precedence constraints and Sequence Containers with the Execute Package Task, a data integration developer can build SSIS "driver" packages that call Child SSIS packages in any combination of parallel and serial execution.

If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)