Interview Question of the Week #024 – What is the Best Recovery Model?

Here is one of the most popular questions I see people asking on the internet.

Question: What is the best recovery model for my database?

Answer: Every situation is different and each situation has different needs for the recovery model.

SQL Server offers three recovery models: full recovery, simple recovery, and bulk-logged recovery. The recovery models determine how much data loss is acceptable and whether and how transaction logs can be backed up.
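
Before choosing, it helps to know where each database stands today. Here is a quick check against sys.databases (a convenience query, not part of the original question):

SELECT name, recovery_model_desc
FROM sys.databases;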

Select Simple Recovery Model if:
* Your data is not critical.
* Losing all transactions since the last full or differential backup is not an issue.
* Data is derived from other data sources and is easily recreated.
* Data is static and does not change often.

-- Set the recovery model to Simple ([YourDatabase] is a placeholder)
ALTER DATABASE [YourDatabase] SET RECOVERY SIMPLE;

Select Bulk-Logged Recovery Model if:
* Data is critical, but logging large data loads bogs down the system.
* Most bulk operations are done off hours and do not interfere with normal transaction processing.
* You need to be able to recover to a point in time (note that a log backup containing bulk-logged operations cannot be restored to a point in time within that backup).

-- Set the recovery model to Bulk Logged ([YourDatabase] is a placeholder)
ALTER DATABASE [YourDatabase] SET RECOVERY BULK_LOGGED;

Select Full Recovery Model if:
* Data is critical and no data can be lost.
* You always need the ability to do a point-in-time recovery.
* Bulk-logged activities are intermixed with normal transaction processing.
* You are using replication and need the ability to resynchronize all databases involved in replication to a specific point in time.

-- Set the recovery model to Full ([YourDatabase] is a placeholder)
ALTER DATABASE [YourDatabase] SET RECOVERY FULL;

You can switch from any recovery model to another recovery model, but prior to or after the switch, you may need to issue additional transaction log or full backups to ensure you have a complete backup set.
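
For example, here is a minimal sketch of moving a database (the name [YourDatabase] and the paths are placeholders) from simple to full recovery; the full backup taken right after the switch is what starts the log backup chain:

ALTER DATABASE [YourDatabase] SET RECOVERY FULL;
-- A full backup after the switch starts the log backup chain
BACKUP DATABASE [YourDatabase] TO DISK = 'C:\Backups\YourDatabase.bak';
-- Only now will transaction log backups succeed and enable point-in-time recovery
BACKUP LOG [YourDatabase] TO DISK = 'C:\Backups\YourDatabase.trn';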

Reference: Pinal Dave

SQL SERVER – Know Your Backup Before Deleting Database

Early last week I wrote a blog, SQL SERVER – FIX – Msg 3702, Level 16, State 3 – Cannot Drop Database “DB_Name” Because it is Currently in Use. The premise of that blog was driven by some of the demos I show at conferences. During one of the UG meets, I met my good friend Balmukund doing something different, and that inspired me to write that blog.

When I wrote that, many people wrote back stating it can be dangerous, etc. I sort of agree, if you are on a production box. As my use case was based on a demo environment for sessions, I thought less about it. Having said that, I immediately thought it would be good to back it up with a blog that will prevent us from getting into trouble.

So I wrote back to my friend who emailed me about the dangers. Yes, we need to be careful while dropping databases, but we should also be prepared for contingencies. If you have taken a backup of your database, make sure to check that it is available by querying msdb. A typical script would look like this:

SELECT msdb.dbo.backupset.database_name,
msdb.dbo.backupset.backup_start_date,
msdb.dbo.backupset.backup_finish_date,
CASE msdb.dbo.backupset.type
WHEN 'D' THEN 'Database'
WHEN 'I' THEN 'Differential'
WHEN 'F' THEN 'Filegroup'
WHEN 'L' THEN 'Log'
END AS backup_type,
msdb.dbo.backupmediafamily.physical_device_name
FROM msdb.dbo.backupmediafamily
INNER JOIN msdb.dbo.backupset
ON msdb.dbo.backupmediafamily.media_set_id = msdb.dbo.backupset.media_set_id
-- Add the WHERE condition if you want it for a specific database
-- WHERE msdb.dbo.backupset.database_name = 'AdventureWorks2014'
ORDER BY msdb.dbo.backupset.backup_finish_date


The above script is a classic way to find all the backups taken for a given database, or for all databases on a given instance.

As a best practice, I would like to check these msdb backup history tables to confirm backups were taken on every single database before working on them. I am sure that, as seasoned DBAs, you are all aware of this important step when working on production databases.
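
As a rough sketch of that check (using only the msdb history tables queried above), this lists databases that have never appeared in the backup history:

SELECT d.name
FROM sys.databases d
LEFT JOIN msdb.dbo.backupset b
ON d.name = b.database_name
WHERE b.database_name IS NULL
AND d.name <> 'tempdb' -- tempdb cannot be backed up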

I know each organization has a restore strategy, so can you let me know about your restore strategies for critical databases? When do you take FULL and differential backups, and how often do you take TLog backups in your production environments? Sharing this is a great way to tell the blog readers about typical usage patterns.

Reference: Pinal Dave

SQL SERVER – Backup Failure – Error: 3203, Severity: 16, State: 1

In the recent past, I have been writing about different error messages in SQL Server that one encounters. Thanks to my blog readers who have been sharing a lot of errors with me, either by email or in blog comments, and asking for solutions. If I find something interesting and helpful to others, I write a blog about it. This is one such blog.

2015-05-05 10:00:00.440 Backup       Error: 3203, Severity: 16, State: 1.
2015-05-05 10:00:00.440 Backup       Read on “M:\MSSQL\TLog\Production_1.ldf” failed: 1(Incorrect function.)
2015-05-05 10:00:00.440 Backup       Error: 3041, Severity: 16, State: 1.
2015-05-05 10:00:00.440 Backup       BACKUP failed to complete the command BACKUP LOG ProductionDB. Check the backup application log for detailed messages.

When I was searching through other emails with this error message, I found another one, shown below.

2014-01-05 16:07:38.19 Backup   Error: 3203, Severity: 16, State: 1.
2014-01-05 16:07:38.19 Backup   Read on “M:\DatabaseName_DefaultData.ndf” failed: 1117(failed to retrieve text for this error. Reason: 15100)

This error would vary based on the actual cause. In the first error message, it is operating system error 1, and in the second case, it is operating system error 1117. To convert any operating system error number to text, we can use the Windows net command (net helpmsg) from the command prompt, as shown below:

C:\>net helpmsg 1

Incorrect function.

If you notice the second message, we are not seeing the text of message 1117; instead we are seeing “failed to retrieve text for this error. Reason: 15100”.

What should you do in case of such an error?

You should start looking at the system and application event logs based on the operating system error. The example errors which I have shown above are due to hardware-level issues. Here is what we found in the event log:

Log Name: System
Source: Disk
Date: 5/5/2015 09:32:11 AM
Event ID: 51
Task Category: None
Level: Warning
Keywords: Classic
User: N/A
Computer: SQLServerMachine
Description: An error was detected on device \Device\Harddisk4\DR4 during a paging operation.

Here is an article which explains event ID 51.

Have you ever seen any similar errors? Put them in the comments section to help others and share your knowledge. The best thing about these errors is that we get to learn interesting concepts from each other.

Reference: Pinal Dave

SQL SERVER – Recovering from Snapshots – Notes from the Field #078

[Note from Pinal]: This is the 78th episode of the Notes from the Field series. What do you do when data is deleted accidentally or you are facing a disaster? Well, there are plenty of things you can do, but when in panic, quite often people make mistakes which just intensify the disaster. Database snapshots are an important but less frequently used feature.


In this episode of the Notes from the Field series database expert John Sterrett (Group Principal at Linchpin People) explains a very interesting subject of how to recover the database from snapshots. Linchpin People are database coaches and wellness experts for a data driven world. Read the experience of John in his own words.

One of the most common – and most forgotten – scenarios in disaster recovery plans is data being updated or deleted accidentally. This surprises me because, in the field, I see accidental data changes as a common disaster. If you have enterprise edition you can leverage database snapshots as a step in your disaster recovery plan to help recover faster, or at least allow you to pull back the majority of your data while you concurrently restore backups (to bring the rest of your data back). Restoring most of your data quickly can be critical with very large databases (VLDBs), especially if you have tight Recovery Time Objectives (RTOs).

Database snapshots require sparse files and store the original pages in these files when they are changed. Therefore, you will want to manage the sparse files and make sure you purge database snapshots as needed. You might also be using database snapshots without even knowing it. If you are doing database integrity checks with the default values, a special snapshot is taken in the background and your integrity checks are executed against that snapshot.
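
By the way, if you want to see which snapshots currently exist on an instance, sys.databases can tell you; this little query is my sketch, not from the original article:

SELECT name, source_database_id
FROM sys.databases
WHERE source_database_id IS NOT NULL; -- only snapshots reference a source database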


The following example uses the AdventureWorks2014 sample database. We need to use the CREATE DATABASE syntax, include the names of your data files, and include the file location of the sparse files. Finally, include AS SNAPSHOT OF database syntax to define the database as a snapshot.

CREATE DATABASE [AW2014_Snapshot_morning] ON
( NAME = AdventureWorks2014_Data, FILENAME =
N'C:\Temp\AW2014_Snapshot_morning.ss' ) -- sparse file path shown here is illustrative
AS SNAPSHOT OF AdventureWorks2014;


Here is a common case of a mistaken DELETE statement. We have a DELETE statement with the primary key included to delete a single row. By mistake, we didn't highlight the filter, so all the rows will be deleted, as sketched below.
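
A sketch of that mistake (the table matches the restore example later in this post; the key value is hypothetical):

-- Intended: delete a single row by its primary key
-- DELETE FROM Person.EmailAddress WHERE BusinessEntityID = 1;
-- Actually executed: only the first line was highlighted, so the filter never ran
DELETE FROM Person.EmailAddress;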

If you execute this statement, you will see an unexpected result: we deleted all rows in the table as shown below.

(19972 row(s) affected)


You have two options if you created a database snapshot earlier.

First, you could insert the data back from your snapshot database as shown below. This could be done with SSIS, BCP, or many other tools. For this quick example we will do it with an INSERT INTO SELECT statement.

INSERT INTO Person.EmailAddress (BusinessEntityID, EmailAddressID, EmailAddress, rowguid, ModifiedDate)
SELECT BusinessEntityID, EmailAddressID, EmailAddress, rowguid, ModifiedDate
FROM AW2014_Snapshot_morning.Person.EmailAddress
-- If EmailAddressID is an identity column, wrap this in SET IDENTITY_INSERT Person.EmailAddress ON/OFF

Second, you can revert the database from the snapshot. Keep in mind this second option will revert all data changes in the database, not just the data deleted by your accidental statement. Note also that SQL Server will only revert from a snapshot when it is the only snapshot of that database.

RESTORE DATABASE AdventureWorks2014
FROM DATABASE_SNAPSHOT = 'AW2014_Snapshot_morning';

From these examples, you can see database snapshots are a tool to help you recover data quickly. Please note that you wouldn’t want database snapshots to be your sole plan for disaster recovery and unplanned data change statements. If your snapshot or the original database suffers corruption, you wouldn’t be able to recover. So make sure you add snapshots into your existing disaster recovery plans, which should – at a minimum – include database backups to give you a better recovery point objective.

Are your servers running at optimal speed or are you facing any SQL Server Performance Problems? If you want to get started with the help of experts read more over here: Fix Your SQL Server.

Reference: Pinal Dave

SQL SERVER – How to View the Dirty Pages In Memory of a Database?

I always get an opportunity to talk on topics that are basic from time to time. As I was preparing for GIDS 2015 (Great Indian Developer Summit) this year, I vividly remembered one of the conversations I had with an attendee last year at our Pluralsight booth. Most of these conversations are forgotten, but this one was unique. During this conversation, the attendee was trying to learn the basics and was curious to listen to me. I always get carried away and spend quite some time with folks who are eager to learn. In this context, I was talking about how RDBMSs use the concept of the Write-Ahead Transaction Log for maintaining ACID properties. The response I got was interesting – “Hey Pinal, is there a way to know which of the pages in memory are currently dirty? Do you know how to get this?”

As I was preparing for this year’s session, this question was on top of my mind and I thought let me take a moment in this blog to show you how this can be achieved. Again, everything we do inside SQL Server can be accessed by the Dynamic Management views and it is important for us to know which ones to use.

T-SQL Scripts

Let us create a dummy database for the test, and enable a trace flag for this demo.

CREATE DATABASE DirtyPagesDB
GO
USE DirtyPagesDB
GO
-- Disable automatic checkpoint so that data isn't flushed
DBCC TRACEON(3505, -1);

Let us create a table and enter some values. We will also flush the pages using CHECKPOINT.

CREATE TABLE t1 (Speaker_Bio CHAR(8000))
INSERT INTO t1 VALUES ('SQL'),('Authority')
CHECKPOINT

Let us check if any pages are currently dirty in our buffer. Since we have just issued a checkpoint, the below query will return no rows.

-- Get the rows of dirtied pages
SELECT
database_name = d.name,
object_name =
CASE au.type
WHEN 1 THEN o1.name
WHEN 2 THEN o2.name
WHEN 3 THEN o1.name
END,
index_id =
CASE au.type
WHEN 1 THEN p1.index_id
WHEN 2 THEN p2.index_id
WHEN 3 THEN p1.index_id
END,
bd.page_id,
bd.page_type
FROM sys.dm_os_buffer_descriptors bd
INNER JOIN sys.databases d
ON bd.database_id = d.database_id
INNER JOIN sys.allocation_units au
ON bd.allocation_unit_id = au.allocation_unit_id
LEFT JOIN sys.partitions p1
ON au.container_id = p1.hobt_id
LEFT JOIN sys.partitions p2
ON au.container_id = p2.partition_id
LEFT JOIN sys.objects o1
ON p1.object_id = o1.object_id
LEFT JOIN sys.objects o2
ON p2.object_id = o2.object_id
WHERE bd.is_modified = 1
AND d.name = 'DirtyPagesDB'
AND (o1.name = 't1'
OR o2.name = 't1')

Since we got no rows, let us insert a row to check.

INSERT INTO t1 VALUES ('Pinal') -- any value works; it just needs to dirty a page
Go ahead and run the above big query again. You are likely to see two rows, one for the IAM page and one for the data page. If you run a CHECKPOINT, these will also get flushed to the data disk.
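
For instance, you can verify the flush directly:

CHECKPOINT
-- Re-run the buffer descriptor query above; the two dirty pages should now be gone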

-- Clean up
USE master
DBCC TRACEOFF(3505, -1);
DROP DATABASE DirtyPagesDB

This topic was on my mind for a long time to research and write about. I am sure this will be a good learning for my blog readers too. Do let me know if this was an interesting learning for you today.

Reference: Pinal Dave

SQL SERVER – Script – Removing Multiple Databases from Log Shipping

Have you ever come across a situation where you have a large number of databases in log shipping and you have to remove all of them? If you use SQL Server Management Studio, it would take a long time, because you have to right-click each database, go to Properties, choose the Transaction Log Shipping tab, choose remove for the secondary, uncheck the box, and then hit OK. Though monotonous, these steps are painful when the number of databases is really huge.

In the background, it executes stored procedures to remove the metadata from the log shipping related tables in MSDB database.
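
If you want to peek at that metadata before removing anything, these are the same msdb tables the generation scripts below read from:

SELECT * FROM msdb.dbo.log_shipping_primary_databases   -- on the primary
SELECT * FROM msdb.dbo.log_shipping_primary_secondaries -- on the primary
SELECT * FROM msdb.dbo.log_shipping_secondary_databases -- on the secondary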

Below is the sample which runs on the primary. I have changed the input parameters.

-- primary
EXEC MASTER.dbo.sp_delete_log_shipping_primary_secondary
@primary_database = N'Primary_Database_Name'
,@secondary_server = N'Secondary_Server_Name'
,@secondary_database = N'Secondary_Database_Name'
EXEC MASTER.dbo.sp_delete_log_shipping_primary_database @database = N'Primary_Database_Name'

And below is what runs on the secondary (here also I have changed the input parameter).

-- secondary
EXEC MASTER.dbo.sp_delete_log_shipping_secondary_database
@secondary_database = N'Secondary_Database_Name'

Essentially, if we want to remove log shipping, we need the primary database name, the secondary database name, and the secondary server name. I have used the metadata tables to find those details.

-- Script for removal of Log Shipping from primary
DECLARE @ExecString NVARCHAR(MAX)
SELECT @ExecString = 'EXEC master.dbo.sp_delete_log_shipping_primary_secondary
@primary_database = N'''
+ pd.primary_database + '''
,@secondary_server = N'''
+ ps.secondary_server + '''
,@secondary_database = N'''
+ ps.secondary_database + ''''
FROM msdb.dbo.log_shipping_primary_secondaries ps
INNER JOIN msdb.dbo.log_shipping_primary_databases pd
ON ps.primary_id = pd.primary_id
SELECT @ExecString
SELECT @ExecString = 'EXEC master.dbo.sp_delete_log_shipping_primary_database @database = N''' + primary_database + ''''
FROM msdb.dbo.log_shipping_primary_databases
SELECT @ExecString
-- Note: each SELECT @ExecString shows only the last row; with many databases, run this per database or turn it into a cursor

Once you run the script, you will get the output with EXECUTE statements; just copy, paste, and run them in a new query window.

Here is a similar script for the secondary server.

-- Script for removal of LS from Secondary
DECLARE @ExecString NVARCHAR(MAX)
SELECT @ExecString = 'EXEC master.dbo.sp_delete_log_shipping_secondary_database @secondary_database = N''' + secondary_database + ''''
FROM msdb.dbo.log_shipping_secondary_databases
SELECT @ExecString

Note: The above scripts generate commands which you need to copy and paste into a new query window on the respective servers. Please verify the output before running it on a production server.

Trying to automate and use the power of T-SQL is one of the best things I have always enjoyed. Do let me know if you have done these sorts of things in your environments. Can you share some of those experiences?

Reference: Pinal Dave

SQL SERVER – The Basics of the File System Task – Part 2 – Notes from the Field #075

[Note from Pinal]: This is a new episode of the Notes from the Field series. SQL Server Integration Services (SSIS) is one of the most essential parts of the entire Business Intelligence (BI) story. It is a platform for data integration and workflow applications.


In this episode of the Notes from the Field series I asked SSIS Expert Andy Leonard a very crucial question – What are the basics of the File System Task and where do we start with it? Andy was very kind to answer the questions and provide plenty of information about how a novice developer can learn SSIS from the beginning and become an expert in the technology.

Many data integration scenarios involve reading data stored in flat files or performing extracts from a relational (or legacy) system into flat files. Learning how to configure and use the SQL Server Integration Services (SSIS) File System Task will support your efforts when loading data to and from flat files. In a previous article, I described configuring the File System Task to archive a file. In this article, I will repeat the exercise, but I will add flexibility (and complexity – the two always go together) by using SSIS Variables to manage the Source File and Destination Directory locations. This article is an edited version of The Basics of the File System Task, Part 1. I chose to write it this way for those who find this article but haven’t read Part 1.

Remember: SSIS is a software development platform. With “SQL Server” included in the name, it is easy for people to confuse SSIS as a database tool or accessory, but Control Flow Tasks put that confusion to rest.

SSIS provides several Control Flow tasks. Here is a list that provides a good approximation of which tasks I use most, from most-used to least-used:

In this article I provide an advanced example of configuring the SSIS File System Task, shown in Figure 1:

Figure 1: SSIS File System Task

The File System Task provides one way to implement an SSIS Design Pattern for source file archival. When you first open the File System Task Editor, you will note several properties in the property grid. Whenever you see an Operation property in an SSIS task editor, know that that property drives the other property selections. Options for the Operation property of the SSIS File System Task are shown in Figure 2:
Figure 2: SSIS File System Task Operation Property Options

The Operation options are:

  • Copy directory
  • Copy file (default)
  • Create directory
  • Delete directory
  • Delete directory content
  • Delete file
  • Move directory
  • Move file
  • Rename file
  • Set Attributes

I stated the Operation property drives the other property selections. Take a look at the File System Task Editor when I change the Operation option from “Copy file” (Figure 2) to “Delete file” as shown in Figure 3:
Figure 3: The File System Task Editor with the “Delete file” Operation Selected

See? There are fewer properties required for the “Delete file” operation. The available properties differ even more for the “Set Attributes” operation, shown in Figure 4:
Figure 4: The File System Task Editor with the “Set Attributes” Operation Selected

The Operation property changes the editable properties, exposing some and hiding others. With flexibility comes complexity. Even though the File System Task is complex, I've found the task is stable and extremely useful. Let's look at a practical example: using the File System Task to archive a flat file.

To begin configuring the SSIS File System Task for file archival, select the “Move file” operation as shown in Figure 5:
Figure 5: SSIS File System Task with the “Move file” Operation Selected

Using the IsSourcePathVariable and IsDestinationPathVariable properties extends the flexibility of the File System Task and further changes the list of available properties in the property grid, as shown in Figure 6:
Figure 6: Opting to Use Variables for Source and Destination Paths

Note the SourceConnection and DestinationConnection properties are hidden and the SourceVariable and DestinationVariable properties are available in their place. Click the SourceVariable property dropdown, and click “<New variable…>” as shown in Figure 7:
Figure 7: Selecting “<New variable…>” from the SourceVariable Property

When the Add Variable window displays, enter “SourceFilePath” for the variable name property and a full path to your source file in the Value textbox, as shown in Figure 8:
Figure 8: Configuring the SourceFilePath SSIS Variable

Click the OK button to close the Add Variable window and return to the File System Task Editor. Click the DestinationVariable property dropdown, and then click “<New variable…>” to open a new Add Variable window. Configure the new variable by setting the Name property to “DestinationFolder” and the Value property to a location you wish to move the file, as shown in Figure 9:
Figure 9: Configuring the DestinationFolder SSIS Variable

Click the OK button to close the Add Variable window and return to the File System Task Editor. You have configured an SSIS File System Task to move a file using SSIS Variables to manage the source and destination of the file, as shown in Figure 10:
Figure 10: An SSIS File System Task Configured to Move a File Using SSIS Variables

The SSIS File System Task is now configured to archive a file. Let’s test it! Click the OK button to close the File System Task Editor. Press the F5 key or select SSIS->Start Debugging to test your work. My result is shown in Figure 11:
Figure 11: Successful Test Execution of the SSIS File System Task

Viewing the source and destination directories, we see the file was successfully moved – shown in Figure 12:
Figure 12: The File, Moved!

One tricky part when configuring the SSIS File System Task to move a file is realizing that you need to select the actual file for the source and the directory for the destination.

As I stated earlier, the SSIS File System Task is powerful, flexible, and robust. This article has demonstrated another way you can use the File System Task to archive files. Archiving files after loading the data they contain is a common practice in data integration.

If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave

SQL SERVER – Using MaxTransferSize parameter with SQL Server Backups

Of late, I have been writing a lot about SQL Server backups and administration topics. I have seen a lot of my readers enjoying some of the deep conversations I have with people in the industry. As I always say, most of the blog posts are inspired by someone asking me a question, me doing the research, and then reaching out to them with a solution. This blog post is no different in that aspect.

Recently I was fortunate to be in Delhi to deliver a session to one of our customers. As often happens when people see me around, one of the DBAs came and asked me an interesting question. The organization had bought new EMC storage, and one of the recommendations was to use a transfer size of 256 KB. My first instinct – “Is it? I didn't know that.” I said I would get back to them after I reached the hotel and did my bit of research.

After reading a good deal of documentation and recommendations on various sites, I got to know that the optimal value for the MaxTransferSize parameter for SQL Server backups on EMC storage systems is 256 KB, while by default SQL Server uses a value of 1,024 KB (1 MB). To promote storage array stability and performance, full database, differential, and transaction log backups must be performed with MaxTransferSize specified at 262,144 bytes (256 KB).

Well, that was easier said than done. I now wanted to know how this can be done because the DBA wanted to automate this using their scripting inside a SQL Server Agent Job. I turned to MSDN for help and as always it didn’t disappoint me.

Below are some sample T-SQL Backup commands that are optimized for this requirement:

/* Full database backup */
BACKUP DATABASE AdventureWorks TO DISK = 'C:\MS SQL\Bakup Files\AdventureWorks.bak'
WITH MAXTRANSFERSIZE = 262144
/* Differential database backup */
BACKUP DATABASE AdventureWorks TO DISK = 'C:\MS SQL\Bakup Files\AdventureWorks.diff'
WITH DIFFERENTIAL, MAXTRANSFERSIZE = 262144
/* Transaction Log backup */
BACKUP LOG AdventureWorks TO DISK = 'C:\MS SQL\Bakup Files\AdventureWorks.trn'
WITH MAXTRANSFERSIZE = 262144

To my blog readers, I would like to know if you have ever used these settings in your environments. What type of storage are you using and what values are you using for MAXTRANSFERSIZE parameter? Have you seen any performance improvements while using the same? Have you seen any difference in behavior while using these parameters? Do let me know via the comments section.

Finally, as I wrap up – I always talk about the power of using T-SQL over the UI, and when I write such blog posts, this just gets better. I need to learn the fine print of using the T-SQL commands and all their parameters in the future. Rest assured, you are going to see more of these on this blog for sure.

Reference: Pinal Dave

SQL SERVER – Backup and Restore Behavior of ReadOnly Filegroup Databases

Last week I wrote about SQL SERVER – Marking Filegroup as ReadOnly with SQL Server, and it got me interested in a few key questions. One of the questions someone asked was: how will backups behave? Is there anything I need to know about ReadOnly filegroups? Will these databases, when restored, carry this setting along? So in this blog post, let me take a few simple steps through the learning journey I had when working with ReadOnly filegroups.

T-SQL Scripts

Let us start out by creating the database first.

CREATE DATABASE ReadOnlyDB ON PRIMARY
( NAME = N'ReadOnlyDB', FILENAME = N'C:\Temp\ReadOnlyDB.mdf' , SIZE = 4024KB , FILEGROWTH = 1024KB ),
FILEGROUP [ReadOnlyDB_FG] -- filegroup name assumed from the file name; adjust to your setup
( NAME = N'ReadOnlyDB_FG', FILENAME = N'C:\Temp\ReadOnlyDB_FG.ndf' , SIZE = 4096KB , FILEGROWTH = 1024KB )
LOG ON
( NAME = N'ReadOnlyDB_log', FILENAME = N'C:\Temp\ReadOnlyDB_log.ldf' , SIZE = 20480KB , FILEGROWTH = 10%)
-- Mark the filegroup read-only
ALTER DATABASE ReadOnlyDB MODIFY FILEGROUP [ReadOnlyDB_FG] READ_ONLY

I have gone ahead and marked the filegroup as Read_Only. Next, I am going ahead with a FULL backup.

BACKUP DATABASE ReadOnlyDB
TO DISK = N'C:\Temp\ReadOnlyDB.bak'
WITH FORMAT, STATS = 10,
NAME = N'ReadOnlyDB-Full Database Backup'

Next we will drop the database and restore the same.

-- Clean up time
USE [master]
DROP DATABASE ReadOnlyDB
RESTORE DATABASE ReadOnlyDB FROM DISK = N'C:\Temp\ReadOnlyDB.bak'

In the above commands, we dropped the database and restored it back to the same location. Now let us go ahead and check the filegroup's setting for the read_only attribute.

USE ReadOnlyDB
-- Check the status
SELECT type_desc, physical_name, is_read_only
FROM sys.database_files

You can see our secondary filegroup is still marked as read_only.
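
The same can be confirmed at the filegroup level; here is a quick sketch using sys.filegroups while connected to the restored database:

SELECT name, is_read_only
FROM sys.filegroups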


There is one important learning that I got from this experiment. The read_only setting on database filegroups is retained as part of a full backup, and when we restore such a backup, the setting is carried forward into the restored database.
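
You can even see this from the backup file itself, without restoring it. RESTORE FILELISTONLY reports an IsReadOnly flag for each file in the backup:

RESTORE FILELISTONLY FROM DISK = N'C:\Temp\ReadOnlyDB.bak'
-- The IsReadOnly column shows 1 for the file in the read-only filegroup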

I hope you got some learning out of this experiment. Are you using these concepts in your environments? Do let us know via the comments below.

Reference: Pinal Dave

SQL SERVER – ReadOnly Databases with ReadOnly File Attribute

As part of the continued series on ReadOnly databases, the earlier blog had an interesting comment that some people were getting error 5120 and others. This got me curious, and it had been on my pending list of solutions in some way.

One of the readers mailed me saying that they attached a database and it came up as ReadOnly. They didn't know why this happened, and when they tried to mark the database as ReadWrite, it gave them an error.

Here are the steps to reproduce the situation. I have the database files in a fileshare. We will try to attach the same using the below command:

USE [master]
CREATE DATABASE ReadOnlyDB ON
( FILENAME = N'C:\ReadOnlyDB\ReadOnlyDB.mdf'),
( FILENAME = N'C:\ReadOnlyDB\ReadOnlyDB_log.ldf')
FOR ATTACH

Note: In your case, you can have database files on a different folder.

As soon as the command is executed, we can see in the object explorer the following:


We can see that the database is marked as Read-Only on attach. This was strange. So, based on the blog, let us try to make the database ReadWrite using the following command:

USE [master]
ALTER DATABASE [ReadOnlyDB] SET READ_WRITE WITH NO_WAIT

Now this raised the following error:

Msg 5120, Level 16, State 101, Line 36
Unable to open the physical file “C:\ReadOnlyDB\ReadOnlyDB.mdf”. Operating system error 5: “5(Access is denied.)”.
Msg 5181, Level 16, State 5, Line 36
Could not restart database “ReadOnlyDB”. Reverting to the previous status.
Msg 5069, Level 16, State 1, Line 36
ALTER DATABASE statement failed.

The catch to the error is hidden in the message. It says “Operating system error”. So let us go ahead and detach the DB to change the OS File attributes.

USE [master]
EXEC MASTER.dbo.sp_detach_db @dbname = N'ReadOnlyDB'

Next, get to the file system attributes for the database and log files and check for the ReadOnly attribute. In our example, the mdf file had this attribute enabled.


Please go ahead and uncheck the “Read-only” attribute on the file. That is it. Now attach the database after the change.
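
If you prefer the command prompt over Explorer, the same attribute can be cleared with the Windows attrib command (shown here for the mdf; repeat for any other flagged file):

attrib -R "C:\ReadOnlyDB\ReadOnlyDB.mdf"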

USE [master]
CREATE DATABASE ReadOnlyDB ON
( FILENAME = N'C:\ReadOnlyDB\ReadOnlyDB.mdf'),
( FILENAME = N'C:\ReadOnlyDB\ReadOnlyDB_log.ldf')
FOR ATTACH

This got us back to normal life as far as our ReadOnly database is concerned. Hope you learnt something new, as I am learning something new almost every single day. Do let me know if you have had a chance to learn this much working with ReadOnly databases in the past.

Reference: Pinal Dave