SQL SERVER – Delayed Durability Database Level and Transaction Level

Earlier I wrote three articles on the concept of Delayed Durability. You can read them over here.

After reading an earlier blog post, a user asked me how the database-level and transaction-level durability settings interact with each other when they conflict. MSDN already provides a very detailed summary of the options and their interaction over here.

Though the grid is quite simple, I see a lot of people getting confused by its details, so I will attempt to simplify it here.

  • If the transaction is cross-database or distributed, the durability settings do not matter; it is always a fully durable transaction.
  • If the transaction is a local, database-level transaction –
    • It is fully durable when the database setting for delayed durability is Disabled.
    • It is delayed durable when the database setting for delayed durability is Forced.
    • When the database setting is Allowed, it is delayed durable only if the transaction-level setting is ON as well (see the example after this list).
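
To make the Allowed case concrete, here is a minimal hedged sketch. The database name DelayedDurability and the table DummyTable are assumptions for illustration only; with the database set to ALLOWED, the clause on the COMMIT decides the durability of each transaction.

-- Minimal sketch (database and table names are hypothetical)
ALTER DATABASE [DelayedDurability] SET DELAYED_DURABILITY = ALLOWED WITH NO_WAIT;
GO
USE [DelayedDurability];
GO
BEGIN TRAN;
INSERT INTO DummyTable VALUES ('100');
-- Delayed durable only because the database is set to ALLOWED;
-- with DISABLED this commit would be fully durable,
-- with FORCED it would be delayed durable even if the clause said OFF.
COMMIT WITH (DELAYED_DURABILITY = ON);
GO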

Well, if I had to make my own grid, I would make it as follows for easy understanding.

 

Please note that the durability of a cross-database or distributed transaction is always fully durable, so I have not included it in the image matrix.

Reference: Pinal Dave (http://blog.sqlauthority.com)


SQL SERVER – Simple Example of Delayed Durability

Earlier I wrote two blog posts on Delayed Durability. Please read them before continuing with this example, as they give good background on the concept.

In this post we will walk through a working example of the same.

First, we will create a database named DelayedDurability and set its delayed durability option to ALLOWED. Right after that we will create a dummy table inside it. Next we will create two stored procedures: one that commits with delayed durability enabled and one without it. We will then see how much time each stored procedure takes to complete the same task.

Task 1: Create database with delayed durability

-- Script for Delayed Durability
USE MASTER
GO
-- Create New Database
CREATE DATABASE [DelayedDurability]
GO
-- A throwaway full backup to the NUL device establishes the backup chain,
-- so the new database's log no longer auto-truncates as in SIMPLE recovery
BACKUP DATABASE [DelayedDurability] TO DISK = 'NUL';
GO
-- Let us set Delayed Durability at DB level
USE [master]
GO
ALTER DATABASE [DelayedDurability] SET DELAYED_DURABILITY = ALLOWED WITH NO_WAIT
GO

Task 2: Create a dummy table

-- Creating Dummy table
USE [DelayedDurability]
GO
CREATE TABLE DummyTable
(ID INT IDENTITY PRIMARY KEY CLUSTERED,
SALARY VARCHAR(100))
GO

Task 3: Create two Stored Procedures

The first stored procedure is created without delayed durability enabled.

-- First SP with Delayed Durability OFF
CREATE PROCEDURE Simple_Insert
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @counter AS INT = 0;
    DECLARE @start DATETIME;
    SELECT @start = GETDATE();
    WHILE (@counter < 100000)
    BEGIN
        BEGIN TRAN;
        INSERT INTO DummyTable VALUES (@counter);
        SET @counter = @counter + 1;
        COMMIT WITH (DELAYED_DURABILITY = OFF);
    END
    SELECT DATEDIFF(SECOND, @start, GETDATE()) AS [Simple_Insert in sec];
END
GO

The second stored procedure is created with delayed durability enabled.

-- Second SP with Delayed Durability ON
CREATE PROCEDURE DelayedDurability_Insert
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @counter AS INT = 0;
    DECLARE @start DATETIME;
    SELECT @start = GETDATE();
    WHILE (@counter < 100000)
    BEGIN
        BEGIN TRAN;
        INSERT INTO DummyTable VALUES (@counter);
        SET @counter = @counter + 1;
        COMMIT WITH (DELAYED_DURABILITY = ON);
    END
    SELECT DATEDIFF(SECOND, @start, GETDATE()) AS [DelayedDurability_Insert in sec];
END
GO

Task 4: Execute both stored procedures and note down the results.

-- Now Execute both the SPs and check the time taken.
EXEC Simple_Insert
GO
EXEC DelayedDurability_Insert
GO

Result of SPs

Task 5: Clean up

-- Clean up
USE MASTER
GO
DROP DATABASE [DelayedDurability]
GO

Analysis of Result:

The stored procedure that ran with delayed durability ON performed better than the one with delayed durability OFF. This is because the procedure generates a large amount of transaction log, and with full durability the session has to wait for each log write to be hardened to disk before executing the next set of instructions. With delayed durability ON, the query keeps executing while the log records are flushed asynchronously in batches. This improves performance, but if an unexpected shutdown happens for any reason, data loss can occur.
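
If you want to observe the difference in log write activity yourself, one hedged way is to snapshot the log file's I/O counters from sys.dm_io_virtual_file_stats before and after each procedure run (file_id = 2 is assumed to be the transaction log file, which is the usual default):

-- Hedged sketch: compare log write counters before and after each run
-- (file_id = 2 is assumed to be the log file)
SELECT num_of_writes, num_of_bytes_written
FROM sys.dm_io_virtual_file_stats(DB_ID('DelayedDurability'), 2);
GO
EXEC Simple_Insert;
GO
SELECT num_of_writes, num_of_bytes_written
FROM sys.dm_io_virtual_file_stats(DB_ID('DelayedDurability'), 2);
GO

With delayed durability ON you would typically expect fewer, larger log writes for the same number of inserts, since the log records are flushed in batches.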

You can learn more about the new features of SQL Server 2014 in my latest Pluralsight Course over here.

Note: Please do not enable delayed durability just because you think it will improve performance. I suggest you read my two blog posts linked over here and here before implementing it.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – How to Setup Delayed Durability for SQL Server 2014?

Yesterday we discussed the Basics of Delayed Durability in SQL Server 2014. Today we will cover one more topic on this subject – how to set up Delayed Durability in SQL Server 2014.

There are three methods to set up Delayed Durability in SQL Server. Let us see each of them in detail.

Method 1: Database Level

You can allow, disable, or force delayed durability at the database level. Here is how you can do it.

USE [master]
GO
-- Enable Delayed Durability for the database
ALTER DATABASE [AdventureWorks2014] SET DELAYED_DURABILITY = ALLOWED
GO

If you want the change to be applied immediately, you can additionally specify the WITH NO_WAIT option; if the change cannot complete immediately, the statement fails instead of waiting.

USE [master]
GO
-- Enable Delayed Durability for the database
ALTER DATABASE [AdventureWorks2014] SET DELAYED_DURABILITY = ALLOWED WITH NO_WAIT
GO

Currently there are three different options for SET DELAYED_DURABILITY.

  • Disabled: This is the default setting. Every transaction is fully durable, regardless of the transaction-level setting.
  • Allowed: This option lets each transaction decide its own durability. Once this is set, a transaction's durability is determined by the transaction-level setting, which we will see later in this post.
  • Forced: This option forces every transaction to use delayed durability.

There is one more thing we need to understand before we continue. Setting DELAYED_DURABILITY = ALLOWED does not mean that every transaction will be delayed durable. ALLOWED simply enables the database to accept transactions that request delayed durability at commit time. If you want every transaction to use delayed durability, you have to execute the following statement.

USE [master]
GO
-- Force Delayed Durability for the database
ALTER DATABASE [AdventureWorks2014] SET DELAYED_DURABILITY = FORCED
GO

You can disable delayed durability by executing the following statement.

USE [master]
GO
-- Disable Delayed Durability for the database
ALTER DATABASE [AdventureWorks2014] SET DELAYED_DURABILITY = DISABLED
GO

You can also change these values from SSMS as displayed in the image below.
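
If you prefer T-SQL over SSMS, a quick way to verify the current database-level setting is to query sys.databases, which in SQL Server 2014 exposes the delayed_durability and delayed_durability_desc columns:

-- Check the current delayed durability setting of the database
SELECT name, delayed_durability, delayed_durability_desc
FROM sys.databases
WHERE name = 'AdventureWorks2014';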

Method 2: Transaction Level

Now that we have enabled delayed durability at the database level (ALLOWED), we can use the transaction-level setting. Remember, if the database-level setting is DISABLED, specifying transaction-level durability has no effect. You can specify the durability of a transaction on the COMMIT statement as follows.

COMMIT TRANSACTION nameoftransaction WITH (DELAYED_DURABILITY = ON);
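
For example, a complete transaction using this clause might look like the following sketch. The table and column names are hypothetical, and the clause only takes effect when the database-level setting is ALLOWED (or is overridden entirely by FORCED).

-- Hedged sketch: transaction-level delayed durability (object names are hypothetical)
BEGIN TRANSACTION UpdateSalary;
UPDATE dbo.Employee SET Salary = Salary * 1.10 WHERE EmployeeID = 42;
COMMIT TRANSACTION UpdateSalary WITH (DELAYED_DURABILITY = ON);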

Method 3: Natively Compiled Stored Procedure

You can use a similar setting for natively compiled stored procedures as well. Here is an example of the syntax.

CREATE PROCEDURE <procedureName> …
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS BEGIN ATOMIC WITH
(
    DELAYED_DURABILITY = ON,
    ...
)
END
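
A fuller sketch might look like the following. It assumes a memory-optimized table dbo.DummyInMemory with an INT column SomeValue already exists (both names are hypothetical), since natively compiled procedures can only reference memory-optimized tables; the ATOMIC block also requires the TRANSACTION ISOLATION LEVEL and LANGUAGE options.

-- Hedged sketch of a natively compiled procedure using delayed durability
-- (dbo.DummyInMemory is a hypothetical memory-optimized table)
CREATE PROCEDURE dbo.InsertDummyInMemory
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS BEGIN ATOMIC WITH
(
    DELAYED_DURABILITY = ON,
    TRANSACTION ISOLATION LEVEL = SNAPSHOT,
    LANGUAGE = N'us_english'
)
    INSERT INTO dbo.DummyInMemory (SomeValue) VALUES (1);
END
GO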

Well, that’s it for today. In upcoming blog posts we will see how delayed durability impacts performance.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Basics of Delayed Durability in SQL Server 2014

SQL Server 2014 has introduced many new features, and one related to performance is Delayed Transaction Durability. This concept is indeed very interesting. To explain this feature in detail, we first need to understand what Full Transaction Durability is. The default in SQL Server is Full Transaction Durability. A common question I often receive is why we would move away from it to delayed durability. The answer is – prioritizing performance over durability.

I try to stay as brief as possible when explaining concepts here, so I have written this blog post in a question-and-answer format to make it easy to understand.

What is Full Transaction Durability?
With Full Transaction Durability, transactions write their log records to disk before returning control to the client/user. This means your data is safe on disk before the commit is acknowledged. SQL Server uses write-ahead logging (WAL) for this default behavior of Full Transaction Durability.

What is Delayed Transaction Durability?
With Delayed Transaction Durability, control is returned to the client/user before the transaction log records are written to disk; the log records are written asynchronously and in batches. This means the transaction log records are kept in an in-memory buffer and written to disk either when the buffer fills up or when a log flush happens.

When to use Full Transaction Durability?
You should use Full Transaction Durability when you cannot afford any data loss.

When to use Delayed Transaction Durability?
You should use Delayed Transaction Durability when performance is a higher priority than the risk of some data loss.

What particular sign should I observe in my SQL Server that suggests implementing Delayed Durability will improve performance?
Well, if your performance issue is due to latency in transaction log writes – in other words, if the bottleneck is on transaction log writes – then delayed transaction durability can help.
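
One hedged way to check for such a bottleneck is to look at the cumulative WRITELOG waits in sys.dm_os_wait_stats; consistently high wait times for this wait type mean sessions are spending time waiting for transaction log flushes.

-- Check cumulative waits on transaction log flushes since the last restart
SELECT wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type = 'WRITELOG';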

When are transactions written to disk in the case of delayed transaction durability?
There are three scenarios in which the in-memory transaction log buffer is flushed to disk (as per MSDN):

  • A fully durable transaction in the same database makes a change in the database and successfully commits.
  • The user executes the system stored procedure sp_flush_log successfully (see the example after this list).
  • The in-memory transaction log buffer fills up and automatically flushes to disk. (Update: One of my SQL expert friends suggests this statement requires amendment; I will update it once I have more details.)
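
As noted in the second bullet, you can flush the buffered log records yourself at any time; this is a sensible step before a planned shutdown or failover if delayed durable transactions may still be sitting in the buffer.

-- Flush all delayed durable transaction log records currently in memory to disk
EXEC sys.sp_flush_log;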

Well, that’s it for today. In upcoming blog posts we will see a working demo for Delayed Transaction Durability.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL Authority News – FalafelCON 2014: 2 days with the Best Developers in the World

It is only two weeks away. On September 20-21, 2014 I will be in San Francisco presenting at an amazing event, Falafel Conference 2014. I love presenting at various forums on various technologies, and I am extremely excited to have been invited to speak at Falafel Conference 2014 in San Francisco, where I will present two technology sessions on SQL Server. If you are into web development, or if you just want to attend a conference with the best speakers in the industry, this may be the right conference for you. What sets this conference apart from others is the technology presented as well as the speakers. Usually you have to attend a very expensive, large-scale event to hear good speakers; at this conference, quite a few industry legends will be presenting on bleeding-edge technology.

Click to Register Here!

The Unsung Hero
Abstract: Slow running queries are the most common problem developers face while working with SQL Server. While it is easy to blame SQL Server for unsatisfactory performance, the issue more often lies with the way queries have been written and how indexes have been set up. The session will focus on ways of identifying problems that slow down SQL Server, and on indexing tricks to fix them. Developers will walk out with scripts and knowledge that can be applied to their servers immediately after the session.

Secrets of SQL Server: Database Worst Practices
Abstract: Chances are you have heard, or even uttered, this expression. This demo-oriented session will show many examples where database professionals were dumbfounded by their own mistakes, and it may even bring back memories of your own early DBA days. The goal of this session is to expose the small details that can be dangerous to the production environment and to SQL Server as a whole, as well as to talk about worst practices and how to avoid them. Shedding light on some of these perils and the tricks to avoid them may even save your current job. After attending this session, developers will only need 60 seconds to improve the performance of the database server in their SharePoint implementation. We will have a quiz during the session to keep the conversation alive. Developers will walk out with scripts and knowledge that can be applied to their servers immediately after the session. Additionally, all attendees will have access to the learning material presented in the session.

Here are few of the reasons why I believe you should attend this conference:

  • Choose from four tracks covering Web, Mobile development and testing, Sitefinity, and Automated Testing, or attend sessions from all four!
  • Learn from the best developers and testers in the business in an intimate setting.
  • Surround yourself with your peers and take the opportunity to network.
  • Learn about the latest platforms and technologies, including Kendo UI, AngularJS, ASP.NET MVC, WebAPI, and more!

Above are the details of the two sessions I am going to present at the Falafel Conference.

Register Now!

I have learned from the Falafel team that they are running out of tickets and will soon close registration. For the next 10 days the registration price is only USD 199. Trust me, you can't get such world-class training and networking at such a low price. Click to Register Here!

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – The Basics of the SSIS Execute SQL Task – Notes from the Field #044

[Note from Pinal]: This is a new episode of the Notes from the Field series. SSIS is an interesting product. There are more hidden features in SSIS than in almost any other product, and many of its visible features are unknown to most people. When I once mentioned to a SQL developer that they can execute SQL queries from SSIS, they looked at me as if they had seen a ghost. They had always thought they could only do the tasks available in the SSIS toolbox. The next time I get a similar question, I am going to forward a link to this blog post. Andy explains in very simple words how one can use the Execute SQL Task in SSIS.


With “SQL Server” included in the name of SQL Server Integration Services (SSIS), it is easy for people to assume SSIS is simply a database tool or accessory. Let there be no mistake, SSIS is a software development platform. To give you a better understanding of how to use SSIS as a development platform, I’ll write a series of blogs that step through how to use Control Flow tasks. This first article will cover the Execute SQL Task.

When developing solutions with SSIS, I use a handful of Control Flow tasks:

  • Execute SQL Task
  • Data Flow Task
  • Script Task
  • Execute Package Task
  • File System Task
  • Execute Process Task

This list is a good approximation of which tasks I use most, too – from most-used to least-used. In this article I provide a basic example of configuring the SSIS Execute SQL Task, shown in Figure 1:


Figure 1: SSIS Execute SQL Task

Three things are required to configure an Execute SQL Task:

  1. Connection Type
  2. Connection Manager
  3. SQL Statement

Connection Type

The default Connection Type is OLE DB, as shown in Figure 2:


Figure 2: Available Connection Type Property values

I configure Execute SQL Tasks to execute SQL statements like truncate a table, update or insert a single row of data, or call a stored procedure. I typically use OLE DB or ADO.NET connection types; but I occasionally use ODBC connection types. (When executing parameterized statements I find ADO.NET offers a cleaner interface. I will cover parameters in another article).

After selecting a connection type, selecting the Connection Manager is next. The Execute SQL Task Editor filters the list of available connection managers, displaying only connection managers of the (connection) type configured in the previous step.

Figure 3 shows how to select a Connection Manager:


Figure 3: Selecting a Connection Manager

The last required step to configuring an SSIS Execute SQL Task is to supply an SQL Statement. There are two properties for this: SQLSourceType and SQLStatement. The SQL Source Type property specifies the source of the SQL Statement as one of the following:

  • Direct Input
  • File Connection
  • Variable

The SQL Statement can be entered manually (direct input). It can be stored in a file (file connection) or the SQL Statement can be stored in an SSIS variable (variable). In most cases you will manually enter the query as shown in Figure 4:


Figure 4: Entering the Query
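
To give a sense of what typically goes into the SQLStatement property as direct input, here are a couple of hedged examples; the table and stored procedure names are hypothetical.

-- Hypothetical examples of statements entered as Direct Input
TRUNCATE TABLE dbo.StagingCustomer;

EXEC dbo.usp_LoadCustomer @BatchId = 1;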

Right-click the task and click “Execute Task” to test the task configuration. Or simply press the F5 key to execute the entire SSIS package, as shown in Figure 5:


Figure 5: Success!

You now know the basics of configuring an SSIS Execute SQL Task. Go code!

:{>

If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Live Plan for Executing Query – SQL in Sixty Seconds #073

Yesterday I wrote a blog post based on my latest Pluralsight course on learning SQL Server 2014, in which I discussed the newly introduced live query plans. Before you execute a query you can see its estimated execution plan, and after you execute it you can see the actual execution plan – but how do you see a query plan while the query is still executing? This is a very interesting question. SQL Server 2014 introduces a new DMV which tracks execution statistics at each operator level while the query is still running. That DMV is sys.dm_exec_query_profiles.
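
Here is a minimal hedged example of querying the DMV from a separate session while a long-running query executes. Note that in SQL Server 2014 the session running the query generally needs SET STATISTICS XML ON (or SET STATISTICS PROFILE ON) enabled for its rows to appear in this DMV; the session_id below is hypothetical.

-- Watch per-operator progress of a running query (run from another session)
SELECT session_id, node_id, physical_operator_name,
       row_count, estimate_row_count
FROM sys.dm_exec_query_profiles
WHERE session_id = 55 -- hypothetical session id of the running query
ORDER BY node_id;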

I hope my earlier blog post clearly explained how the live plan for an executing query works. If not, I suggest you watch the following quick video, where I explain this concept in extremely simple words.

You can download the code used in this video from Simple Demo of New Cardinality Estimation Features of SQL Server 2014.

Action Item

Here are the blog posts I have previously written; you can read them over here:

You can subscribe to my YouTube Channel for frequent updates.

Reference: Pinal Dave (http://blog.sqlauthority.com)