SQL SERVER 2016 – Comparing Execution Plans

New tools bring new capabilities, and sometimes these are buried so deep inside the product that we can easily miss them. And SQL Server 2016 is no different. After installing the CTP, I was pleasantly surprised to see this addition to help DBAs who are doing performance tuning and query tuning activities.

So what am I talking about? Think of a scenario wherein you are doing some sort of testing in your QA environment. You are troubleshooting a production scenario and have found that the very same scenario works perfectly well in your test environment. Though you took the data from the production database to the test QA server less than a month back, you are still baffled by the question – why this difference?

The next thing everyone looks at is comparing the execution plans between your production and test environments for the same query. Execution plans from a production server can sometimes run to pages and are difficult to compare visually.

SQL Server 2016 – Plan comparison

If you have the execution plan from your production environment, then go ahead and open it in SQL Server 2016 Management Studio and right click to get the following option:

[Screenshot: the Compare Showplan option in the right-click menu]

You can see an interesting addition here. Select the "Compare Showplan" option, and then select the plan that was generated from your test environment. This will bring both execution plans into a side-by-side view as shown below. Look at the purple color on the nodes.

[Screenshot: side-by-side plan comparison with matching nodes highlighted]

This indicates that both nodes are the same but differ in some properties. Now it is up to us to right-click and bring up the Properties tab.

[Screenshot: the Properties window comparing node properties]

Here we can go ahead and compare each of the node properties to find where the plans differ. This is an awesome capability added in the new version.

I am sure you will find this feature of great use in query tuning exercises in your environments. Having said that, one interesting thing I have seen is that if you take a live "Actual Execution Plan" and then try to compare it with a saved .sqlplan file, it comes up with an error as shown below:

[Screenshot: error when comparing a live Actual Execution Plan with a saved .sqlplan file]

Do let me know if you would use this feature, and if you have used it already – what are some of the enhancements you wish this comparison feature had? I would like to learn from you for sure.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Save and Send Execution Plan Via Email

Here is an interesting question I received the other day.

Hi Pinal,

I have a very big graphical execution plan and I want to send it to my friend. My first thought was to send him the query to execute on his own machine, but he does not have the same database which I am using. I also do not want to send my large database to him.

How can I save the execution plan to my computer and send it to him via email? Additionally, my email supports only a maximum of 20 MB, so please also guide me on what I should do if the size goes over 20 MB.


It is indeed a very interesting question. Here is the simplest way to do what the user is asking in the email.

Here is a quick tutorial.

Step 1: Write any query, then click on the icon on the top toolbar or use the shortcut CTRL+M to include the actual execution plan.

[Screenshot: the Include Actual Execution Plan toolbar icon]

Step 2: Click on Execute Query

[Screenshot: the Execute button]

Step 3: It will execute the query and bring up the Execution Plan tab. Click on the Execution Plan tab.

[Screenshot: the Execution Plan tab]

Step 4: Clicking on the Execution Plan tab brings up the graphical query execution plan. Right-click over it to bring up a menu, and in this menu click on Save Execution Plan As…

[Screenshot: the Save Execution Plan As… menu option]

Step 5: Save the file with the extension .sqlplan.

[Screenshot: saving the file with the .sqlplan extension]

That’s it! You are done. Now you can send the .sqlplan file via email to anyone. When someone opens the file, it will open in SQL Server Management Studio as displayed in the following image. The user does not need to have the database on their system; they can view the plan just fine from the .sqlplan file.

[Screenshot: the .sqlplan file opened in SQL Server Management Studio]

You can also open the .sqlplan file in Notepad, and you will see the XML details of the plan.

[Screenshot: the plan XML opened in Notepad]
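
If you prefer to grab the plan XML with T-SQL instead of the SSMS menus, here is a minimal sketch that pulls a cached plan from the plan cache (the LIKE filter is only an example; adjust it to match your query text). Note that the cached plan does not carry runtime statistics the way an actual execution plan does.

-- Find the cached plan XML for a query of interest
SELECT st.text AS query_text,
       qp.query_plan AS plan_xml   -- click the XML link in SSMS, then save it as a .sqlplan file
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE st.text LIKE '%Person.Address%';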

Reference : Pinal Dave (http://blog.SQLAuthority.com)

Interview Question of the Week #039 – What is Included Column Index in SQL Server 2005 and Onwards?

You may find it interesting that many people still do not know what an included column index in SQL Server is. I have seen this personally among the people around me.

Question: What is Included Column Index?


In SQL Server 2005, the functionality of non-clustered indexes was extended by allowing non-key columns to be added to the leaf level of a non-clustered index. By including non-key columns, you can create non-clustered indexes that cover more queries. The Database Engine does not consider non-key columns when calculating the number of index key columns or the index key size, so non-key columns can be included in a non-clustered index to avoid exceeding the current index limitations of a maximum of 16 key columns and a maximum index key size of 900 bytes. Another advantage is that columns with data types that are generally not allowed as index key columns can still be carried in the index as non-key columns.

In the following example the column FileName is varchar(400), which would push the index key size beyond what is allowed. If we still want it in our covering index to gain performance, we can do so by using the keyword INCLUDE.

USE AdventureWorks
CREATE INDEX IX_Document_Title
ON Production.Document (Title, Revision)
INCLUDE (FileName)

Non-key columns can be included only in non-clustered indexes. Columns cannot be defined in both the key column list and the INCLUDE list, and column names cannot be repeated in the INCLUDE list. A non-key column can be dropped from a table only after the index that includes it is dropped first. For an included column index to exist there must be at least one key column, with a maximum of 16 key columns and 1023 included columns.
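
To verify which columns of an index ended up as key columns and which ones are included columns, you can query the catalog views. Here is a minimal sketch, using the index created above:

-- Key vs. included columns of an index
SELECT i.name AS index_name,
       c.name AS column_name,
       ic.key_ordinal,          -- 0 for included columns
       ic.is_included_column
FROM sys.indexes AS i
INNER JOIN sys.index_columns AS ic
       ON ic.object_id = i.object_id AND ic.index_id = i.index_id
INNER JOIN sys.columns AS c
       ON c.object_id = ic.object_id AND c.column_id = ic.column_id
WHERE i.name = 'IX_Document_Title'
ORDER BY ic.key_ordinal, ic.index_column_id;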

Avoid adding unnecessary columns. Adding too many index columns, key or non-key, negatively affects performance: fewer index rows fit on a page, which can increase I/O and reduce cache efficiency; more disk space is required to store the index; and index maintenance may increase the time it takes to perform modifications, inserts, updates, or deletes on the underlying table or indexed view.

Another example to test:
Create the following index on the AdventureWorks database in SQL Server 2005:

USE AdventureWorks
CREATE NONCLUSTERED INDEX IX_Address_PostalCode -- index name is illustrative
ON Person.Address (PostalCode)
INCLUDE (AddressLine1, AddressLine2, City, StateProvinceID)

Test the performance of the following query before and after creating the index. The performance improvement is significant.

SELECT AddressLine1, AddressLine2, City, StateProvinceID, PostalCode
FROM Person.Address
WHERE PostalCode BETWEEN '98000'
AND '99999';
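
One way to quantify the difference is to compare logical reads with STATISTICS IO before and after the index is created. A quick sketch:

SET STATISTICS IO ON;

SELECT AddressLine1, AddressLine2, City, StateProvinceID, PostalCode
FROM Person.Address
WHERE PostalCode BETWEEN '98000' AND '99999';

SET STATISTICS IO OFF;
-- With the included column index in place, the Messages tab should report
-- far fewer logical reads for Person.Address than before.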

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Script: Finding queries without JOIN Predicates

I was in a developer's shoes for more than five years before reinventing myself to do things that made me even better at coding. It is a tough challenge these days being a developer. The life around you and the technology are changing so rapidly that we seldom get time to take a pause from our busy schedule and learn something new. This constant struggle to build new competencies and read the documentation of newly released products is something I do religiously.

This blog could never have come this far if I had not been doing this all along. The more I read and learn, the better the opportunities to share become. Carrying some baggage of legacy from time to time is inevitable, and I see folks in the industry holding on to some practices without even knowing why they do so.

I am writing this script blog because, during a code review at one of the software companies, I saw code that was not adhering to ANSI standards. I quickly jumped on it and asked – why? The developer gave me the usual long answer: no time, release pressures, and lastly – what is wrong, it works, right?

I have respect for such pressures as you build your product, but writing a code block that performs well and is easy to maintain is one of the critical steps. After explaining this, I showed the warning in the execution plan that indicates we have done something that can cause poor performance.

I quickly came up with this script to help the team identify all the queries that were written in this fashion and requested them to change the same at the earliest.

WITH XMLNAMESPACES('http://schemas.microsoft.com/sqlserver/2004/07/showplan' AS p)
SELECT st.text, qp.query_plan      -- SELECT list reconstructed; adjust it to taste
FROM    (
        SELECT TOP 50 *            -- TOP value is an assumption; a derived table needs TOP to use ORDER BY
        FROM    sys.dm_exec_query_stats
        ORDER BY total_worker_time DESC
        ) AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
CROSS APPLY qp.query_plan.nodes( '//p:RelOp/p:Warnings[(@NoJoinPredicate[.="1"])]' ) AS q(n) ;

The warning appears because poorly performing queries of this kind carry a warning that there are operators without join predicates. That means somewhere in your query you have logically written a CROSS JOIN without realizing it. Maybe you are using the old-style join syntax (SELECT * FROM a, b, c), or maybe you really did intend to write this code. Either way, a lot of rows can be produced as a result of this operation. It's best to make sure this was your intent.
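
Here is a quick illustration of the pattern (the table and column names are made up): the first query uses the old-style join syntax with a missing predicate, so every row of one table is matched with every row of the other; the second is the ANSI-style rewrite with an explicit join predicate.

-- Old-style join with a missing predicate: an accidental CROSS JOIN
SELECT o.OrderID, c.CustomerName
FROM dbo.Orders o, dbo.Customers c;

-- ANSI join with an explicit join predicate
SELECT o.OrderID, c.CustomerName
FROM dbo.Orders AS o
INNER JOIN dbo.Customers AS c
        ON c.CustomerID = o.CustomerID;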

The only way out here is to rewrite your query so you are always applying a join predicate. This is a script blog. If you run this in your environment, do you get anything? Find out and keep me posted.

Reference: Pinal Dave (http://blog.sqlauthority.com)

Interview Question of the Week #028 – Few SQL Best Practices

Just last week, I moderated a SQL Server interview and heard a very interesting conversation.

Question – What best practices are you following for SQL Server?

When I heard this question, I thought the candidate would mention many different things, but he was stunned and had nothing much to add to the conversation. That reminded me of one of my older articles, which I believe still adds quite a lot of value to this conversation. Here is the article as an answer to the question.

Answer – 

1. Know your data and business application well.
Familiarize yourself with your data sources; you must be aware of the data volume and distribution in your database.

2. Test your queries with realistic data.
A SQL statement tested with unrealistic data may behave differently when used in production. To ensure rigorous testing, the data distribution in the test environment must also closely resemble that in the production environment.

3. Write identical SQL statements in your applications.
Take full advantage of stored procedures and functions wherever possible. The benefit is a performance gain, as they are precompiled.

4. Use indexes on the tables carefully.
Be sure to create all the necessary indexes on the tables. However, too many of them can degrade performance.

5. Make an indexed path available.
To take advantage of indexes, write your SQL in such a manner that an indexed path is available to it. Using SQL hints is one of the ways to ensure the index is used.

6. Understand the Optimizer.
Understand how the optimizer uses indexes and how it handles the WHERE, ORDER BY and HAVING clauses.

7. Think globally when acting locally.
Any changes you make in the database to tune one SQL statement may affect the performance of other statements used by applications and users.

8. The WHERE clause is crucial.
The following WHERE clauses would not use the index access path even if an index is available.
E.g. Table1Col1 compared to Table1Col2 with an operator such as >, >= or <=; Table1Col1 IS (NOT) NULL; Table1Col1 NOT IN (value1, value2); Table1Col1 != expression; Table1Col1 LIKE '%pattern%'; NOT EXISTS subqueries.

9. Use WHERE instead of HAVING for record filtering.
Avoid using the HAVING clause along with GROUP BY on an indexed column.

10. Specify the leading index columns in WHERE clauses.
For a composite index, the query would use the index as long as the leading column of the index is specified in the WHERE clause.

11. Evaluate index scan vs. full table scan. (Index Only Searches Vs Large Table Scan, Minimize Table Passes)
If you are selecting more than about 15 percent of the rows from a table, a full table scan is usually faster than an index access path. An index is also not used if SQL Server has to perform an implicit data conversion. When the percentage of table rows accessed is 15 percent or less, an index access path usually works better, even though it results in multiple logical reads per row accessed, whereas a full table scan can read all the rows in a block in one logical read.

12. Use ORDER BY for index scan.
The SQL Server optimizer will use an index scan if the ORDER BY clause is on an indexed column. The small example after this list illustrates this point.

13. Minimize table passes.
Usually, reducing the number of table passes in a SQL query results in better performance. Queries with fewer table passes mean faster queries.

14. Join tables in the proper order.
Always perform the most restrictive search first to filter out the maximum number of rows in the early phases of a multiple table join. This way, the optimizer will have to work with fewer rows in the subsequent phases of join, improving performance.

15. Redundancy is good in where condition.
Provide as much information as possible in the WHERE clause. It will help the optimizer to clearly infer conditions.

16. Keep it simple, stupid.
Very complex SQL statements can overwhelm the optimizer; sometimes writing multiple, simpler SQL will yield better performance than a single complex SQL statement.

17. You can reach the same destination in different ways.
Each SQL may use a different access path and may perform differently.

18. Reduce network traffic and increase throughput.
Using T-SQL blocks instead of multiple individual SQL statements can achieve better performance as well as reduce network traffic. Stored procedures are better than ad hoc T-SQL blocks because they are stored in SQL Server and are precompiled.

19. Better Hardware.
Better hardware always helps performance. SCSI drives, a RAID 10 array, multi-processor CPUs and a 64-bit operating system improve performance by a great amount.

20. Avoid Cursors.
Using SQL Server cursors can result in some performance degradation in comparison with set-based SELECT statements. Try to use a correlated subquery or derived tables if you need to perform row-by-row operations.
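
As a small illustration of tips 8 and 12, assume a hypothetical table dbo.Orders with a non-clustered index on OrderDate (the names are only for illustration). Wrapping the indexed column in a function blocks the index access path, a sargable range predicate allows it, and an ORDER BY on the indexed column lets the optimizer return rows in index order without an extra sort.

-- Tip 8: the predicate wraps the indexed column in a function, so the index path cannot be used
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE YEAR(OrderDate) = 2015;

-- Sargable rewrite: the bare column is compared against a range, so the index on OrderDate can be used
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE OrderDate >= '20150101' AND OrderDate < '20160101';

-- Tip 12: ORDER BY on the indexed column can be satisfied by reading the index in order
SELECT OrderID, OrderDate
FROM dbo.Orders
ORDER BY OrderDate;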

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Querying Performance Counters from SQL Server

Troubleshooting SQL Server is something almost every single DBA has to go through at some point. It sounds simple, yet often we are clueless about where to start. Here is a scenario: you have been told that SQL Server is slow or unresponsive. Now you need to troubleshoot and figure out what is happening. Sounds simple, right? What would you do in this situation?

When I put this scenario to one of my DBA friends, the answer was simple. "Pinal, the first thing is to log onto the server and open Task Manager. Next I will look at the basic parameters of how the CPU, memory and IO are doing. Based on what I see at a macro level, we will troubleshoot accordingly."

This seemed to be a simple yet logical way to see things. On further questioning, they did mention using PerfMon counters as an ideal way to capture some of the current problems running inside SQL Server: capture the numbers, and then troubleshoot later with other tools like Extended Events, Profiler, Activity Monitor etc. This conversation interested me big time because I wanted to see how people query performance counters.

The basic approach is to launch Perfmon.exe and look at the various counters. We can also create a Data Collector Set and work with that. I am sure most of you are aware of how to do this.

Let me take an unconventional approach to a simple requirement to query Memory Grants pending on a SQL Server box using other techniques.

SQL to Query Performance Counter

In the query below, we use two methods: one queries the performance counters, and the other goes via a DMV.

-- Get memory grants pending (perfmon counter)
SELECT cntr_value AS memory_grants_pending
FROM sys.dm_os_performance_counters
WHERE counter_name LIKE '%Memory%Grants%Pending%';
-- Get memory grants pending (DMV)
SELECT COUNT(*) AS memory_grants_pending
FROM sys.dm_exec_query_memory_grants
WHERE wait_time_ms IS NOT NULL;

This metric shows the total number of memory grants that are waiting for workspace memory. If this value is consistently above zero, it can be a sign of memory pressure that is negatively impacting query processing. In that case it is worth reevaluating memory allocations and/or the available physical memory in order to relieve memory pressure on the SQL Server instance where this query is being run.
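
If the counter is above zero, a follow-up sketch like the one below shows which sessions are actually waiting for a grant (the SELECT list is just an example; the columns come from sys.dm_exec_query_memory_grants):

-- Sessions currently waiting for a workspace memory grant
SELECT mg.session_id,
       mg.requested_memory_kb,
       mg.granted_memory_kb,    -- NULL while the grant is still pending
       mg.wait_time_ms,
       st.text AS query_text
FROM sys.dm_exec_query_memory_grants AS mg
CROSS APPLY sys.dm_exec_sql_text(mg.sql_handle) AS st
WHERE mg.granted_memory_kb IS NULL;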

If there are memory-related errors, then we are likely to see error 8645 in the SQL Server Error Log. How to troubleshoot SQL Server error 8645 is explained in the KB.

Query Counter Using PowerShell

As we have done in various blog posts, we can also use PowerShell to query counters. A simple script I wrote for this looks like the one below:

Get-Counter -Counter "\SQLServer:Memory Manager\Memory Grants Pending" | Select-Object @{Name = "Memory Grants Pending"; Expression = {$_.CounterSamples.CookedValue}}

Please let me know if you have ever used any of these techniques in your environment to automate querying values from performance counters. How do you use these counter values to proactively mitigate problems in your environments? What are some of the techniques that you have been using?

Reference: Pinal Dave (http://blog.sqlauthority.com)

Interview Question of the Week #021 – Difference Between Index Seek and Index Scan (Table Scan)

Question: What is the difference between Index Seek and Index Scan?

Answer: Index Scan retrieves all the rows from the table. Index Seek retrieves selective rows from the table.

Index Scan:
Since a scan touches every row in the table whether or not it qualifies, the cost is proportional to the total number of rows in the table. Thus, a scan is an efficient strategy if the table is small or if most of the rows qualify for the predicate.

Index Seek:
Since a seek only touches rows that qualify and pages that contain these qualifying rows, the cost is proportional to the number of qualifying rows and pages rather than to the total number of rows in the table.

An index scan is nothing but scanning the data pages from the first page to the last page. If there is an index on a table and the query touches a large amount of data, say more than 50 or 90 percent of the rows, the optimizer will simply scan all the data pages to retrieve the rows. If there is no index, you will see a Table Scan in the execution plan instead.
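
Here is a quick illustration, assuming the AdventureWorks sample database (the exact operators you get always depend on your indexes and statistics). A highly selective predicate on the clustered key typically produces an index seek, while a predicate on a column with no supporting index forces a scan.

-- Selective predicate on the clustered key: expect a Clustered Index Seek
SELECT AddressID, AddressLine1, City
FROM Person.Address
WHERE AddressID = 100;

-- Predicate on a column with no supporting index: expect a Clustered Index Scan
SELECT AddressID, AddressLine1, City
FROM Person.Address
WHERE ModifiedDate >= '20080101';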

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Introduction to Database Engine Tuning Advisor (a.k.a. DTA)

While talking to one of the attendees after my session at GIDS, I realized that I don't have much information available on my blog about the usage of a fantastic performance tuning tool called the Database Engine Tuning Advisor. It is also called the Database Tuning Advisor, or DTA for short.

The good thing about this tool is that it is part of the product itself. Once the SQL Server client tools are installed, DTA is installed along with them. The tool can suggest index and statistics recommendations for a query given as input. You need not be an expert in query optimization to use it.

There are various ways to provide workload input. For a quick demonstration, I will use the sample database AdventureWorksDW2012 and tune a query. If you are not aware of this sample database, you can read the information provided in my earlier blog.

SQL Authority News – Download and Install Adventure Works 2014 Sample Databases

Once AdventureWorksDW2012 is restored, let's assume that we need to tune the query below.

SELECT [ProductKey]
FROM [AdventureWorksDW2012].[dbo].[FactProductInventory]
WHERE [MovementDate] =  '2005-07-06'


There are multiple ways we can provide input to the DTA tool. The simplest way is to use a Management Studio query window, right-click in it, and choose "Analyze Query in Database Engine Tuning Advisor" as shown below.

[Screenshot: the Analyze Query in Database Engine Tuning Advisor menu option]

Once that is clicked, the tool opens as shown below.

[Screenshot: the Database Engine Tuning Advisor window]

We need to choose the proper database in "Database for workload analysis:" and in "Select databases and tables to tune". In our example we are using the AdventureWorksDW2012 database, so we will choose it in both places. Once the database is selected, we can use "Start Analysis" under the "Actions" menu or press the corresponding button.

[Screenshot: Start Analysis under the Actions menu]

It would take some time to perform the analysis

[Screenshot: analysis in progress]

…and finally provide recommendation(s).

[Screenshot: tuning recommendations with Estimated Improvement]

One of the interesting pieces highlighted in the image above is "Estimated Improvement". For the example we have selected, we are seeing a possible improvement of 99%. The recommendation provided is to create an index.

To get that improvement, we can take the recommendations from the "Actions" menu and choose either "Apply Recommendations" or "Save Recommendations".

[Screenshot: Apply Recommendations and Save Recommendations under the Actions menu]

Here is the recommendation file which I saved and then opened in SSMS.

[Screenshot: the saved recommendation script opened in SSMS]
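
For a query like the one above, the saved recommendation is typically a CREATE INDEX statement along these lines (this is only a sketch of the shape of the script; the index name is auto-generated by DTA and the exact definition will vary):

USE [AdventureWorksDW2012]
GO
-- Non-clustered index on the filtered column, covering the selected column
CREATE NONCLUSTERED INDEX [_dta_index_FactProductInventory_MovementDate]   -- name is illustrative
ON [dbo].[FactProductInventory] ([MovementDate])
INCLUDE ([ProductKey])
GO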

Here is the comparison of query plan before and after recommendations.

Plan before Index

[Screenshot: execution plan before the index]

And here is the plan after the index. We can see that the clustered index scan has changed to a non-clustered index seek.

[Screenshot: execution plan after the index]

In future blogs, I will show some more examples and ways to tune a workload by using the Database Engine Tuning Advisor.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Using “High Performance” Power Plan for SQL Server

A lot of times, I have seen DBAs and administrators have their own startup tasks that they perform when rebuilding or bringing a new server online. Some of these practices are powerful and are meant to enhance the performance of the box that you have bought. Recently, I was at a customer location looking at what a typical admin was doing, and this step of his caught my attention. I couldn't stop myself from blogging about it because it is a simple, powerful and yet under-appreciated setting available on the server.

On Windows Server 2008 and above, set the "High Performance" power plan in Control Panel -> Power Options. By default, Windows Server sets the "Balanced" power plan, which enables energy conservation by scaling processor performance based on current CPU utilization. Starting with the Intel Xeon 5500 series and other recent CPUs, the clock is throttled down to save power (processor P-states) and only increases when CPU utilization reaches a certain point. The Minimum and Maximum Processor Performance State parameters are expressed as a percentage of the maximum processor frequency, with a value in the range 0 to 100.

If a server requires ultra-low latency, invariant CPU frequency, or the very highest performance levels, as database servers such as SQL Server often do, it is not helpful for the processors to keep switching to lower-performance states. For that reason, the High Performance power plan caps the minimum processor performance state at 100 percent.

The typical setting looks like this:

[Screenshot: Power Options showing the High Performance plan selected]

At this point, I thought it would be helpful to explain what these plans are for my readers. These are the built-in power plans and their common use case scenarios:

Balanced (default setting)
Description: Highest energy efficiency with minimum performance impact.
Common applicable scenarios: General computing.
Implementation highlights: Matches capacity to demand; energy-saving features balance power and performance.

High Performance
Description: Increases performance at the cost of high energy consumption. Should not be used unless absolutely necessary.
Common applicable scenarios: Low latency; application code sensitive to processor frequency changes.
Implementation highlights: Processors are always locked at the highest performance state.

I hope most of you are aware of these settings and are using them on your servers too. Do let me know some of the settings that you use as part of your daily environment. I am sure there will be a few things I can learn from you too. Do drop a line in the comments for the benefit of all.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – 8 Performance Related Articles on Logical Reads

Earlier this year, I wrote a few articles related to SQL Server performance, and they were published on the Logical Reads site. Here are my personal favorite eight articles on the subject of SQL Server performance. I really enjoyed writing them and I am very confident that you will like them as well. Let me know which one is your favorite article by leaving a note in the comments area.

How SQL Server DMVs Can Be a Savior in SQL Azure, Too

Be it SQL Server or the improved version of SQL Server on the cloud called SQL Azure, when it comes to monitoring, there is nothing as powerful as DMVs (Dynamic Management Views). For a primer, Dynamic Management Views (DMVs) are a set of virtual views which expose the internal states of a SQL Server deployment. […]

Troubleshooting High CPU issues in SQL Server (Part 2)

If you ever ask a SQL Server DBA about the most frequently faced performance issues in SQL Server, they would most likely say blocking or high CPU. A smart DBA would first confirm that the high CPU is caused by SQL Server by looking at Performance Monitor. Once it is confirmed who is consuming the high amount of […]

SQL Server Memory Buffer Pools: Understand the Basics

How many times have you heard this question: "Why is SQL Server consuming so much memory?" If you ask a DBA who knows the functionality of SQL Server, the answer would be, "That is expected behavior! Limit the max server memory of SQL Server if you are concerned." One might then wonder why SQL Server doesn't release […]

SQL Server Performance Tip: In-Memory OLTP

SQL Server 2014 introduced the new database engine enhancement called In-Memory OLTP. This feature uses new data structures which are optimized for in-memory access of a table. People normally call it an in-memory database. In reality, we would like to call it a partially in-memory database because SQL Server allows us to keep a few hot tables in memory and […]

Troubleshooting High CPU Issues in SQL Server (Part 1)

Performance troubleshooting is one of the unique challenges every developer and DBA has to face. One of the major complaints about performance troubleshooting is the utilization of resources. A few organizations resort to putting muscle power into scaling up the system without analyzing the root cause of the performance problem. For example, a typical scenario might involve a SQL Server instance with memory […]

Understanding Recompiling Statements with SQL Server

Recently, I heard a story that was quite compelling. A retired naval officer used to climb to the top of a bridge and wave his hand every day around 8 PM. Passers-by used to think the elderly man had gone nuts and, after a while, ignored his actions. One fine day, a stranger saw this and was […]

Two Options for Query Optimization with SQL Server

Working with SQL Server is always a challenge. As developers try to troubleshoot SQL Server performance problems, the first step they take is to look at the queries. This is the most basic and the most important step for most developers. Developers love these optimization challenges because they can get the maximum visible performance improvements in their […]

3 Reasons You Should Move to Cloud

Cloud is the next buzzword in the industry. If we were to quiz any CIO on the big bets they want to make in the next 6-12 months, using the cloud will surely be one of them. Why has this new concept caught on like a forest fire? Why are organizations exploring […]

Reference: Pinal Dave (http://blog.sqlauthority.com)