SQL SERVER – TechEd India 2013 Sessions and Relevant Pluralsight Courses

I am presenting two SQL Server sessions at TechEd India 2013. You can read about my sessions in this blog post.

Yesterday I presented on the topic SQL Server Performance Troubleshooting: Ancient Problems and Modern Solutions. Today I will be presenting on the subject Indexes – The Unsung Hero. If you are at TechEd India, you must attend my session – we will have fun talking about indexes and performance tuning together. You can read various details about the session over here.

However, if you are not at TechEd India 2013 and still want to know what I am going to cover in the session, here is something you can do. You can watch the following two of my Pluralsight courses, which cover exactly what I am going to talk about at TechEd India 2013.

SQL Server Performance: Introduction to Query Tuning 

SQL Server performance tuning is an art to master – for developers and DBAs alike. This course takes a systematic approach to planning, analyzing, debugging and troubleshooting common query-related performance problems. This includes an introduction to understanding execution plans inside SQL Server.

In this almost four-hour course we cover the following important concepts.

  • Introduction 10:22
  • Execution Plan Basics 45:59
  • Essential Indexing Techniques 20:19
  • Query Design for Performance 50:16
  • Performance Tuning Tools 01:15:14
  • Tips and Tricks 25:53
  • Checklist: Performance Tuning 07:13

The duration of each module is mentioned beside the name of the module.

SQL Server Performance: Indexing Basics

This course teaches you how to master the art of performance tuning SQL Server by better understanding indexes.

In this almost two-hour course we cover the following important concepts.

  • Introduction 02:03
  • Fundamentals of Indexing 22:21
  • Practical Indexing Implementation Techniques 37:25
  • Index Maintenance 16:33
  • Introduction to Columnstore Index 08:06
  • Indexing Practical Performance Tips and Tricks 24:56
  • Checklist: Index and Performance 07:29

The duration of each module is mentioned beside the name of the module. You will have to log in to watch the above courses.

So if you are going to miss my session at TechEd India, you still have a chance to catch up on what I am going to present by watching the Pluralsight courses listed above.

Reference: Pinal Dave (http://blog.sqlauthority.com)


SQL SERVER – Introduction to Showplan Warning

Vinod Kumar M is my very good friend and a renowned SQL Server expert. Vinod has worked extensively with SQL Server since joining the industry over a decade ago. Before joining Microsoft, he was a Microsoft MVP for SQL Server for more than 3 years. He now works with the MTC as a Technology Architect. He is a well-known speaker at all major Microsoft and third party technical conferences. Here is a very interesting blog post he sent on the subject of execution plans. Don’t forget to visit Vinod’s blog and follow him on Twitter.


SQL Server has a cost based optimizer and can guide us at every step. Now assume you wanted to learn about performance tuning and you had a teacher to teach it. It would be much easier to learn, and most importantly we would pick up the concepts and behavior much more quickly. Along the same lines, in this post we are using SQL Server itself as our teacher and guide.

Previous Warnings

In versions prior to SQL Server 2012 we already had a few warnings in the plan that can help us diagnose slow query issues. Primarily there are two types of warnings: query level warnings and operator level warnings. In this blog post we will discuss some of the enhancements available in SQL Server 2012.

A typical example of a query level warning is the missing index warning we are used to seeing inside Management Studio after executing a query.

Operator level warnings are raised on individual operators. When SSMS sees this type of warning, it puts a yellow exclamation mark on the operator icon. The figure below shows this type of warning.

Prior to SQL Server 2012, there were two warnings at the operator level: "No Join Predicates" and "Missing Column Statistics". If you enable SET STATISTICS XML ON, you can get the "Missing Column Statistics" warning as:

<Warnings>
 <ColumnsWithNoStatistics>
  <ColumnReference Database="[missingstats]" Schema="[dbo]"
   Table="[missing_stats]" Column="name" />
 </ColumnsWithNoStatistics>
</Warnings>
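
For reference, here is a minimal sketch that can reproduce this warning; the database and table names mirror the XML above, but the exact script is an assumption on my part and not part of the original post. Disabling auto-created statistics and then filtering on a non-indexed column surfaces the "Missing Column Statistics" warning in the XML plan.

-- Assumed repro script (not from the original post): disable auto-created
-- statistics so that filtering on a non-indexed column raises the
-- "Missing Column Statistics" operator level warning.
CREATE DATABASE missingstats
GO
ALTER DATABASE missingstats SET AUTO_CREATE_STATISTICS OFF
GO
USE missingstats
GO
CREATE TABLE dbo.missing_stats (id INT IDENTITY PRIMARY KEY, name VARCHAR(50))
INSERT INTO dbo.missing_stats (name) VALUES ('Vinod'), ('Pinal')
GO
SET STATISTICS XML ON
GO
-- The predicate on [name] has no supporting statistics, so the XML plan
-- should contain the ColumnsWithNoStatistics warning shown above.
SELECT * FROM dbo.missing_stats WHERE name = 'Vinod'
GO
SET STATISTICS XML OFF
GO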

New SQL Server 2012 Warnings

When we troubleshoot or tune a single query because it is slow, we normally just get the query plan. However, there are several trace events that have been very helpful in determining why queries are slow, in particular the "Hash Warning" and "Sort Warnings" trace events.

In SQL Server 2012, the showplan will produce warnings if a hash join or a sort spills to tempdb because of low memory conditions. In this case, we don’t have to capture trace events just to see if there are any sort or hash warnings for the query. To understand this warning in a typical setup, here is the script. First let us create the objects required for this demo.

USE tempdb
GO
-- Preparing data
CREATE TABLE Plan_Warning
(id INT PRIMARY KEY CLUSTERED,
 name VARCHAR(25),
 website CHAR(50))
GO
SET NOCOUNT ON
GO
-- Pumping in dummy data
BEGIN TRAN
DECLARE @counter INT
SET @counter = 1
WHILE @counter <= 100000
BEGIN
 INSERT INTO Plan_Warning (id, name, website)
 VALUES (@counter, 'Vinod Kumar' + CONVERT(VARCHAR(10), @counter),
  'blogs.ExtremeExperts.com')
 SET @counter = @counter + 1
END
COMMIT TRAN
PRINT 'Load completed ...'
GO

Now execute the next query and look at the execution plan.

SET STATISTICS XML ON
GO
SELECT * FROM Plan_Warning
WHERE id >= 7000
ORDER BY name DESC OPTION (MAXDOP 1)
GO
SET STATISTICS XML OFF
GO

This executes the query and shows us the execution plan as below. Click on the Showplan resultset. Alternatively, we can use Ctrl+M to include the actual execution plan if required.

The execution plan reads like this, and we can see a small warning symbol on the Sort operator. In the warning section we can also see that the spill did happen to tempdb. This is fundamentally because we have queried more than 93k rows and they could not fit in the memory available on our laptop.

If we open the XML plan we can find the node below. This is the same warning shown visually.

<Warnings>
 <SpillToTempDb SpillLevel="1" />
</Warnings>

This can happen for hash spills too. In that case we will see the warning on the Hash Join node, with the same warning about data spilling to tempdb. The SpillLevel for hash joins will be 2 in the XML.

The new warnings are not populated in the legacy execution plan output. In other words, if we use SET STATISTICS PROFILE ON, we won’t get a plan with these details.

Final words

Though SQL Server shows these warnings, we have not yet explained what to do when you get them. Here are a few pointers:

  • Spills happen because we have selected more data in a single query than expected; this is often the case for reporting queries.
  • Try to add an additional WHERE clause to reduce the data read from the big table.
  • When it is not possible to add an additional WHERE clause, we highly recommend tuning the query and monitoring tempdb growth and contention (a quick monitoring sketch follows this list).
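
As a starting point for that monitoring, here is a minimal sketch (an assumption on my part, not part of the original post) that uses the sys.dm_db_file_space_usage DMV to see how much tempdb space is currently reserved for internal objects such as sort and hash spills:

-- Assumed monitoring sketch: how much tempdb space is taken by internal
-- objects (sort/hash spills), user objects and the version store.
SELECT
 SUM(internal_object_reserved_page_count) * 8 / 1024.0 AS internal_objects_mb,
 SUM(user_object_reserved_page_count) * 8 / 1024.0 AS user_objects_mb,
 SUM(version_store_reserved_page_count) * 8 / 1024.0 AS version_store_mb,
 SUM(unallocated_extent_page_count) * 8 / 1024.0 AS free_space_mb
FROM tempdb.sys.dm_db_file_space_usage;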

Hope you had fun learning something that SQL Server taught us. It is critical that we keep exploring and learning, looking for this fine print with every single release of SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Video – Beginning Performance Tuning with SQL Server Execution Plan

Traveling can be the most interesting or the most exhausting experience. However, it is always the most enlightening experience one can have. Before going on a long journey one has to prepare a lot of things: pack the necessary travel gear, clothes and medicines. However, the most essential part of travel is the journey to the destination. There are many variations one may prefer, but the ultimate goal is to have a delightful experience during the journey.

Here is a video which explains how to begin with SQL Server execution plans.

Performance Tuning is a Journey

Performance tuning is just like a long journey. The goal of performance tuning is query execution that is efficient, consumes the fewest resources and returns accurate results. Just as maps are the most essential aspect of a journey, execution plans are essentially maps for SQL Server to reach the resultset. The goal of the execution plan is to find the most efficient path, which translates into the least usage of resources (CPU, memory, IO, etc.).

Execution Plans are like Maps

When online maps were introduced (e.g. Bing, Google, MapQuest, etc.) it was initially not possible to customize them. They gave a single route to reach the destination. As time evolved, it became possible to give the maps various hints, for example ‘via public transport’, ‘walking’, ‘fastest route’, ‘shortest route’, ‘avoid highway’. In some places we can even manually drag the route and adapt it to our needs. The same situation exists with SQL Server execution plans: if we want to tune queries, we need to understand execution plans and their internals. We need to understand the smallest details relating to the execution plan when our destination is optimal queries.

Understanding Execution Plans

The biggest challenge with maps is figuring out the optimal path. In the same way, the most common challenge with execution plans is where to start and which precise route to take. Here is a quick list of frequently asked questions related to execution plans:

  • Should I read execution plans bottom up or top down?
  • Are execution plans read left to right or right to left?
  • What is the relationship between the actual execution plan and the estimated execution plan?
  • When I mouse over an operator I see CPU and IO but not memory. Why?
  • Sometimes I run the same query multiple times and get a different execution plan. Why?
  • How are query execution plans and data cached?
  • I created an optimal index but the query is not using it. What should I change – the query, the index, or should I provide hints?
  • What tools are available to help debug performance problems quickly?
  • Etc…

Honestly, the list is quite big, and it is humanly impossible to put everything into words.

SQL Server Performance: Introduction to Query Tuning

My friend Vinod Kumar and I have created a video learning course on beginning performance tuning for exactly this reason. We have covered a plethora of subjects in the course. Here is a quick list of the modules:

  • Execution Plan Basics
  • Essential Indexing Techniques
  • Query Design for Performance
  • Performance Tuning Tools
  • Tips and Tricks
  • Checklist: Performance Tuning

We believe we have covered a lot in this four-hour course, and we encourage you to go over the video course if you are interested in beginning SQL Server performance tuning and query tuning.

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – A Quick Look at Logging and Ideas around Logging

This blog post is written in response to the T-SQL Tuesday post on Logging. When someone talks about logging, I personally get lots of ideas about it. I see logging as a very generic term. Let me ask you this question before I continue writing about logging.

What is the first thing that comes to your mind when you hear the word “Logging”?

Now ask the same question to the person standing next to you. I am pretty confident that you will get a different answer from different people. I decided to try this activity and asked 5 SQL Server people the same question.

Question: What is the first thing that comes to your mind when you hear the word “Logging”?

Strangely enough, I got a different answer every single time. Let me just list the answers I got from my friends and go over them one by one.

Output Clause

The very first person replied “output clause”. A pretty interesting answer to start with, and I can see exactly what he was thinking. SQL Server 2005 introduced the OUTPUT clause. The OUTPUT clause has access to the inserted and deleted virtual tables, just like triggers. It can be used to return values to the client, and it can be used with INSERT, UPDATE, or DELETE to identify the actual rows affected by these statements.
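
Here is a minimal sketch of the OUTPUT clause in action; the table and column names are hypothetical and only illustrate the idea:

-- Hypothetical example: log the rows actually touched by INSERT and DELETE
-- statements into an audit table variable using the OUTPUT clause.
DECLARE @Orders TABLE (OrderID INT IDENTITY PRIMARY KEY, Amount MONEY);
DECLARE @Audit TABLE (OrderID INT, Amount MONEY, Action VARCHAR(10));

-- The inserted virtual table exposes the new rows.
INSERT INTO @Orders (Amount)
OUTPUT inserted.OrderID, inserted.Amount, 'INSERT' INTO @Audit
VALUES (100.00), (250.00);

-- The deleted virtual table exposes the removed rows.
DELETE FROM @Orders
OUTPUT deleted.OrderID, deleted.Amount, 'DELETE' INTO @Audit
WHERE Amount < 150.00;

SELECT * FROM @Audit;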

Here are some references for Output Clause:

Error Logs

I was expecting someone to mention error logs when the subject is logging. The error log is the first place people look when there is any error, either with the application or with the operating system. I have a policy of checking my servers’ error logs every day. The reason is simple: enough times in my career I have found something in the error logs that I was not expecting. There have been cases when I noticed errors in the error log and fixed them before end users noticed them. Another common practice I always suggest to my DBA friends is that when any error happens, they should find the relevant entries in the error logs and document them. It is quite possible that they will see the same error in the error log again and be able to fix it based on the knowledge base they have created. There are many different kinds of error log files in SQL Server as well – 1) SQL Server Error Log 2) Windows Event Log 3) SQL Server Agent Log 4) SQL Server Profiler Log 5) SQL Server Setup Log, etc.
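
If you want to script this daily check rather than browse the logs in Management Studio, here is a minimal sketch using the well-known sp_readerrorlog procedure (the search term is just an example):

-- Read the current SQL Server error log (log 0, log type 1) and filter for
-- entries that contain the word 'error'.
EXEC sp_readerrorlog 0, 1, N'error';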

Here are some references for Error Logs:

Change Data Capture

I was surprised by this answer. I think more than the answer itself, I was surprised by the person who gave it. I always thought he was an expert in HTML and JavaScript, but I guess one should never make assumptions about others. Indeed, one of the cool logging features is Change Data Capture. Change Data Capture records the INSERTs, UPDATEs and DELETEs applied to SQL Server tables, and makes a record available of what changed, where, and when, in simple relational ‘change tables’ rather than in an esoteric chopped salad of XML. These change tables contain columns that reflect the column structure of the source table you have chosen to track, along with the metadata needed to understand the changes that have been made.
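
Here is a minimal sketch of turning Change Data Capture on; the database and table names are hypothetical:

-- Hypothetical example: enable Change Data Capture for a database and for
-- one table in it (SQL Server Agent must be running to collect the changes).
USE MyDatabase;
GO
EXEC sys.sp_cdc_enable_db;
GO
EXEC sys.sp_cdc_enable_table
 @source_schema = N'dbo',
 @source_name = N'Orders',
 @role_name = NULL; -- no gating role required to read the change data
GO
-- Changes to dbo.Orders now accumulate in the change table cdc.dbo_Orders_CT.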

Here are some references for Change Data Capture:

Dynamic Management View (DMV)

I like this answer. If asked, I would not have come up with DMVs right away, but in the spirit of the original question, I think DMVs do log data. DMVs log, store or record various data and activity on the SQL Server. Dynamic management views return server state information that can be used to monitor the health of a server instance, diagnose problems and tune performance. One can get a plethora of information from DMVs – high availability status, query execution details, SQL Server resource status, etc.
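
As one small illustration (my own sketch, not from the original answer), here is a DMV query that lists the statements that have consumed the most CPU since they were cached:

-- Top 10 cached statements by total CPU time, using execution-related DMVs.
SELECT TOP (10)
 qs.total_worker_time / 1000 AS total_cpu_ms,
 qs.execution_count,
 SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
  ((CASE qs.statement_end_offset
    WHEN -1 THEN DATALENGTH(st.text)
    ELSE qs.statement_end_offset
   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;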

Here are some references for Dynamic Management View (DMV):

Log Files

I almost flipped at this final answer from my friend. This should probably have been the first answer. Yes, indeed, the log file logs SQL Server activities. One can write infinite things about log files. SQL Server uses a log file with the extension .ldf to manage transactions and maintain database integrity. The log file ensures that valid data is written out to the database and the system is in a consistent state. Log files are extremely useful in case of database failures, as with the help of a full backup the database can be brought to the desired state (point in time recovery is also possible). SQL Server databases have three recovery models – 1) Simple, 2) Full and 3) Bulk Logged. Each model uses the .ldf file for performing various activities. It is very important to take backups of the log files (along with full backups), as one never knows when a log backup will come into action and save the day!
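
Here is a minimal sketch of the backup routine this implies; the database name and paths are hypothetical:

-- Hypothetical example: a full backup followed by a transaction log backup,
-- which together allow point in time recovery under the FULL recovery model.
ALTER DATABASE MyDatabase SET RECOVERY FULL;
GO
BACKUP DATABASE MyDatabase TO DISK = N'D:\Backup\MyDatabase_full.bak';
GO
BACKUP LOG MyDatabase TO DISK = N'D:\Backup\MyDatabase_log.trn';
GO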

Can I just say I loved this month’s T-SQL Tuesday question? It really provoked very interesting conversations around me.

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – Using SafePeak to Accelerate Performance of 3rd Party Applications

An exciting solution I found last year (2011) for SQL Server performance acceleration is SafePeak. It is designed specifically to accelerate and tune performance in cases where you have minimal control over the application, like 3rd party line-of-business applications. SafePeak performs automated caching of query and procedure results, returning results from memory at very high speed and reducing the SQL load by a factor of 10. No code changes are needed, and that is what makes it very interesting and appealing!

One of the questions I hear many times concerns performance acceleration of 3rd party applications that are critical to the business, the “line of business” applications. In most cases such critical software remains in place for many years, growing and expanding but also losing its performance edge and affecting the business for the worse instead of for the better. The main problem with fixing and tuning 3rd party apps is that the code is closed, or it is someone else’s code. Those applications wait for an upgrade every few years, while the business is affected for the worse.

The SafePeak idea: it is middleware software that studies the SQL Server schema (tables, views, stored procedures, functions, triggers) to learn about object dependencies, and then creates patterns of similar SQL query and procedure executions while understanding their direct and indirect object dependencies. It then starts actively caching results in RAM, while making sure that update commands (DML, DDL and procedures that perform DML or DDL) affect both the cache and the SQL Server. After downloading it, installation on a standard Windows 2008 R2 virtual machine or server (different from the SQL Server) takes several minutes. Connecting the application is done by pointing the connection string to SafePeak (easy). Configuration is done via their very nice looking GUI and focuses on things SafePeak didn’t understand, which are usually procedures that have some dynamic code inside.

Results can be very exciting: cached query response time is reduced to 0.25 millisecond or so. Yes, I meant millisecond – pretty impressive. So queries that are slow become extremely fast, and queries that are already fast become even faster.

Because no code change is involved, this fits 3rd party applications especially well – plug it in, play with it for a couple of hours of fine-tuning, and see the results.

SafePeak supports SQL 2000, 2005, 2008 and 2008 R2 and can run in a private cloud or the public cloud.

SafePeak provides a 14-day trial with all functions available for a single installation (they also have a SafePeak cluster installation, which I plan to review in the near future). The download is available here: www.safepeak.com/download – good luck, and tell me your stories.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQLAuthority News – Online Session: Practical Tricks and Tips to Speed up Database Queries Today

I am presenting on the topic of performance tuning again today at Virtual Tech Days. This time I am going to talk about lots of practical tips and will focus on what we can do immediately, right after the session is over.

During the session I have two things for you to spot.

  1. How many times do I say the word “performance”?
  2. How many times do I use the phrase “It is interesting to …”?

Let us see if you can tell me the count after the session. Trust me, I am not going to count while presenting, so I will let you come up with the answer to this fun game.

Session Details

Title: Practical Tricks and Tips to Speed up Database Queries
Timing: December 15, 2011 1:45pm – 2:45pm IST
In this session I am going to discuss various performance tuning techniques. Here is the agenda of the session.

  • A Quick Start on Performance
  • Denali T-SQL Enhancements
  • Timeless Best Practices
  • The Superlative Performance

Each of the above topics will be accompanied by a very practical, solid demo. I am sure you will absolutely enjoy the session.

Giveaways

During my session I will ask a simple question. I will give away 5 copies of my SQL Server Interview Questions and Answers book to five random people who answer it correctly (more details in the session). If you already have this book, I still strongly suggest you attend this session, as it will take performance tuning concepts to the next level. I will make sure I autograph the copies and send them your way.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – A Quick Look at Performance – A Quick Look at Configuration

This blog post is written in response to the T-SQL Tuesday post on Tips and Tricks. For me, this is a very interesting subject. I thoroughly enjoy a discussion when it is about performance tuning. I commonly get follow-up questions regarding this subject, but most of them do not include complete information about the environment.

Whenever I get a question which does not have complete information but is obviously requesting my help, my initial reaction is to ask more questions. When I ask for more details, I usually get more questions back rather than the details I asked for. Indeed, this is an endless loop. I prefer to resolve a query or a problem quickly, easily and accurately, so that there is no further confusion or any more problems in the future.

Here is the method I follow: I send the people who request my help a couple of SQL Server scripts and ask them to run these scripts on their system. Once they send me the results, I have a good idea of what the status of their system is.

Here are a couple of scripts which I usually ask them to run on their respective machines and get back to me with the results in Excel.

1) System Configuration

SELECT *
FROM sys.configurations
ORDER BY name OPTION (RECOMPILE);

2) Filename and Paths of Database

SELECT DB_NAME([database_id]) AS [DBName],
 name, physical_name, type_desc, state_desc,
 CONVERT(BIGINT, size / 128.0) AS [SizeinMB]
FROM sys.master_files
ORDER BY DB_NAME([database_id])

3) Capturing Wait Types and Wait Stats Information at Intervals

You can review the script mentioned in my earlier blog post over here.
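
For completeness, here is a minimal sketch of the idea (my own simplified version, not the full script from that post): snapshot sys.dm_os_wait_stats into a table so that two snapshots taken some interval apart can be compared.

-- Simplified sketch: capture a snapshot of wait statistics so that a later
-- snapshot can be diffed against it.
IF OBJECT_ID('dbo.WaitStatsSnapshot') IS NULL
 CREATE TABLE dbo.WaitStatsSnapshot
 (capture_time DATETIME2 NOT NULL DEFAULT SYSDATETIME(),
  wait_type NVARCHAR(60) NOT NULL,
  wait_time_ms BIGINT NOT NULL,
  waiting_tasks_count BIGINT NOT NULL);

INSERT INTO dbo.WaitStatsSnapshot (wait_type, wait_time_ms, waiting_tasks_count)
SELECT wait_type, wait_time_ms, waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_time_ms > 0;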

Honestly, there is a lot of information one needs in order to resolve a problem, but this is how I start and gather all the basic information from the questioner. Once I get these results, I review them and continue to ask more questions, or help right away if I am able to reach the root cause of the issue.

Reference: Pinal Dave (http://blog.SQLAuthority.com)