SQL SERVER – Puzzle with MONTH Function – Win USD 50 Amazon Gift Card

It has been a while since we ran a contest. I reached out to the kind team at Embarcadero and they agreed to support the next one. The contest has two steps, and they are very simple. It took me a while to build the contest, but it is a really fun one. I am very confident that once you try it out, you will love it.

Two Giveaways:


(Global) USD 50 Amazon Gift Card to 1 Individual

(India) INR 2500 Flipkart Gift Card to 1 Individual

Contest 1: T-SQL

Run the following T-SQL script and observe the answer:

SELECT MONTH(18/200), MONTH(200/18)

[Screenshot: query result showing 1 for both expressions]

When we look at the result, it displays 1 as the answer to both expressions. The question is: why does the above script return 1 instead of throwing an error?

Contest 2: Download and Install DBArtisan

This is the easy part of the contest. You just have to download and install DBArtisan. You can download DBArtisan from here.

How to Participate?

  • Leave an answer for Contest 1 in the comment section of the blog.
  • Leave a comment with the same email address that you used to download DBArtisan.
  • The contest is open till June 2nd, 2015, Midnight PST.
  • The winners will be announced on June 4th, 2015.
  • There will be two winners: 1) Global 2) India.
  • All correct answers to the contest will be hidden till the winners are announced.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Antivirus Exclusions Best Practices With SQL Server

One of the classics as far as best practices are concerned is to exclude SQL Server files and folders in the antivirus programs running on your server. We all know this statement, and yet I often see DBAs not giving the recommendation any extra care. There are not many blogs that call the exclusions out explicitly.

Although antivirus software is a very important part of security planning, it is important to understand the effect it has on SQL Server files. After a server reboot, if the antivirus software locks a SQL Server file before SQL Server can gain access to it, issues ranging from SQL Server not being able to access those particular files to possible database corruption may occur. In a number of cases I have seen SQL Server refusing to start, or the CPU / IO appearing to stall. As you can see, there are a number of random errors one might get because of not implementing a simple best practice. Therefore, it is recommended that the following files be excluded from all antivirus scans:

  • SQL Server data files (typical extensions are .mdf, .ndf, and .ldf)
  • SQL Server backup files (typical extensions are .bak and .trn)
  • Full-text catalog files
  • Trace files (typical extension is .trc)
  • SQL Audit files
  • SQL query files (typical extension is .sql)
  • Directories holding Analysis Services data and partitions
  • The directory that holds Analysis Services temporary files used during processing
  • Analysis Services backup files
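The data, log, and backup locations vary per instance, so before configuring exclusions it helps to list the actual paths in use. A quick sketch (run it on each instance you are configuring) is to query sys.master_files:

```sql
-- List every data and log file path for all databases on this instance,
-- so the containing folders can be added to the antivirus exclusion list.
SELECT DB_NAME(database_id) AS DatabaseName,
       type_desc,            -- ROWS (data) or LOG
       physical_name         -- full path to the .mdf/.ndf/.ldf file
FROM sys.master_files
ORDER BY DatabaseName, type_desc;
```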

In addition to excluding SQL Server and Analysis Services files, it is recommended to exclude the following list of processes from antivirus scans:

  • SQLServr.exe
  • ReportingServicesService.exe
  • MSMDSrv.exe

For environments where SQL Server is clustered, exclude the C:\Windows\Cluster directory and the Quorum drive.

Though this list is not exhaustive, I am sure there are a few more items I might have missed. This was a humble shot at the topic, and I am sure that as seasoned DBAs you might have a little more to include. Do let me know in the comments section, and I am sure this can be a great source of reference for others searching on this topic too.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL Authority News – SQL Server 2014 Service Pack 1 Available for Download

Last month, Microsoft announced SQL Server 2014 Service Pack 1, but it had issues with the SSIS catalog. Microsoft re-released SQL Server 2014 Service Pack 1 earlier today and it is now available to download. I hope all the issues with the earlier version have been taken care of and that this version is reliable enough to apply to production servers. I still strongly suggest waiting a while before applying the service pack; it is a good idea to see whether the version we are installing is stable and has no new bugs. I always install service packs on my development server before I install them on production servers.

Here are the links to download SQL Server 2014 Service Pack 1:

Reference: Pinal Dave (http://blog.sqlauthority.com)

Interview Question of the Week #020 – What is the Difference Between DISTINCT and GROUP BY?

Question: What is the Difference Between DISTINCT and GROUP BY?

Answer: DISTINCT and GROUP BY usually generate the same query plan, so performance should be the same across both query constructs. GROUP BY should be used when you need to apply aggregate operators to each group. If all you need is to remove duplicates, use DISTINCT. If sub-queries are involved, the execution plans can differ between the two, so in that case check the execution plan before deciding which is faster.

USE AdventureWorks2014
GO
-- Example of DISTINCT:
SELECT DISTINCT JobTitle, Gender
FROM [HumanResources].[Employee]
GO

[Screenshot: query result]

USE AdventureWorks2014
GO
-- Example of GROUP BY:
SELECT JobTitle, Gender
FROM [HumanResources].[Employee]
GROUP BY JobTitle, Gender
GO

[Screenshot: query result]

USE AdventureWorks2014
GO
-- Example of GROUP BY with aggregate function:
SELECT JobTitle, Gender, COUNT(*) EmployeeCount
FROM [HumanResources].[Employee]
GROUP BY JobTitle, Gender
GO

[Screenshot: query result]
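One more practical difference worth mentioning in an interview: GROUP BY can filter on aggregates via HAVING, which DISTINCT cannot do. A small sketch against the same table:

```sql
USE AdventureWorks2014
GO
-- Example of GROUP BY with HAVING (not possible with DISTINCT):
-- only the JobTitle/Gender combinations held by more than one employee.
SELECT JobTitle, Gender, COUNT(*) AS EmployeeCount
FROM [HumanResources].[Employee]
GROUP BY JobTitle, Gender
HAVING COUNT(*) > 1
GO
```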

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Service Pack Error – The User Data directory in the registry is not valid. Verify DefaultData key under the instance hive points to a valid directory

As a good SQL Server DBA, I always keep the SQL Server installations on all my VMs updated with the latest service pack available. During my last installation of Service Pack 2 on SQL Server 2012, I got the error below.

The User Data directory in the registry is not valid. Verify DefaultData key under the instance hive points to a valid directory.

I remembered that I had played with some registry keys, and now I realized that they are important keys. I looked into the logs based on MSDN articles and found the following in the Summary.txt file.

Detailed results:
Feature: Database Engine Services
Status: Failed: see logs for details
Reason for failure: An error occurred during the setup process of the feature.
Next Step: Use the following information to resolve the error, and then try the setup process again.
Component name: SQL Server Database Engine Services Instance Features
Component error code: 0x851A0043
Error description: The User Data directory in the registry is not valid. Verify DefaultData key under the instance hive points to a valid directory.

Then I looked in the Detail.txt file and found the following:

(01) 2015-05-13 09:19:09 SQLEngine: --SqlEngineSetupPrivate: Validating path:E:\MSSQL\Data
(01) 2015-05-13 09:19:09 Slp: Configuration action failed for feature SQL_Engine_Core_Inst during timing Validation and scenario Validation.
(01) 2015-05-13 09:19:09 Slp: The User Data directory in the registry is not valid. Verify DefaultData key under the instance hive points to a valid directory.
(01) 2015-05-13 09:19:09 Slp: The configuration failure category of current exception is ConfigurationValidationFailure
(01) 2015-05-13 09:19:09 Slp: Configuration action failed for feature SQL_Engine_Core_Inst during timing Validation and scenario Validation.
(01) 2015-05-13 09:19:09 Slp: Microsoft.SqlServer.Configuration.SqlEngine.ValidationException: The User Data directory in the registry is not valid. Verify DefaultData key under the instance hive points to a valid directory.
(01) 2015-05-13 09:19:09 Slp: at Microsoft.SqlServer.Configuration.SqlEngine.SqlEngineSetupPrivate.CheckIfDirectoryExistsGeneric(String sqlPath, SqlEngineErrorCodes invalidError)
(01) 2015-05-13 09:19:09 Slp: at Microsoft.SqlServer.Configuration.SqlEngine.SqlEngineSetupPrivate.ValidateDataRegistryKeys(EffectiveProperties properties)
(01) 2015-05-13 09:19:09 Slp: at Microsoft.SqlServer.Configuration.SqlEngine.SqlEngineSetupPrivate.Patch(ConfigActionTiming timing, Dictionary`2 actionData, PublicConfigurationBase spcb)
(01) 2015-05-13 09:19:09 Slp: at Microsoft.SqlServer.Configuration.SqlConfigBase.SlpConfigAction.ExecuteAction(String actionId)
(01) 2015-05-13 09:19:09 Slp: at Microsoft.SqlServer.Configuration.SqlConfigBase.SlpConfigAction.Execute(String actionId, TextWriter errorStream)
(01) 2015-05-13 09:19:09 Slp: The following is an exception stack listing the exceptions in outermost to innermost order
(01) 2015-05-13 09:19:09 Slp: Inner exceptions are being indented
(01) 2015-05-13 09:19:09 Slp:
(01) 2015-05-13 09:19:09 Slp: Exception type: Microsoft.SqlServer.Configuration.SqlEngine.ValidationException
(01) 2015-05-13 09:19:09 Slp: Message:
(01) 2015-05-13 09:19:09 Slp: The User Data directory in the registry is not valid. Verify DefaultData key under the instance hive points to a valid directory.

I searched for the DefaultData key in the registry and found it at the location below:

HKLM\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQLServer

[Screenshot: Registry Editor showing DefaultData pointing to E:\MSSQL\Data]

As we can see, it was pointing to E:\MSSQL\Data, which was not a valid path. Later I played with various other keys and found the possible errors below.

Invalid value in registry and the resulting error message:

  • DefaultData: The User Data directory in the registry is not valid. Verify DefaultData key under the instance hive points to a valid directory.
  • DefaultLog: The User Log directory in the registry is not valid. Verify DefaultLog key under the instance hive points to a valid directory.
  • SQLDataRoot: The Database Engine system data directory in the registry is not valid.

SQLDataRoot is located under HKLM\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL11.MSSQLSERVER\Setup.
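If you would rather check these values from T-SQL than from the Registry Editor, the sys.dm_server_registry DMV (available from SQL Server 2008 R2 SP1 onward) exposes the instance hive; a sketch, with the value names taken from the keys discussed above:

```sql
-- Read the instance's default data/log locations without opening regedit.
-- Verify that each returned path actually exists on disk.
SELECT registry_key, value_name, value_data
FROM sys.dm_server_registry
WHERE value_name IN (N'DefaultData', N'DefaultLog', N'SQLDataRoot');
```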

The cause of all these errors was an invalid path, which I had messed up. Later I searched the Microsoft site and found that someone had already filed a Connect item:

https://connect.microsoft.com/SQLServer/feedback/details/778275/sql-server-2012-sp1-installation-issue

The moral of the story: never play directly with the registry unless you are sure what the keys do. I was able to fix the error by correcting the value in the DefaultData registry key.

Have you encountered such errors and were able to look at the logs?

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – WARNING – SQL Server Has Encountered N Occurrence(s) of I/O Requests Taking Longer Than 15 Seconds

This is one of the most commonly asked questions via blog comments and emails sent across to me. Here is the sample error message which has been reported in SQL Server ERRORLOG.

SQL Server has encountered 201 occurrence(s) of I/O requests taking longer than 15 seconds to complete on file [T:\MSSQL\DATA\tempdb.mdf] in database id tempdb [2].  The OS file handle is 0x0000000000000770.  The offset of the latest long I/O is: 0x0000000008c000

If you are not a SQL expert and don’t know what the ERRORLOG is, no need to worry. Here is the blog post to check the location of the SQL Server ERRORLOG:

SQL SERVER – Where is ERRORLOG? Various Ways to Find its Location

Coming back to the error message: the specific details (number of occurrences, file name, and database) will vary based on the problem. The sample error message is for the tempdb database, which has database ID = 2, and the path of the MDF file is on the T drive. Some DBAs also refer to this as stalled IO. In the sample we see 201 occurrences, but I have seen the value go up to 5 digits as well.

So, what’s the cause of the error message? Is there any problem with SQL Server? Most of the time the answer is NO, because SQL Server reports this warning in the ERRORLOG when it is not able to complete an IO even in 15 seconds. If you have read any blog or book covering SQL Server internals, you might know that SQL Server does asynchronous IO using Win32 APIs like ReadFile, WriteFile, ReadFileScatter and WriteFileGather. Once an IO is posted, the thread returns and does meaningful work rather than waiting for the IO to complete. The threads keep checking the pending IOs and report if any have taken too long.
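To see which files are accumulating IO stalls, the sys.dm_io_virtual_file_stats DMV can be queried; a sketch (the numbers are cumulative since the instance started):

```sql
-- Average stall (ms) per read and write for every database file.
-- High averages here corroborate the 15-second warnings in the ERRORLOG.
SELECT DB_NAME(vfs.database_id) AS DatabaseName,
       mf.physical_name,
       vfs.num_of_reads,
       vfs.io_stall_read_ms  / NULLIF(vfs.num_of_reads, 0)  AS avg_read_stall_ms,
       vfs.num_of_writes,
       vfs.io_stall_write_ms / NULLIF(vfs.num_of_writes, 0) AS avg_write_stall_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
  ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id
ORDER BY avg_write_stall_ms DESC;
```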

Here are the things a DBA should do before going to the hardware team.

1. Note the pattern of the messages. Are they logged at any specific time? Check whether a scheduled maintenance job is running at the same time. Index rebuilds and other IO-intensive tasks perform a lot of IO, and the disk may report slowness during them.

2. Check the Event log (System and Application) for any disk-related error or warning messages.

3. Keep monitoring sysprocesses and check the waits which are happening for SPIDs. If the disk is slow at all times, you should see PAGEIOLATCH, WRITELOG or LOGBUFFER waits with high wait times. Refer to my earlier blogs:

LOGBUFFER Explanation

WRITELOG Explanation

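A lightweight way to check those waits instance-wide (rather than per SPID) is sys.dm_os_wait_stats; a sketch, filtered to the IO-related waits mentioned above:

```sql
-- Cumulative IO-related waits since the instance started (or since the
-- stats were last cleared). High average waits here point at slow storage.
SELECT wait_type, waiting_tasks_count, wait_time_ms,
       wait_time_ms / NULLIF(waiting_tasks_count, 0) AS avg_wait_ms
FROM sys.dm_os_wait_stats
WHERE wait_type LIKE N'PAGEIOLATCH%'
   OR wait_type IN (N'WRITELOG', N'LOGBUFFER')
ORDER BY wait_time_ms DESC;
```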

4. Capture disk-related data using Performance Monitor (Start > Run > perfmon) for an extended amount of time and have a look at the counters. You should look at the storage-related performance counters to evaluate whether the numbers you are seeing are expected and within reasonable bounds.

Avg. Disk Sec/Transfer: This counter is available under the Logical Disk and Physical Disk objects. Its value should generally be less than 0.020 (20 ms). The same counter is available separately for Read and Write, which can be used to find performance differences between reads and writes.

Disk Bytes/sec: SANs generally start at 200-250 MB/s these days (note that the counter value is in bytes, so divide by 1,048,576 to compare in MB/s). The same counter is available separately for read and write.

Disk Transfers/sec: This counter represents the number of read and write operations per second, also known as IOPS. It can be compared against the rated capacity of the storage subsystem. Read and write variants are also available.

5. Make sure the latest Windows patches have been applied to avoid any known issues with Windows itself.

Once the basics have been verified and we have confirmed that we are seeing symptoms of slow response from the disk, the hardware team needs to be engaged for further investigation.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – SSIS and How to Load Binary Large Objects, or Blobs – Notes from the Field #080

[Note from Pinal]: This is a new episode of the Notes from the Field series. Usually I give an introductory note, but this time there is no need for one. This note is from Andy, and as we all know, he is an amazing person when we have to understand the fundamentals. He has written this blog post in such an interesting way that you must read it to understand the very basic fundamentals of SSIS.

[Photo: Andy Leonard]


I still remember my first experience using SSIS to load binary large objects, or blobs. I was doing a gig for a large retailer, moving data for a new in-store application. Part of the data was product images, mostly 3kb-5kb files. During testing, I noticed the load ran much faster if I did not load the images. I wondered why, so I searched the interwebz for an answer.

I did not find an answer. I found lots of comments and posts telling me I could not, in fact, use SSIS (2005) to integrate blob data. At least not the way I was doing it. My response was, “Huh,” as I was already using SSIS to (apparently) do the impossible. I knew right away this represented an opportunity. I learned everything I could about blobs and now, just nine short years later, I’m here to share.

When I searched the interwebz this time, I found an excellent blog post by my friend, John Welch (blog | @john_welch), titled Importing Files Using SSIS. John’s idea is straightforward and very “SSIS-y,” as Kevin Hazzard (blog | @KevinHazzard) says. With John’s permission, I am modeling the example for this post on John’s post.

How Does SQL Server Store Blobs?

Before we dive into a demo project, it’s important to know more about how SQL Server stores blob data. The short answer is: it depends. I can hear you asking, “What does it depend on, Andy?” It depends on the size of the blob data. For larger binary large objects, a pointer to a file location is stored in the row. When the row is read, the pointer points to the file location containing the binary data, and the binary data is streamed to the output.

In this example, we’ll take a look at how we use SSIS to move data from the file system into a SQL Server table. I changed a few things but, again, this example was inspired by John Welch’s post titled Importing Files Using SSIS.

Part 1 – Import Column

The Import Column transformation streams file binaries – the contents of a file – into a binary large object (Blob) “column” in a Data Flow Path. From the Data Flow path, these data can be streamed into a database table Blob field. Let’s demonstrate:

In a default instance of SQL Server 2014, I created a database named ImportPics. Then I created a table named PicFile using this statement:

CREATE TABLE PicFile
(
  ID INT IDENTITY(1,1),
  FilePath VARCHAR(255),
  FileContent IMAGE
)
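A side note: the IMAGE data type used above is deprecated; Microsoft recommends VARBINARY(MAX) for new blob columns. An equivalent definition would be:

```sql
-- Same table shape using the non-deprecated large-object type.
CREATE TABLE PicFile
(
  ID INT IDENTITY(1,1),
  FilePath VARCHAR(255),
  FileContent VARBINARY(MAX)   -- replaces the deprecated IMAGE type
)
```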

Next I created an SSIS 2014 project named ImportPicFiles and renamed Package.dtsx to ImportPicFiles.dtsx. I added a Data Flow Task and created a package parameter named ImportFilesDir to hold the path to a directory filled with Snagit screenshots:

[Screenshot: package parameter ImportFilesDir]

I add a script component as a Source adapter, then configure it to consume the $Package::ImportFilesDir package parameter as a ReadOnlyVariable:

[Screenshot: Script Component configured with $Package::ImportFilesDir as a ReadOnlyVariable]

I add an output column named fileName (DT_STR, 255):

[Screenshot: output column fileName (DT_STR, 255)]

In the Script Editor, I add the following code:

using System;
using System.Data;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
using System.IO;

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    public override void CreateNewOutputRows()
    {
        // Emit one output row per file found in the ImportFilesDir directory.
        DirectoryInfo dir = new DirectoryInfo(Variables.ImportFilesDir.ToString());
        foreach (var file in dir.GetFiles())
        {
            Output0Buffer.AddRow();
            Output0Buffer.fileName = file.FullName;
        }
        Output0Buffer.SetEndOfRowset();
    }
}

(Yes, that’s C#. I’m learning new things. Yay me! :{>)

Next, I add and configure an Import Column transformation. On the Input Columns page, I select the fileName field:

[Screenshot: Input Columns page with fileName selected]

On the Input and Output Properties tab, I expand the Import Column Output node of the treeview, select Output Columns, and click the Add Column button. I name the column “FileContents” and set the DataType property to DT_IMAGE:

[Screenshot: FileContents output column with DataType DT_IMAGE]

This next part is a little tricky. You can learn more about configuring this – and other tricky SSIS transformation properties – here.

Select the LineageID property value for the FileContents column and copy it to the clipboard:
[Screenshot: LineageID property of the FileContents column]

Next, expand the Import Column Input treeview node, then expand Input Columns, and then select the fileName column. In the FileDataColumnID property, paste the value you just copied to the clipboard:

[Screenshot: FileDataColumnID property of the fileName input column]

Add an OLE DB Destination adapter, connect it to the database where you created the dbo.PicFile table earlier, and configure the OLE DB Destination adapter to load dbo.PicFile:

[Screenshot: OLE DB Destination configured to load dbo.PicFile]

A successful test execution of the data flow task will appear as shown:

[Screenshot: successful execution of the data flow task]

The Import Column transformation is a powerful way to load files into database blob columns.

Read SSIS and Blobs, Part 2 to learn even more.

If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – PowerShell Script – When Was SQL Server Last Restarted?

I have always wanted to learn a bit of scripting, and I was always searching for a good reason to learn something new. As I said, this is my learning, and I am open to picking up scripting tricks from you too, so do let me know in the comments how the script below can be optimized. Now, what was my reason to write this script?

Someone asked me, “How can I find out when SQL Server was started?” There are multiple ways to find out, but I took the simple route and said, “Why don’t you check the SQL Server error logs?” I know the SQL Server error logs can get recycled and this data may not be available, but in most cases it gives us some indication.

So the next question was: how can we script this requirement? The script below reads the SQL Server ERRORLOG, finds a specific text, and reports the number of times the string was found. We also show the time of the most recent occurrence.

param
(
    $serverinstance = ".",  # Change this to your instance name
    $stringToSearchFor = "Starting up database 'master'"
)

# Load the SMO assembly
[Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null

# Connect to the server
$server = New-Object -TypeName "Microsoft.SqlServer.Management.Smo.Server" -ArgumentList $serverinstance

$searchCriteria = "Text like '%{0}%'" -f $stringToSearchFor
$SQLErrorLogs = $server.EnumErrorLogs()
$SearchEntry = @()

# Search the current error log plus every archived one
ForEach ($SQLErrorLog in $SQLErrorLogs)
{
    $SearchResult = $server.ReadErrorLog($SQLErrorLog.ArchiveNo).Select($searchCriteria) |
        Select-Object -Property LogDate, Text
    $SearchEntry = $SearchEntry + $SearchResult
}

$MeasureOccurrences = $SearchEntry | Measure-Object -Property LogDate -Minimum -Maximum
$SQLSearchInfo = New-Object psobject -Property @{
    SearchText      = $stringToSearchFor
    Occurrences     = $MeasureOccurrences.Count
    MinDateOccurred = $MeasureOccurrences.Minimum
    MaxDateOccurred = $MeasureOccurrences.Maximum
}
Write-Output $SQLSearchInfo | Format-Table -AutoSize
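As an aside, if all you need is the last start time itself (rather than every occurrence in the logs), a couple of T-SQL one-liners work too; a quick sketch:

```sql
-- sqlserver_start_time is available from SQL Server 2008 onward.
SELECT sqlserver_start_time FROM sys.dm_os_sys_info;

-- Older trick: tempdb is recreated at every startup,
-- so its create_date is effectively the last restart time.
SELECT create_date FROM sys.databases WHERE name = N'tempdb';
```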

Do let me know if you have ever used such scripting capabilities to search your error logs. You can change the search string and the default instance to match your environment. I wrote this generically because I was planning to use it to find error messages inside the ERRORLOG such as “I/O requests taking longer than 15 seconds to complete”. The script can be modified to your needs. Do share your scripts too in the comments so that others can benefit.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Identify Page Splits Using Extended Events in SQL Server

Digging through some of my older posts reveals I have already written a blog about page splits in SQL Server. This was no surprise, because I am in constant need to reinvent myself as I write blog posts. Now you might ask why I was searching. Well, when I was talking about a performance tuning topic at a conference, I described the harmful effects of page splits, and one of the attendees asked: “How can we track page splits inside SQL Server?”

The easiest way to track whether page splits are happening in SQL Server is to use the PerfMon counters; for the record, you can start with “SQL Server: Access Methods -> Page Splits/sec”. This counter is quite handy to understand whether this behavior happens on our SQL Server instance. Then came an interesting follow-up question: “Is there a way to know about each individual page split that happens in the system?”

This question got me thinking, and I wanted to find out how this could be done. To my surprise, I found the capability was already available to us all along with Extended Events. So here is what I did.

TSQL Script

Let us first create a database for our experiment. We will also create our Extended Events session to track the page splits.

-- Create a dummy database
CREATE DATABASE PageSplits
GO
-- Get the DB ID, we will use it in next step
SELECT DB_ID('PageSplits')
GO
-- Create Session
CREATE EVENT SESSION [Catch_PageSplit] ON SERVER
ADD EVENT sqlserver.page_split(
WHERE ([package0].[equal_uint64]([database_id],(10))))  -- NOTE: Please make sure you change the DATABASE_ID
ADD TARGET package0.ring_buffer
WITH (MAX_MEMORY=4096 KB,EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS,
MAX_DISPATCH_LATENCY=30 SECONDS,MAX_EVENT_SIZE=0 KB,
MEMORY_PARTITION_MODE=NONE,TRACK_CAUSALITY=OFF)
GO
-- Start the event session
ALTER EVENT SESSION [Catch_PageSplit] ON SERVER
STATE = START;
GO

Before we get to the next step, let us start monitoring our Extended Events for any possible page splits.

USE PageSplits
GO
CREATE TABLE mytest (
myCompany CHAR(5) NOT NULL CONSTRAINT pk_mytest PRIMARY KEY CLUSTERED,
FillData VARCHAR(3000) NOT NULL
)
GO
INSERT mytest ( myCompany, FillData )
VALUES ( '00001', REPLICATE( 'A', 3000 ) ),
       ( '00002', REPLICATE( 'B', 1000 ) ),
       ( '00003', REPLICATE( 'C', 3000 ) ),
       ( '00004', REPLICATE( 'A', 3000 ) ),
       ( '00005', REPLICATE( 'B', 1000 ) ),
       ( '00006', REPLICATE( 'C', 3000 ) ),
       ( '00007', REPLICATE( 'A', 3000 ) ),
       ( '00008', REPLICATE( 'B', 1000 ) ),
       ( '00009', REPLICATE( 'C', 3000 ) )
GO

The step above created a table with some initial rows; next we will use it to create a page split scenario.

[Screenshot: Live Data viewer, no events captured yet]

Let us create the page split scenario by updating a row with some extra data.

-- Update to introduce a page split
UPDATE mytest
SET FillData = REPLICATE( 'B', 3000)
WHERE myCompany = '00002'
GO

Don’t forget to look at the Live Data feed for entries. If a page split happens, you will see something like this:

[Screenshot: page_split event in the Live Data viewer]

As you can see, this page split happened because of an UPDATE. The two pages in question are also shown, as is the ID of the database where the page split occurred.
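If you prefer T-SQL over the Live Data viewer, the ring buffer target can also be read directly; a sketch that returns the captured events as XML (which can then be shredded further with XQuery):

```sql
-- Pull captured page_split events out of the session's ring buffer target.
SELECT CAST(t.target_data AS XML) AS captured_events
FROM sys.dm_xe_sessions AS s
JOIN sys.dm_xe_session_targets AS t
  ON t.event_session_address = s.address
WHERE s.name = N'Catch_PageSplit'
  AND t.target_name = N'ring_buffer';
```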

I personally thought this was a wonderful capability hidden among the tons of features of Extended Events. It is awesome to learn this kind of fine print.

Now that we learnt something new here, let us clean up the database we just created.

-- Clean up time
USE MASTER
GO
DROP DATABASE PageSplits
GO
DROP EVENT SESSION [Catch_PageSplit] ON SERVER
GO

Hope you found it interesting and do let me know how you were able to use Extended Events to learn something new inside SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Introduction to Database Engine Tuning Advisor (a.k.a. DTA)

While talking to one of the attendees after my session at GIDS, I realized that I don’t have much information on my blog about the usage of a fantastic performance tuning tool called the Database Engine Tuning Advisor. It is also called the Database Tuning Advisor, or DTA for short.

The good thing about this tool is that it is part of the product itself. Once the SQL Server client tools are installed, DTA is installed along with them. The tool can suggest index and statistics recommendations for a query given as input; you need not be an expert in query optimization to use it.

There are various ways to provide the workload input. For a quick demonstration, I will use the sample database AdventureWorksDW2012 and tune a query. If you are not aware of this sample database, you can read the information provided in my earlier blog:

SQL Authority News – Download and Install Adventure Works 2014 Sample Databases

Once AdventureWorksDW2012 is restored, let’s assume that we need to tune the query below.

SELECT [ProductKey]
,[DateKey]
,[MovementDate]
,[UnitCost]
,[UnitsIn]
,[UnitsOut]
,[UnitsBalance]
FROM [AdventureWorksDW2012].[dbo].[FactProductInventory]
WHERE [MovementDate] =  '2005-07-06'

 

There are multiple ways to provide input to the DTA tool. The simplest is to right-click in a Management Studio query window and choose “Analyze Query in Database Engine Tuning Advisor”, as shown below.

[Screenshot: “Analyze Query in Database Engine Tuning Advisor” menu option]

Once that is clicked, the tool would open like below.

[Screenshot: Database Engine Tuning Advisor]

We need to choose the proper database in “Database for workload analysis:” and in “Select databases and tables to tune”. In our example we are using the AdventureWorksDW2012 database, so we will choose it in both places. Once the database is selected, we can use “Start Analysis” under the “Actions” menu, or press the button.

[Screenshot: Start Analysis]

It would take some time to perform the analysis

[Screenshot: analysis in progress]

…and finally provide recommendation(s).

[Screenshot: DTA recommendations]

One interesting piece highlighted in the above image is “Estimated Improvements”. In the example we have selected, we are seeing a possible 99% improvement. The recommendation provided is to create an index.

To apply that improvement, go to the “Actions” menu and choose either “Apply Recommendations” or “Save Recommendations”.

[Screenshot: Apply Recommendations / Save Recommendations menu]

Here is the recommendations file which I have saved and opened in SSMS.

[Screenshot: saved recommendations file opened in SSMS]
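For this workload, the saved recommendation typically takes the shape of a covering index on the filter column; the statement below is illustrative only (DTA generates its own index name and column ordering, so treat this as a sketch rather than DTA’s exact output):

```sql
-- Illustrative shape of a DTA index recommendation for the workload query:
-- a seek on MovementDate, with the selected columns included to cover the query.
CREATE NONCLUSTERED INDEX [_dta_index_FactProductInventory_MovementDate]
ON [dbo].[FactProductInventory] ([MovementDate])
INCLUDE ([ProductKey], [DateKey], [UnitCost], [UnitsIn], [UnitsOut], [UnitsBalance]);
```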

Here is the comparison of the query plans before and after the recommendations.

Plan before Index

[Screenshot: execution plan before the index]

And here is the plan after the index. We can see that the clustered index scan has changed to a non-clustered index seek.

[Screenshot: execution plan after the index]

In future blogs, I will show some more examples and ways to tune a workload using the Database Engine Tuning Advisor.

Reference: Pinal Dave (http://blog.sqlauthority.com)