SQL SERVER – Getting to understand SQL Server Activity Monitor in SSMS

This site already has plenty of blog posts around SQL Server Management Studio's Activity Monitor capability. In fact, we have written about a few special-case scenarios on how to use it. In this blog post, I don't want to rehash those, but rather capture how I explained Activity Monitor to one of my DBA friends at a recent community event. I felt it was important to capture those notes here as a reference for future use. I am sure you will also find it useful and share your experience.

[Screenshot: Activity Monitor in SSMS]

So what are some of the lesser used building blocks within Activity Monitor?

Resource Waits

This section provides a summary of what types of waits are accruing at the fastest rate. Each wait type is categorized into a wait category. At each refresh interval the tool saves the current total wait count and time (sys.dm_os_wait_stats) and any pending waits (sys.dm_exec_requests) into a temporary table #am_wait_stats. At the next refresh, the difference in number of waits and amount of wait time between the current and previous snapshot is calculated and displayed in this grid.

The fact that this includes waiting requests allows you to see “in-progress” waits rather than seeing spikes only after the wait completes. If you have a situation where one spid is blocked waiting on a lock for the entire refresh interval, you will see a wait rate of 1000 ms/sec in the Lock wait category. If two spids are blocked, you’ll see a rate of 2000 ms/sec, and so forth.

Note that queries which encounter a lot of waits may not rank high enough to show up in the “Recent Expensive Queries” grid. Since these queries are waiting, they do not consume CPU and may not perform logical reads/writes at a fast enough rate to fall into the Top N queries reported in that grid. In the blocking scenario described above, the query consumes no CPU and performs no reads/writes, so it will almost never show up as an expensive query while it is running.

If you have a short refresh interval (e.g., 1 second) and a wait that is short, the wait time column will show a brief spike in wait time which will drop back to zero on the next refresh.

IMPORTANT: This section shows CPU as a wait, but it only includes SOS_SCHEDULER_YIELD and signal wait time. It does not include consumed CPU like the CPU wait category in the management data warehouse reports.
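If you want to approximate the snapshot-and-diff approach described above on your own, here is a minimal T-SQL sketch. It uses a temporary table of my own (#wait_snapshot, a hypothetical name, not the tool's internal #am_wait_stats) and a fixed 10-second interval:

-- Take a baseline snapshot of the cumulative wait statistics
SELECT wait_type, waiting_tasks_count, wait_time_ms
INTO #wait_snapshot
FROM sys.dm_os_wait_stats;

-- Wait for the "refresh interval" (10 seconds here)
WAITFOR DELAY '00:00:10';

-- Diff the current values against the snapshot to see what accrued during the interval
SELECT w.wait_type,
       w.waiting_tasks_count - s.waiting_tasks_count AS waits_in_interval,
       w.wait_time_ms - s.wait_time_ms AS wait_ms_in_interval
FROM sys.dm_os_wait_stats AS w
INNER JOIN #wait_snapshot AS s ON s.wait_type = w.wait_type
WHERE w.wait_time_ms - s.wait_time_ms > 0
ORDER BY wait_ms_in_interval DESC;

DROP TABLE #wait_snapshot;

Note that this simple version only looks at completed waits from sys.dm_os_wait_stats; Activity Monitor also folds in the in-progress waits from sys.dm_exec_requests, which is why it can show a blocked session before its wait finishes.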

Data File I/O

This section shows which databases and files are responsible for the most physical IO. You can see the databases with the most physical reads or writes (in megabytes per second), and the average response time per IO during the interval. The underlying data for this section comes from sys.dm_io_virtual_file_stats and thus includes both data and log files.

The underlying DMV clears its data when a database is taken offline. Since this section shows IO rates (versus total IO), you should still be able to compare metrics so long as the database isn’t taken offline/online on the same kind of interval as the Activity Monitor refresh interval. With a long refresh interval and auto-close databases, you may see rates that are inaccurate – so be careful with this scenario.
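If you want to look at the same raw numbers outside Activity Monitor, a simple sketch against the underlying function is below. Keep in mind these counters are cumulative since the files were opened, not the per-interval rates the tool computes:

SELECT DB_NAME(vfs.database_id) AS database_name,
       mf.name AS logical_file_name,
       mf.type_desc AS file_type,            -- ROWS (data) or LOG
       vfs.num_of_bytes_read / 1048576.0 AS mb_read,
       vfs.num_of_bytes_written / 1048576.0 AS mb_written,
       CASE WHEN vfs.num_of_reads = 0 THEN 0
            ELSE vfs.io_stall_read_ms / vfs.num_of_reads END AS avg_read_latency_ms,
       CASE WHEN vfs.num_of_writes = 0 THEN 0
            ELSE vfs.io_stall_write_ms / vfs.num_of_writes END AS avg_write_latency_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
INNER JOIN sys.master_files AS mf
    ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id
ORDER BY vfs.io_stall_read_ms + vfs.io_stall_write_ms DESC;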

Recent Expensive Queries

This section of the report shows queries that are currently running or have run in the last four hours which have significant resource consumption. The chart displays the top N (currently N=20) queries by any of the following metrics: CPU consumption rate, physical reads rate, logical writes rate, logical reads rate, and average duration.

This collector takes the delta in values (CPU consumption, reads, writes, …) between the current collection and a previous collection of data X seconds ago and divides this difference by the refresh interval to compute a rate of resource consumption. To find the queries that are using the most CPU, order by CPU ms/sec; to find the queries doing the most IO, order by reads/sec, and so on.

The underlying queries from the DMVs are grouped by query_hash and query_plan_hash (query and plan fingerprints). Grouping on these columns allows Activity Monitor to find similar, non-parameterized queries that collectively consumed significant resources.
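You can get a rough, do-it-yourself equivalent from the plan cache. This sketch reports cumulative totals per query fingerprint rather than the per-interval rates Activity Monitor calculates, but the grouping idea is the same:

SELECT TOP (20)
       qs.query_hash,
       SUM(qs.execution_count) AS executions,
       SUM(qs.total_worker_time) / 1000 AS total_cpu_ms,
       SUM(qs.total_logical_reads) AS total_logical_reads,
       MAX(CONVERT(nvarchar(200), st.text)) AS sample_text   -- first part of one statement in the group
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
GROUP BY qs.query_hash
ORDER BY total_cpu_ms DESC;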

Hope this gives you a fresh start at understanding Activity Monitor. Do let me know your thoughts and how you have used it in your environments.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Save and Send Execution Plan Via Email

Here is an interesting question I received the other day.

Hi Pinal,

I have a very big graphical execution plan and I want to send it to my friend. My first thought was to send him the query to execute on his own machine, but he does not have the same database that I am using. I also do not want to send him my large database.

How can I save the execution plan to my computer and send it to him via email? Additionally, my email supports a maximum of only 20 MB, so please also guide me on what I should do if the size goes over 20 MB.

Thanks!

It is indeed a very interesting question. Here is the simplest way to do what the user is asking in the email.

Here is a quick tutorial.

Step 1: Write your query and click on the Include Actual Execution Plan icon on the toolbar, or use the shortcut CTRL+M, to enable the actual execution plan.

[Screenshot: Include Actual Execution Plan button]

Step 2: Click on Execute Query

[Screenshot: Execute Query]

Step 3: It will execute the query and bring up the Execution Plan tab. Click on the Execution Plan tab.

[Screenshot: Execution Plan tab]

Step 4: Clicking on the Execution Plan tab brings up the graphical query execution plan. Right-click anywhere on the plan to bring up a menu, and in this menu click Save Execution Plan As…

[Screenshot: Save Execution Plan As… menu]

Step 5: Save your file with the extension .sqlplan.

[Screenshot: Saving the file with the .sqlplan extension]

That’s it! You are done. Now you can send the .sqlplan file via email to anyone. When someone opens the file, it opens in SQL Server Management Studio as displayed in the following image. The recipient does not need to have the database on their system; the plan can be viewed directly from the .sqlplan file.

[Screenshot: .sqlplan file opened in SSMS]

You can also open the .sqlplan file in Notepad and you will see the XML details of the plan.

[Screenshot: .sqlplan XML opened in Notepad]
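If you prefer to grab the plan XML with T-SQL rather than through the Save As dialog, here is a sketch against the plan cache. It assumes the query is still cached, and the LIKE filter text is just a placeholder to replace with a fragment of your own query; the query_plan column it returns is the same XML, which can be saved with a .sqlplan extension and opened in SSMS:

SELECT TOP (10)
       st.text AS query_text,
       qp.query_plan AS plan_xml            -- click this XML in SSMS, or save it as a .sqlplan file
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
WHERE st.text LIKE '%YourQueryTextHere%';   -- placeholder: replace with a piece of the query you ran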

Reference : Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – Trick – Running SSMS With Different Windows Account

Recently, in one of my interactions with a DBA at a big financial organization, I was surprised to see the administrator logging into a desktop / server using a different credential. What baffled me was the fact that he kept logging off and back on. I immediately asked, why are you doing this? He said that since he was using Windows authentication with credentials different from his normal ones, he had to create a separate session for DB-related activities. After a bit of research, I found this to be a common practice in a number of organizations.

Many companies provide two different Windows accounts to each administrator. One “regular” account (for example, SQLAuthority\Pinal) is used for general work like email, logging into the laptop, etc., and another “admin” account (for example, SQLAuthority\adm_pinal) is used for administrative-level tasks on the server. The regular account has no access to the SQL Servers, but the admin-level account does. To make security stronger, these companies use Windows authentication to connect to SQL Server.

This makes things a little interesting. To connect to SQL Server from the laptop where the DBA has logged in with the regular account, he/she has to right-click SSMS with the Shift key pressed and then use “Run as different user” to use the admin account.

[Screenshot: “Run as different user” option]

Pressing the Shift key is important; otherwise we would not see that option. After choosing it, a user name and password window pops up. If you look closely, you can see the ssms.exe path that is being used.

[Screenshot: Credentials prompt showing the ssms.exe path]

Another way, which I prefer, is using the runas command. The program runas.exe lets us tell Windows to run a program using a different user’s credentials and network environment instead of the local one. The full details and switches of the runas program can be found in the TechNet article.

I would normally create a shortcut on the desktop. Right-click and choose New > Shortcut as shown below,

[Screenshot: Creating a new shortcut]

and give the location/parameters as below:

C:\Windows\System32\runas.exe /noprofile /env /user:SQLAuthority\adm_Pinal "C:\Program Files (x86)\Microsoft SQL Server\130\Tools\Binn\ManagementStudio\Ssms.exe"

You need to change two things: the user name and the correct path for SSMS.exe.

Once it’s saved, you can double-click it and provide the password at the command prompt. I have given the shortcut the name Admin SSMS. Once the credential is provided, SSMS opens with that Windows account.
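If you want to confirm which Windows account the new SSMS session is actually using, a quick sanity check you can run in that query window is:

SELECT SUSER_SNAME() AS login_name,   -- should show the admin account passed to runas, e.g. SQLAuthority\adm_Pinal
       c.auth_scheme                  -- NTLM or KERBEROS for Windows logins
FROM sys.dm_exec_connections AS c
WHERE c.session_id = @@SPID;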

I must conclude by saying that if you are using SQL Authentication then you do not need to do this because Windows credentials are not passed to SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – SSMS Query Command(s) completed successfully without ANY Results – Part 2

What do you do when you encounter something that you didn’t expect? A lot of times such experiences leave you in a state of shock or bafflement. Curiosity sometimes pushes you off the cliff to understand why certain behaviors occur. I am fortunate that I get into bizarre situations, and the learning they give is immense. No wonder such things happen, because I play around with SQL Server Management Studio options as part of learning. This learning exercise sometimes gets me into trouble, and below is a classic case.

From my laptop, whatever query I run, I just receive a message saying “Command(s) completed successfully” without any result. Here is an example:

[Screenshot: “Command(s) completed successfully” with no results]

A while back, I wrote the below blog post on the same subject.

SQL SERVER – SSMS Query Command(s) completed successfully without ANY Results

The first stop would be to search the blog for possible replies that might help. Later, I realized that there were a few comments where readers mentioned that even after following the steps given in the above blog (the SET PARSEONLY option), they were still seeing the same behavior. Whenever I get into such a situation, I ask folks who might know, and I was asked if I could capture a Profiler trace to reproduce the issue.

Below is what I shared with my friends to walk through the same learning:

[Screenshot: Profiler trace showing SET NOEXEC ON]

In the Profiler trace, we have the answer: the NOEXEC option is set to ON. The next question was to identify how I landed in this situation in the first place. There are two places we can set this option in Management Studio.

Query Level (for current query window in SSMS):

In SQL Server Management Studio, right-click in the query window and choose “Query Options”.

[Screenshot: Query Options in the right-click menu]

There, we can set the option at the query level.

[Screenshot: Query-level execution settings]

SSMS Level (For all query windows):

In SSMS, we can go to “Tools” > “Options” > “Query Execution” > “SQL Server” and choose Advanced as shown below.

[Screenshot: Tools > Options > Query Execution > Advanced]

If either option is checked, we would see the same behavior that I explained at the beginning of the blog. So, if you see such behavior, check the PARSEONLY and NOEXEC SET options in both the query-level and SSMS-level settings.
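If you want to reproduce and then clear the behavior directly from T-SQL (handy when you are not sure which SSMS setting is the culprit), here is a small sketch:

-- Reproduce: with NOEXEC ON the batch is compiled but never executed,
-- so all you see is "Command(s) completed successfully."
SET NOEXEC ON;
GO
SELECT @@VERSION;
GO

-- Clear both options and the query returns results again
SET NOEXEC OFF;
GO
SET PARSEONLY OFF;
GO
SELECT @@VERSION;
GO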

Have you ever run into this behavior? Do let me know. I am sure if you share via the comments, it will get us into an exploratory mode to learn and try more options, in case I missed any.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Increase Characters Displayed in Text Results

Working with various SQL Server options is always a great feeling. Every now and then I go to the Options page to get just the thing I want. When I wrote the blog post SQL SERVER – Getting started and creating JSON using SQL Server 2016, I was expecting interesting conversations about JSON use. On the contrary, guess what – I got an email from a junior explorer of the SQL Server 2016 CTP stating that he was not getting the same output that I had shown in my blog post. I was surprised by this statement.

I asked the sender to send me a screenshot or an example of what he was seeing. Little did I remember the context: this was exactly the output I had originally got as well, and I had changed a setting for the blog post.

Using Query Options

Here are the steps I followed while writing the previous blog post.

Right click in the query editor and select Query Options. Under Results, select Text. Uncheck “Include column headers in the result set” and change the maximum number of characters displayed to 8192.

Then right-click in the editor, click Results To, choose Results to Text, and click OK.

[Screenshot: Query Options – Results to Text settings]

After making the above change, I was able to get up to 8192 characters as part of the output to Text.

This is one of the simplest options, one I thought was easy to find, and yet I had to share it with a couple of readers. Since I was repeating the same thing, I thought it was worth writing up as a blog post for you.

Note: The default maximum number of characters displayed in each column is 256. The maximum allowed value is 8192.

Do let me know if you have ever changed this option in your environments. What value did you use whenever you had to change it? Let me know.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Generating Meaningful Test Data with dbForge Data Generator for SQL Server

Recently I was faced with the necessity of generating meaningful SQL data for testing purposes. During my search for a proper application for the task, I found out that Devart, a recognized developer of database solutions, has released dbForge Data Generator for SQL Server, a GUI tool for fast generation of meaningful SQL data for development and testing purposes. Devart stated that with the tool, I would be able to:

  • Populate SQL Server tables with millions of rows of test data
  • Generate meaningful data that looks just like real data
  • Use 100+ predefined generators with sensible configuration options
  • Customize built-in generators
  • Create user-defined generators
  • Preview generation results
  • Save data population script for future reference
  • Automate data generation with the command-line interface

It sounded great, so I decided to give the tool a quick test. Moreover, Devart offers a 30-day free trial for most of its products, including Data Generator for SQL Server.

The first thing I noticed after running Data Generator was its clear, user-friendly interface. The start page of the tool includes quick access to the main features and allows you to quickly open recent projects, if any.

[Screenshot: dbForge Data Generator start page]

After starting a new data generation, I was prompted to select the connection and database as well as to set the default data generation options. I decided to generate meaningful data for a few columns of the AdventureWorksTest database.

[Screenshot: Connection, database, and default generation options]

On the Options page, I selected to generate 2000 rows, and cleared the Include Null values option.

[Screenshot: Data generation options page]

After that, Data Generator showed the generation document. I selected the following columns of the Person.Contact table for data population:

  • ContactID
  • Title
  • FirstName
  • LastName
  • EmailAddress

Now, the time has come for selecting generators for each column. In dbForge Data Generator for SQL Server, all generators are split into several categories for easier look-up.

[Screenshot: Generator categories]

I was really impressed with the collection of generators the application offers. The tool automatically detects and offers appropriate generators for a given column.  Also, Data Generator for SQL Server includes multiple ways to make generated data more customized. For instance, you can:

  • Set the range of generated values for different SQL data types
  • Define the percentage of NULLs to be generated
  • Set the value distribution mode
  • Immediately preview data that will be generated

Data Generator suggested the following generators:

  • IDs for the ContactID column. Additionally, I selected the Random by timestamp value distribution mode to get more complex numbers.
  • Honorific Prefix (Title) for the Title column
  • First Name (male) for the FirstName column
  • Last Name for the LastName column
  • Email for the EmailAddress column. This generator is based on the following regular expression, \c{3,8}(\d{0,3}|.\c{3,10})@nowhere.com, which can be customized right in the Fill settings section of the Column generation settings. For instance, you can modify the regular expression so that the domain name would be “mailservice.com” instead of “nowhere.com”.

[Screenshot: Column generation settings for the selected generators]

Data Generator also allows you to modify existing generators and create your own, so in the future you won’t need to modify the same generator over and over again. I opened the Email.xml file, which is located at C:\Users\Public\Documents\Devart\dbForge Data Generator for SQL Server\Data Generators, modified the regular expression so that the domain name would consist of a combination of letters, updated the description and name, and saved the file as MyEmail.xml:

[Screenshot: Customized MyEmail.xml generator]

The following screenshot shows the data generation document with the result of my manipulations:

I was quite satisfied with the results shown in the Preview, so I was ready to populate the target database with data. Data Generator offers several ways to do that, namely:

  • Open the data population script in the internal editor
  • Save the SQL script to a file
  • Run the script automatically

[Screenshot: Data population options]

I chose the first option and, in a few seconds, I got the following population script:

[Screenshot: Generated data population script]

To sum up, I would like to say that the tool has done a fantastic job for me. In my opinion, the strongest points of Devart’s Data Generator are:

  1. Simple workflow in a convenient interface. With dbForge Data Generator, the process takes just a couple of minutes and is basically simplified to three simple steps:
  • Selecting the required tables
  • Selecting and customizing a required generator
  • Populating the tables with test data
  2. Impressive collection of generators. The tool includes over 100 generators that allow you to populate tables with realistic-looking test data.
  3. Flexible customization options. The developers have put a lot of effort into making the generated data customizable to suit specific needs.

I strongly recommend giving dbForge Data Generator for SQL Server a try and evaluating its functionality on your own by downloading the 30-day free trial.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Finding Tables Created Last Week – DBA Tip

My conversations with my DBA friends are always interesting. They have unique ways to solve problems, and organizations often take them for granted. Most DBAs want to be in control of their environment, and there is no second thought when it comes to delivery from their side.

In a recent conversation, one of my friends asked me if there was a way to identify the tables that were created in the database in the past week. He had been asked by his security team to produce a report of when objects are created on the production server. They want to make sure no unnecessary objects get created in the database without prior approval.

Though my first thought was to build an audit for this, I decided to send a simple script that can be used to identify the same:

SELECT o.name AS [Object_Name],
       s.name AS [Schema_Name],
       o.type_desc AS [Description],
       o.create_date AS [Creation_Date],
       o.modify_date AS [Modified_Date]
FROM sys.all_objects o
LEFT OUTER JOIN sys.schemas s
    ON o.schema_id = s.schema_id
WHERE o.create_date > (GETDATE() - 7)
   OR o.modify_date > (GETDATE() - 7)

A sample output is shown below:

[Screenshot: Query output showing recently created objects]

As you can see, the query can be added to a SQL Server Agent job and its output can be sent as an email to the DBA or security team automatically. The other option I gave my friend was to build an SSRS report and schedule the report output to be emailed to the security team.
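Since the original ask was specifically about tables, here is a narrowed-down variation of the same script (a sketch that filters to user tables only and lists the newest first):

SELECT s.name AS [Schema_Name],
       t.name AS [Table_Name],
       t.create_date AS [Creation_Date],
       t.modify_date AS [Modified_Date]
FROM sys.tables AS t
INNER JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE t.create_date > DATEADD(DAY, -7, GETDATE())
ORDER BY t.create_date DESC;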

If you are not into such hi-fi stuff of sending reports, there is a much easier way to find this information: we can use the “Schema Changes History” report available inside SQL Server Management Studio.

[Screenshot: Schema Changes History report]

A typical output is shown above. This is similar to the query output we discussed before, but in a built-in, out-of-the-box report. Most of these SQL Server Management Studio tips are simple and are hidden in some shape or form.

Do let me know if you have ever had such requirements. What did you do in that case? I would love to hear about your experience and implementation.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Filter In-Memory OLTP Tables in SSMS

Earlier I had written a blog post, SQL SERVER – Beginning In-Memory OLTP with Sample Example, which covers the basics of working with In-Memory OLTP. Though that post gets you started, one of my colleagues asked me if there was an easier way to identify In-Memory tables when working with SQL Server Management Studio. My friend also said their database had thousands of tables, so it would be impossible to spot any icon difference or anything else that is hard to see.

This statement got me thinking, and I wanted to give him a few options. On exploring SSMS, you will be surprised that all of these exist right in front of our eyes.

T-SQL Way

The simplest solution here is to use the standard catalog view and query for just the In-Memory tables. The below query will output only the In-Memory tables present in the database where the query is run.

SELECT * FROM sys.tables
WHERE is_memory_optimized = 1

Though these age-old methods exist, they still don’t help us much when it comes to visualizing the same in SSMS.

Using Object Explorer

Method 2 is using the Object Explorer filter settings. In SQL Server Management Studio for SQL Server 2014, we get a nice little option to filter based on In-Memory tables. Right-click the Tables node and select Filter Settings. In this dialog we have additional settings that are useful.

[Screenshot: Object Explorer filter settings]

Using Object Explorer Details

The third and simplest way is to use Object Explorer Details (F7) inside SQL Server 2014 SSMS. Right-click on the column header to add the “Memory Optimized” column, which shows True or False values. Sort on this column and we get all the memory-optimized tables at the top.

[Screenshot: Object Explorer Details with the Memory Optimized column]

I thought these were simple yet powerful tips for working with these special tables inside SQL Server 2014. Do let me know if you have any method other than these to filter out memory-optimized tables inside SSMS.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Customization of New Query in SQL Server Management Studio

Recently I was working with one of my friends who handles production server databases as a senior DBA. He told me that recently one of the junior DBAs was fired from his company because he ran a DROP TABLE command on a production server. Everyone was scared, and from that time onwards, every DBA has been asked to run a SELECT @@SERVERNAME command before executing any query on any server. Similar to this situation, many times a DBA wants a set of commands which they use very frequently to be available as soon as SQL Server Management Studio is opened.

Both of the above situations can be handled by the trick explained in this blog.

Whenever we launch SQL Server Management Studio and open a new query, it loads “SQLFile.sql” from disk and shows its text in the New Query window. The location of this file depends on the location of SSMS.exe and the version of SQL Server. My machine has SQL Server 2012 and SQL Server 2014.

Here is the location of SSMS.exe on my 64-bit machine for SQL Server 2012:

E:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio

[Screenshot: SSMS.exe location]

If you are running a 32-bit operating system, then it would be:

<Drive>:\Program Files\Microsoft SQL Server\110\Tools\Binn\ManagementStudio

Here is the mapping of SQL Server versions to their internal version numbers (which is 110 for SQL Server 2012 in the above example).

  • Microsoft SQL Server 2014 – 120
  • Microsoft SQL Server 2012 – 110
  • Microsoft SQL Server 2008 R2 – 100
  • Microsoft SQL Server 2008 – 100

Once you find SSMS.exe, navigate to the SqlWorkbenchProjectItems\Sql folder as shown below.

[Screenshot: SqlWorkbenchProjectItems\Sql folder with SQLFile.sql]

The highlighted file, “SQLFile.sql”, is the file which is loaded in a new query window.

On my machine, I have modified the file and written the following:

/*
select @@version
go
select @@servername
go
Select *
from   sys.dm_exec_requests
where  blocking_session_id <> 0
*/

After modifying it, whenever you open a new query window, it appears as below.

[Screenshot: New query window showing the SQLFile.sql contents]

As we can see, this is exactly the same text which we have written in the SQLFile.sql file.

Be aware: this file is used by all user profiles on the machine. If you delete this file by mistake, you would get the below error whenever a new query window is opened.

[Screenshot: Error when SQLFile.sql is missing]

—————————
Microsoft SQL Server Management Studio
—————————
Cannot find template file for the new query (‘E:\Program Files (x86)\Microsoft SQL Server\120\Tools\Binn\ManagementStudio\SqlWorkbenchProjectItems\SQL\SQLFile.sql’).
—————————
OK
—————————

To fix this, you can create an empty file with the same name in the location given in the error message. The path in the error message will vary based on the installation on your machine. Hope you found this tip on SSMS useful, and do let me know if you will be using the same.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – SSMS Trick – Generating CSV file using Management Studio

In my previous blog I talked about using SQLCMD to generate a comma-separated file for table data.

Some blog readers asked why we should use SQLCMD or BCP to generate the file when the same can be achieved using SQL Server Management Studio itself. That was a fair point, and I liked the simplicity of the solution. A lot of times it is our failure to learn the simple techniques present in the tools we use that keeps us from being even more productive.

Here are the steps to achieve the same. In SQL Server Management Studio, choose Tools from the menu bar and click on Options. Then go to “Query Results” > “SQL Server” > “Results to Text”.

[Screenshot: Options – Results to Text output format]

Once we choose “Comma delimited” as the output format and hit OK, we will get the below kind of output when we execute a query (note that we need to open a new query window so that the settings take effect).

[Screenshot: Comma-delimited query output]

Notice that the first row contains the column names. If we want to avoid that and want ONLY the data, we can customize that in the same place where we modified the separator setting. If we uncheck “Include column headers in the result set”, then we will get only the data rows as output.

We can also choose a custom delimiter. I have set the delimiter as a pipe (|), unchecked the box I mentioned earlier, and below is the output for the same query.

[Screenshot: Pipe-delimited query output without headers]

As we can see, the output is pipe-separated and we have just the data, not the column headings. This output can be saved to a file and imported into some other server using the bcp command, the BULK INSERT command, or the Import/Export Wizard.
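As a quick illustration of the re-import side, here is a BULK INSERT sketch. The table name and file path are hypothetical, and it assumes the pipe-delimited output above was saved to a file and the target table already exists:

BULK INSERT dbo.Contacts              -- hypothetical target table
FROM 'C:\temp\contacts.txt'           -- hypothetical path of the saved pipe-delimited output
WITH (
    FIELDTERMINATOR = '|',            -- matches the custom delimiter chosen in SSMS
    ROWTERMINATOR = '\n',
    FIRSTROW = 1                      -- use 2 if column headers were included in the file
);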


Reference: Pinal Dave (http://blog.sqlauthority.com)