SQL SERVER – 8 Performance Related Articles on Logical Reads

Earlier this year, I wrote a few articles related to SQL Server performance, and they are published on the Logical Reads site. Here are my eight personal favorites on the subject of SQL Server performance. I really enjoyed writing them, and I am confident you will like them as well. Let me know which one is your favorite by leaving a note in the comment area.


How SQL Server DMVs Can Be a Savior in SQL Azure, Too

Be it SQL Server or its cloud counterpart, SQL Azure, when it comes to monitoring there is nothing as powerful as DMVs (Dynamic Management Views). As a primer, DMVs are a set of virtual views which expose the internal state of a SQL Server deployment. […]
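
As a tiny illustration of the kind of DMV the article leans on, a query along these lines (a sketch only, assuming VIEW SERVER STATE permission) lists the requests currently executing on the instance:

-- Currently executing requests, skipping most system sessions
SELECT session_id, status, command, wait_type, total_elapsed_time
FROM sys.dm_exec_requests
WHERE session_id > 50;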

Troubleshooting High CPU issues in SQL Server (Part 2)

If you ever ask a SQL Server DBA about the most frequently faced performance issues in SQL Server, they would most likely say blocking or high CPU. A smart DBA would first confirm that the high CPU is caused by SQL Server by looking at Performance Monitor. Once it is confirmed who is consuming a high amount of […]

SQL Server Memory Buffer Pools: Understand the Basics

How many times have you heard this question: “Why is SQL Server consuming so much memory?” If you ask a DBA who knows how SQL Server works, the answer would be, “That is expected behavior! Limit the max server memory of SQL Server if you are concerned.” One might then wonder why SQL Server doesn’t release […]
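
For reference, capping memory as that answer suggests is a two-step sp_configure change; a minimal sketch (the 8 GB value is purely illustrative):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 8192;  -- illustrative value only
RECONFIGURE;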

SQL Server Performance Tip: In-Memory OLTP

SQL Server 2014 introduced a new database engine enhancement called In-Memory OLTP. This feature uses new data structures which are optimized for in-memory access of a table. People normally call it an in-memory database. In reality, it is better described as a partially in-memory database, because SQL Server allows us to keep a few hot tables in memory and […]

Troubleshooting High CPU Issues in SQL Server (Part 1)

Performance troubleshooting is one of the unique challenges every developer and DBA has to face. One of the major complaints in performance troubleshooting is the utilization of resources. A few organizations resort to throwing hardware at the system to scale it up without analyzing the root cause of the performance problem. For example, a typical scenario might involve a SQL Server instance with memory […]

Understanding Recompiling Statements with SQL Server

Recently, I heard a story that was quite compelling. A retired naval officer used to climb to the top of a bridge and wave his hand every day around 8 PM. Passers-by used to think the elderly man had gone nuts and, after a while, ignored his actions. One fine day, a stranger saw this and was […]

Two Options for Query Optimization with SQL Server

Working with SQL Server is always a challenge. When developers try to troubleshoot SQL Server performance problems, the first step they take is to look at the queries. This is the most basic and most important step for most developers. Developers love these optimization challenges because they can get the maximum visible performance improvements in their […]

3 Reasons You Should Move to Cloud

Cloud is the next buzzword in the industry. If we were to quiz any CIO on the big bets they want to make in the next 6-12 months, using the cloud would surely be one of them. Why has this new concept caught on like wildfire? Why are organizations exploring […]

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Configuration and Performance of SQL Server is Now Easy to Master

This is one of the toughest topics to address, and I am always looking for new tools that can help me learn and understand SQL Server better. In many cases I fall back on MSDN documentation and official statements from the Microsoft site, because a lot of the R&D has generally been done by them already. Though I read these recommendations, I make it a point to try them out within my limited capacity for testing. I am always of the opinion, “Accept, but validate.” This reinforces our understanding, and we never forget the concepts thereafter.

Coming back to the topic: what tool do you use to check the configuration settings on your SQL Server box? In the past I have talked about the SQL Server Management Studio standard reports and the number of reports that can help. Today, let me tell you about a tool that has helped me learn something new every now and then: the Microsoft SQL Server 2012 Best Practices Analyzer. As I said before, since it comes directly from Microsoft, I use the recommendations and try to learn from the tool. The documentation is crisp and neat too.

As the name suggests, it is a guide to the best practices recommended by Microsoft. This tool can help analyze the configuration settings of your SQL Server box for:

  • Analysis Services
  • Database Engine
  • Integration Services
  • Replication Settings
  • Reporting Services
  • Setup

Let me talk about some of the nice rules available for the Database Engine. These are just a subset of the hundreds of rules available in the tool; I am calling out a few of them here for reference (a short script after the list shows how a couple of these checks can be run by hand):

  • It is a security best practice to detect attacks such as a DDoS attempt on the server by auditing failed logins. Failed login auditing is enabled by default, but the rule checks in case it was disabled by an administrator.
  • I have never been a fan of auto-growth happening on the server. Auto-growth is a safety net, but administrators need to make sure the files are grown ahead of time during a maintenance window. The rule checks whether auto-growth failed for some reason or took longer than expected. One answer to slow growth of data files is enabling Instant File Initialization.
  • Placing data and log files on the same volume is never a great idea. It is recommended to place these on different physical drives.
  • If a database does not have the CHECKSUM page verify option enabled, it will be difficult to identify IO-related inconsistencies. Read more in the KB. The rule makes sure this is enabled at the database level.
  • Having too many VLFs (virtual log files) can be a problem. SQL Server 2012 does write a message to the error log, but this rule does its own check and lets us know.
  • It is a security risk to grant more than read permissions on the BINN folder of a SQL Server installation. We don’t want the binaries to vanish or be corrupted by malicious users. The same is true for the DATA folder where our system databases are placed.
  • If password policies are not enforced on the system, the rule raises an alert. It is a good practice to have password expiration policies enabled on the server.
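
Two of these checks can be approximated by hand; a minimal sketch, assuming sufficient permissions (on SQL Server 2016 SP2 and later, sys.dm_db_log_info is the documented alternative to the undocumented DBCC LOGINFO used here):

-- Page verify setting that the CHECKSUM rule looks at
SELECT name, page_verify_option_desc
FROM sys.databases;
GO
-- Rough VLF count for the current database: DBCC LOGINFO returns one row per VLF
DBCC LOGINFO;
GO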

The list of rules goes on. As I said earlier, these are great learning resources for us. We can read up on each rule and then do our own R&D to learn more.

Monitoring performance using the Database Performance Analyzer

Moving now from standard configuration settings to performance. There is no single starting point when it comes to troubleshooting performance inside any database; when it comes to performance tuning, you need to think differently. With complex systems built on top of databases, you need something built radically differently. SQL Server introduced a concept called wait statistics, and it has been a popular mechanism for getting a macro, high-level understanding of the various waits inside SQL Server. In fact, I have written a book on wait stats based on the 28-day series on wait stats on my blog.
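
If you have never looked at wait stats directly, a minimal starting point (a sketch only, assuming VIEW SERVER STATE permission) is a query along these lines:

-- Top waits by total wait time, excluding a few benign background waits
SELECT TOP (10) wait_type, wait_time_ms, waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ('SLEEP_TASK', 'LAZYWRITER_SLEEP',
                        'REQUEST_FOR_DEADLOCK_SEARCH', 'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;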

From what I have seen of Database Performance Analyzer from SolarWinds (you might also know it by its former name, Confio Ignite), it makes extensive use of these waits. Using waits as an indicator of server health is something unique I saw in this tool. If you are a DBA and want to tune your systems from the inside out, it is worth taking note of this tool. Wait-based reports are embedded almost everywhere. Let me start by looking at the home dashboard. It is crisp and to the point for high-level information such as CPU, memory, disk, and waits. If there is something to be worried about, a warning symbol indicates the areas of concern. We will look at some of them as we explore the tool.

From the main dashboard, if we select the server that we are monitoring, we are presented with the above screen. It is a great way to look at the trends of various statements and the waits happening inside the system. What surprised me is that waits are also tracked for Oracle, Oracle RAC, Sybase, and DB2 servers from the same console. Today’s enterprises are heterogeneous environments, and tools that give a single pane of glass for waits are great to have.

The Trends graph shows the waits and the maximum time taken by a batch of queries. The Advisor pane at the bottom was the icing on the cake: it shows the query that consumed the most CPU or memory on the server. Clicking the “more…” button, we can view the query and take corrective measures to reduce it. Let us look at one such recommendation below:

Here in the details pane, we can see the query that was executed, and possible recommendations for a better plan are presented. In this case, we get information about possible “missing indexes” for the given query. As a DBA, it is almost impossible to go to each query and find out what the missing indexes are. Or worse, in SQL Server we do get a list of missing indexes, but it fails to tell us which queries will benefit. This screen is a classic example of the two coming together. A nifty addition for sure. The bottom pane shows the various waits that affect the query, or we can switch to a day view to see what the top waits are.
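
For reference, the raw list of missing-index suggestions that SQL Server itself keeps can be pulled with a query along these lines (a sketch only; the suggestions still need to be validated before creating anything):

SELECT d.statement AS table_name,
       d.equality_columns, d.inequality_columns, d.included_columns,
       s.user_seeks, s.avg_user_impact
FROM sys.dm_db_missing_index_details AS d
JOIN sys.dm_db_missing_index_groups AS g ON g.index_handle = d.index_handle
JOIN sys.dm_db_missing_index_group_stats AS s ON s.group_handle = g.index_group_handle
ORDER BY s.user_seeks * s.avg_user_impact DESC;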

In the above view, I have taken the historical trend of a typical day to see what is causing the waits inside our systems. The fact that we have some sort of CPU contention is quite evident from the consistently red bar.

Who doesn’t like a little help from the system? Above is a typical example where we see high WRITELOG waits; if a junior DBA is looking at the system, they will not know where to start. Clicking on each of the waits, we are presented with simple documentation on what one can do to mitigate them. Though these recommendations are given, I highly encourage you to consult your senior DBA before taking any steps.

The storage trends were a nice visualization for me. The snapshot shows the top IO consumers and orders them accordingly. Here you can see the tempdb LDF file seems to be under contention. The data shows a sparkline trend of read/write latency.

DPA also shows standard performance counters grouped by resource. This is great baseline information, and for a seasoned DBA it can be an awesome thing to view from a single interface.

These are standard performance counters viewed from the comfort of a web browser, so a DBA can watch what is happening inside the server. One can set thresholds for the various groupings and also configure alerts. In my opinion, these capabilities make a DBA proactive rather than reactive when a problem occurs.

Conclusion

If you are looking for detailed performance analysis but lack the time and/or knowledge to decipher wait stats and server resource metrics, this tool will not disappoint you. It gives you pointers to the essential performance characteristics. Do let me know about your experience working with Database Performance Analyzer; I would like to learn your perspective on this tool.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Year 2014 Reflections by Tim Radney – Notes from the Field #060

[Notes from Pinal]: “New Year’s resolution” is a funny phrase. Everybody makes one on January 1st but breaks it within a few days. I used to make resolutions too, but I stopped the practice a while ago. Tim Radney is a great friend of mine, and I asked him what could replace resolutions for the new year. He immediately said: reflections on the previous year. I really liked how promptly he answered me. If you think about it, it does make sense to look back and see how much we have accomplished in the year. We can build the next year’s plan based on what we have done and what we should have done. I think it is a great way to start a new year. I requested that he give an example of reflecting on the previous year.

On this new year, I would also like to congratulate Tim on joining SQLskills and beginning a new chapter in his career. I will repeat a statement I made to my friend: Tim is not like my brother, he is my brother.


At the beginning of every year, many of us reflect on the previous year and, while doing so, set goals or resolutions for the upcoming year. I am hopeful that all of you reading this can look back on a prosperous year full of good health and knowledge gained.

Being fairly active on social media such as Facebook and Twitter, each year I see the resolutions people make for themselves, ranging from getting healthier, writing a book, reading more books, blogging more, and learning a new skill to finding a better job. Whatever goals or resolutions you may have for yourself, I recommend establishing smaller milestones to help you measure your progress toward reaching them. I have found success by treating these goals as a collection of smaller tasks; that makes them much more manageable.

The year 2014 was very good for me. I made many new friends in the SQL community, was awarded SQL Server MVP by Microsoft, released my third book, spoke at numerous SQL conferences including the PASS Summit, attended the Microsoft MVP Summit, was awarded the D&B MVP, ran a half marathon, and decided to change employers for the first time in 17 years.

None of those things would have been possible without the SQL community and without setting goals for myself. I am one LUCKY guy, but at the same time I work hard and seize the opportunities given to me. I have been lucky and blessed that others have invested their time in me. Others have seen things in me that they could train, encourage, and mentor, and they have helped push me to become stronger in my skills. I encourage you all to do the same for others.

Let 2015 be the year that you find a mentor of your own if you don’t have one, and start mentoring others if you aren’t already. We have a saying among SQL speakers: if you want to learn something really well, get up and speak about it. The same can be said about helping others grow. If you really want to grow in your career, help others grow in theirs as well. The rewards are endless.

I don’t know all that 2015 will hold for me, but I will continue to study, read, and learn as much as possible. As I learn new tips and tricks, I will be blogging and tweeting to share my findings. I encourage you to do the same and help continue to make our SQL community great.

A special thank you to my very good friend Pinal Dave for helping to make 2014 a stellar year for me. I value our friendship a great deal and it was great to finally meet the rest of your family in Seattle.

If you want me to take a look at your server and its settings, or if your server is facing any issues, we can Fix Your SQL Server.

Note: Tim has also written an excellent book on SQL Backup and Recovery, a must have for everyone.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Performance Tuning – Is It Really A Top Skills for a SQL Server Consultant? – Notes from the Field #059

[Note from Pinal]: This is the 59th episode of the Notes from the Field series. I get over 100 emails every day, and out of those 100 nearly 90 are related to performance tuning. The emails come from a variety of IT professionals. There are quite a few application developers (.NET, Java, PHP, etc.) who do application development every day but are not always experts in SQL Server. When I read those emails, I feel that the world indeed needs plenty of SQL Server performance tuning consultants. I put these questions to SQL Server guru Brian Moran. He is a unique blend of a person who knows the consulting business from both a technology and a business perspective.

Today is Christmas Day and we have received a wonderful gift from Brian!

In this episode of the Notes from the Field series, database expert Brian Moran explains in very simple words the reality of the SQL Server consulting world. Read Brian's experience in his own words.


December and January are times of the year when many people reflect on personal and career goals. It would be silly and naïve for me to pretend that I know what’s best for your career, so take this post with a grain of salt. Maybe you want to be a fireman or an astronaut? I don’t have many words of wisdom if that is your hope.

But, I do have a lot of knowledge for data professionals who aspire to be independent consultants.

I focus more on the business side of running my consulting business these days so I’m not as technical as I used to be. But I was a SQL MVP for many years and was a pretty good tuning architect. I’ve been on the PASS Board twice and have been involved in the SQL Server space for almost 25 years. Over the years I’ve been part of the leadership team that built two successful consulting companies and I’ve coached and mentored dozens of technologists over the past two decades as they got their start in consulting.

This post is for folks who love the engine side of being a database pro and want to be an independent consultant. I’m not talking to the BI folks or the myriad of other specialties that exist in #SQLFamily. But do you love the engine? I mean really love working with the core engine? Do you want to be a consultant one day?

World-class performance tuning skills have consistently been on the short list of skills that I’ve seen successful independent SQL Server consultants have over the past 20 years. Are there other skills out there that allow a consultant to have a great career? Of course. You can do well with any specialty that is hard to master and that companies absolutely must have access to in order to be successful. HA/DR, consolidation, virtualization, security, and a wide variety of other topics come to mind. But here is one element of performance tuning that tends to create opportunities for consultants to maximize their rates and have a successful career.

Top performance tuning consultants can have a pretty quick impact on a problem, and companies often need access to that skill on short notice for critical needs. Sometimes tuners can solve a problem in minutes or hours. And long-term performance tuning work is often measured in days or weeks, rather than the weeks or months that some other types of projects require.

Here is some basic math to think about. Let’s say you want to bill $250/hr. Is that expensive? Well, like any SQL question, the answer is: it depends. Let’s look at it this way.

  • Assume you can solve a critical problem for a customer in 20 hours. $250/hr for 20 hours is $5,000.
  • Imagine you are competing against someone who says they can solve the problem for $125/hr.
  • 50% less sounds like a big savings for the customer, right? But it’s really only a net savings of $2,500.

Do you think many companies care about a difference of $2,500 when they desperately need to solve an urgent problem impacting critical business systems? Probably not. They probably care more about picking the person they believe is the best fit and most likely to solve the problem.

But now let’s imagine that you are talking to a potential customer about a project that might take six months of full-time work. That’s about 1,000 billable hours. Let’s make the same assumptions: you are charging $250/hr and someone else is offering $125/hr. Now the 50% rate differential yields a net savings of $125,000.

Will many companies care about saving $125,000 if they can? Yep. They sure do.

I find that many folks never fully think through this side of the consulting equation early in their career. That is, within reason, companies mostly don’t care about the rate for performance tuners as long as you are awesome and can get the job done. Don’t get me wrong: it’s quite possible to charge premium rates for long-term projects, and you can certainly be a successful consultant while focusing on some other aspect of SQL Server. But almost 25 years in the business have taught me that loving performance tuning and wanting to be an independent consultant can be a match made in heaven.

If you want to get started with performance tuning and database security with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Proving that the Source of the Problems aren’t Tied to the Database

I have always wondered how Sherlock Holmes cracked the toughest of mysteries every single time. This fascination has always led me to learn new techniques. The backbone of every successful detective is a logical approach to working the case. While watching movies, I have always seen that a twist in the plot makes the movie interesting and keeps us glued to the seat. If there were no drama, no high-octane action, no obstacles, the movie would be boring. These techniques apply to real life too.

Being an application developer can be demanding. You never know what situation you will get pushed into the next day. Here is one such situation which was just like detective work. Once I was called in for a performance tuning exercise on our application, which had been running healthily for more than a year. The distress call from the customer got me thinking. What could have gone wrong in the application? Why is this happening all of a sudden? Did we release any upgrade or update to the database recently? None of these made sense as I kept thinking about a possible root cause. As soon as I landed at the customer site, I started my interrogation with the DBA.

Pinal: Hello Sir!

DBA: Hi Pinal. Nice to see you here. Thanks for making time.

Pinal: It is a pleasure. I am surprised I had to come at such short notice. Something must be really wrong then.

DBA: Yes. Our users have been complaining about very bad performance in your application for the past two weeks.

Pinal: Was it running fine before that?

DBA: I think it must have been. They hadn’t complained for more than a year; this seems to be a recent phenomenon. I think the application data must have increased considerably to cause this behavior.

Pinal: Hmmm … How is it possible that the application data increased all of a sudden in the past 2 weeks?

DBA: I am not sure. You are the expert. I am just a DBA and not an application developer. Must be something in your code.

Pinal: Can I get a chance to see the server console once?

DBA: Well, you know we are very sensitive about data privacy. I can surely get you access to an SSMS console. That should not be a problem.

Pinal: I can’t see the server even once?

DBA: Wait, let me get you the network / systems admin for this. They must know a way out.

After some time, I got a chance to meet the systems admin and continue the conversation.

Pinal: Hi there!

Admin: Hi. How can I be of help?

Pinal: I was wondering if I could get access to the SQL Server box for my application.

Admin: Well, it is against our policy to open up the server directly. But what do you want to do?

Pinal: Ok. I am not sure what is going wrong on your server. Our application was running fine till about 2 weeks back. And now users are complaining about slowness in performance. I wanted to know what has changed in this timeframe all of a sudden.

Admin: Oh yeah, I forgot to tell you. We are in the process of doing server consolidation and are moving a lot of our workloads from physical to virtual environments. And I know we did move your database into a virtual environment.

Pinal: Cool. Did the configurations stay the same when you moved these servers?

Admin: We made sure they are identical. So please don’t worry, Pinal. It is most likely something in your application code.

Pinal: Hmmm… sure, let me check. Thanks for your time.

I have always been a believer in “accept, but validate.” Nothing confirms a hypothesis like doing the analysis yourself. So I bade goodbye to the systems admin and resumed my conversation with the DBA. I turned around and asked, “What do you use to monitor your SQL Server environments, my friend?” He said they use Spotlight on SQL Server. I felt relieved that they had something.

Spotlight on SQL Server to the rescue

Having some monitoring tool is essential when working with SQL Server. Though there are a number of them, I have had the opportunity to look at Spotlight on SQL Server in detail in the past. It surely has come a long way and has nifty features that are worth noting.

In our conversation above, I was sure there was something fishy about the configuration the systems admin had given to SQL Server, and I wanted to see the complete hardware configuration of our database server.

The first report I pulled up as part of my diagnosis was “Hardware Configurations”. Voila! I had my answer right in front of my eyes, and I was delighted that my work was complete within 15 minutes of arriving at the organization.

In the previous (physical) configuration, they used an eight-processor machine for the application’s database server. As part of the consolidation, the network team accidentally made it a single-CPU machine, which deprived the application of essential CPU cycles. So with a single click I was able to get my answer. I didn’t need to look further, because all of the physical hardware configuration was laid out in black and white on this dashboard.

In fact, if we use SQL Server Management Studio’s dashboard, we can get limited information, but nothing as exhaustive as what I got to see from the data collection in Spotlight on SQL Server. The Server Dashboard report does have the “# of CPUs” value embedded in it.
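
Even without any tool, a quick sanity check of what SQL Server can actually see is possible from a DMV; a minimal sketch (assuming VIEW SERVER STATE permission):

SELECT cpu_count,          -- logical CPUs visible to SQL Server
       hyperthread_ratio,  -- logical-to-physical core ratio
       scheduler_count     -- schedulers SQL Server is actually using
FROM sys.dm_os_sys_info;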

Conclusion

A lot of times we get misdirected by the problem at hand, and this sends us into a spiral of wrong diagnoses. Don’t always work on the symptoms of the problem. Try to find all the data points that lead to the problem and work your way toward solving it. In the example above, the problem was never the performance of our application. It was a wrong configuration and a wrong migration done by the network team. It is essential that we are aware of these pitfalls.

Try Spotlight on SQL Server free for 30 days!

Spotlight on SQL Server continues to break new ground when it comes to providing DBAs with the ultimate in flexibility to suit their specific needs. Building on the success of the wait stat analysis workflow, Spotlight on SQL Server version 11.0 now includes multi-dimensional workload analysis to provide maximum insight into performance issues, an enhanced mobile capability to remotely diagnose issues directly from mobile phones, and the introduction of SNMP traps. A fresh new look provides simple-to-use, powerful visualizations to effectively monitor the health of your SQL Server environment.

If you’re looking for additional help on how to optimize your SQL Servers, visit Dell Software’s SQL Server Resource Center for videos, guides, and other useful content.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – How to use Procedure sp_user_counter1 to sp_user_counter10

There are many performance counters available in SQL Server which can be used to monitor various parameters of the SQL Server engine. Have you ever been in a situation where you want to see the value of a performance counter for a query you have written? Imagine a situation where you want to look at fragmentation or the number of rows in a table over a period of time, and you want to plot those values against the standard performance counters available.

SQL Server has had this feature for a long time. I noticed it recently while working with one of my friends. He was showing me the impact of the number of databases versus the number of threads in an AlwaysOn availability group. He showed me a graph, and my first question was: how did you plot the number of databases as a counter? The answer was: using a user counter. I did further research, learned something new, and am sharing it with you in the form of this blog.

There are ten system procedures available in SQL Server which allow a user to set specific counter values. They are sp_user_counter1, sp_user_counter2, … through sp_user_counter10. That is why they are called user settable counters. Below is a screenshot of Performance Monitor (Start > Run > PerfMon).

These counters can be populated using stored procedures. The counter names and stored procedure names match. This means the value for “User counter 1” can be set using sp_user_counter1, the value for “User counter 2” can be set using sp_user_counter2, and so on.

Here is a quick example to demonstrate the usage.
-- Drop the database, if it is already present
USE master
GO
IF EXISTS (SELECT * FROM sys.databases WHERE name = 'SQLAuthority')
BEGIN
    ALTER DATABASE SQLAuthority SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    DROP DATABASE SQLAuthority;
END
GO
-- Create a new database
CREATE DATABASE SQLAuthority
GO
-- Create the objects needed for the demo
USE SQLAuthority
GO
CREATE TABLE TrackMe (i INT)
GO
-- WHILE loop to populate the data and the counters
SET NOCOUNT ON
DECLARE @NumberOfRows INT = 0, @NumberOfRows_2 INT = 0
WHILE (@NumberOfRows < 25)
BEGIN
    INSERT INTO TrackMe VALUES (1), (2), (3)
    SELECT @NumberOfRows = COUNT(*) FROM TrackMe
    EXECUTE sp_user_counter1 @NumberOfRows      -- 3, 6, 9, 12...
    WAITFOR DELAY '00:00:02'
    SELECT @NumberOfRows_2 = @NumberOfRows * 2
    EXECUTE sp_user_counter2 @NumberOfRows_2    -- 6, 12, 18, 24...
    WAITFOR DELAY '00:00:02'
END
-- Loop finished; reset the counters
EXECUTE sp_user_counter1 0
EXECUTE sp_user_counter2 0

Here is the Performance Monitor graph where I captured the counters.

Let me explain a little of the logic of the script. The script plots the number of rows in the table as counter 1 and double that number as counter 2. There is a delay of two seconds between each counter update so that we can see the steps. Once the number of rows in the table crosses 25 (the WHILE condition), the loop ends and the values are reset to zero. That is the drop you see at the end.

If you look at the procedure itself, it is straightforward. Believe me, here is the code.

CREATE PROCEDURE sys.sp_user_counter1 @newvalue INT
AS
DBCC setinstance ('SQLServer:User Settable', 'Query', 'User counter 1', @newvalue)

This means we can also call the undocumented command DBCC SETINSTANCE directly to set a value for the counters.
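
As an illustration (the counter number and value here are arbitrary), setting and resetting a counter directly with the undocumented command would look something like this:

-- Set "User counter 3" to 42 directly, then reset it to zero
DBCC setinstance ('SQLServer:User Settable', 'Query', 'User counter 3', 42)
GO
DBCC setinstance ('SQLServer:User Settable', 'Query', 'User counter 3', 0)
GO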

Can you think of a more innovative use for plotting values in Performance Monitor?

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Performance Counter Missing: How to Get Them Back?

Among the thousands of mails I receive every day about SQL Server problems, I was recently pinged by a friend who reported a weird one. He started with a simple question: he wanted to monitor SQL Server performance counters for a complete day to send a report back to his manager. That was an easy one, and I asked him to capture SQL Server performance counter data using the Performance Monitor tool (PerfMon.exe). I thought we were done, and I was about to close the chat window when he asked a second question. This was a tougher one, I thought: “I am not seeing any performance counters for my instance.”

I asked him to send a screenshot and the instance details. Here is how his PerfMon counters screen looked (Start > Run > Perfmon.exe, then right-click and choose “Add Counters…”).

Since we are dealing with a default instance of SQL Server, we should see “SQLServer:Access Methods” as the first counter object (they are alphabetical). This got me curious, and I really wanted to understand why this could ever happen. I politely asked my friend if he was willing to do a screen-sharing session at a later date.

I looked at the SQL Server ERRORLOG file first, and there was nothing interesting in it. I then asked him to query sys.dm_os_performance_counters to check if the counter values were there. Here is the screenshot.

We could see that the counters were available within the SQL Server engine but were not shown in the Performance Monitor tool.
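
A minimal version of that check (a sketch, assuming VIEW SERVER STATE permission) could look like this; if it returns rows, the engine is still publishing the counters internally:

SELECT TOP (10) object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Access Methods%';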

Next, I asked him to look at the registry key below:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\MSSQLSERVER\Performance

We were not seeing as many registry values as on my system.

Knowing something had gone terribly wrong, the only option left at this point was to reload the counters. Here are the commands.

To unload the counters:

Default Instance: unlodctr MSSQLSERVER

Named Instance: unlodctr MSSQL$<InstanceName>

To load the counters, we can look at the same key and read the value of “PerfIniFile”, which is “perf-MSSQLSERVERsqlctr.ini” in the above screenshot. The file is located under the BINN folder. On my machine, it is “E:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Binn”.

To load the counters:

Default Instance:

lodctr “E:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Binn\perf-MSSQLSERVERsqlctr.ini”

 

For a named instance, we need to check the file name and path and run the command below (my machine has a named instance of SQL Server 2014 called SQL2014):

lodctr “E:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\Binn\perf-MSSQL$SQL2014sqlctr.ini”

Once that is done, we should be able to see the counters (shown below).

A big sigh of relief: this was a great learning and sharing experience for me, and I was able to help my friend. I am sure this will be helpful to you too if you ever encounter this situation. Do let me know.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Search Records with Single Quotes – SQL in Sixty Seconds #075

Earlier I wrote a two-part series of blog posts about searching records with single quotes. Refer to those blog posts here.

Though I had clarified that all the methods displayed in those two blog posts have exactly the same performance, I kept getting questions on this subject, again and again! Well, as there are so many questions, I have decided to create a quick video which demonstrates that there is no performance difference among the four methods I showed earlier.
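
For context, two of the equivalent ways of searching for a single quote inside a value look something like this (the table and column names are hypothetical, for illustration only):

-- Escape the quote by doubling it inside the string literal
SELECT FirstName FROM Person WHERE FirstName LIKE '%''%';

-- Use CHAR(39), the character code for the single quote
SELECT FirstName FROM Person WHERE CHARINDEX(CHAR(39), FirstName) > 0;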

Action Item

Here are the blog posts I have previously written. You can read them here:

You can subscribe to my YouTube Channel for frequent updates.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Do You Know Your Data’s Classification? – Notes from the Field #050

[Notes from Pinal]: Data is a very simple word, yet it is very powerful. There is a famous saying: know your data. I have quite often found that developers do not know their data; they are often confused by it and not sure how to answer questions about it. My friend Tim Radney is an amazing person who usually has an answer to each of those questions which look simple but are complicated in reality.

Linchpin People are database coaches and wellness experts for a data-driven world. In this 50th episode of the Notes from the Field series, database expert Tim Radney (partner at Linchpin People) explains data classification in very simple words. Please follow Tim on Twitter at @tradney.


Do you know your data’s classification?

As data professionals, we have many responsibilities. We could be responsible for backing up and restoring data, writing reports, building queries, writing stored procedures, tuning workloads, or any of a vast number of other tasks.

If you are in the business of granting access to the data either through reports/queries or provisioning login access, you should be aware of the type of data you are granting access to. Your company likely has policies in place that should guide how access to certain classifications of data should be handled.

Some of the more common types of data that your company would have stricter controls over would be related to PII, PCI, SOX, GLBA, or HIPAA. You should know what the guidelines are within your company for access to this data and help make sure that those standards are being upheld.

These data assets may require additional auditing of who has access to view, print, export, etc. When the data is viewed, it may require watermarks or headers/footers on any reports.

Your organization may require certain types of data, such as PCI data, to be audited to track who is viewing certain elements of it.

Worse, your organization may need to be doing these things but is not. If so, you should be asking why and helping to implement a data governance plan.

If you want me to take a look at your server and its settings, or if your server is facing any issues, we can Fix Your SQL Server.

Note: Tim has also written an excellent book on SQL Backup and Recovery, a must have for everyone.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Row Offset in SQL Server For Different Version

A very common question is how to find a row offset in SQL Server. Here are a few examples based on different versions of SQL Server. I have included SQL Server 2000 as well, even though it is an almost 15-year-old product and I encounter it less and less on production servers.

I have used the AdventureWorks database for the examples.

USE AdventureWorks2014
GO
--------------------------------------------------
-- SQL Server 2012/2014
--------------------------------------------------
DECLARE @RowsPerPage INT = 10, @PageNumber INT = 6
SELECT SalesOrderDetailID, SalesOrderID, ProductID
FROM Sales.SalesOrderDetail
ORDER BY SalesOrderDetailID
OFFSET
(@PageNumber-1)*@RowsPerPage ROWS
FETCH NEXT @RowsPerPage ROWS ONLY
GO
--------------------------------------------------
-- SQL Server 2008 / R2
-- SQL Server 2005
--------------------------------------------------
DECLARE @RowsPerPage INT = 10, @PageNumber INT = 6
SELECT SalesOrderDetailID, SalesOrderID, ProductID
FROM (
SELECT SalesOrderDetailID, SalesOrderID, ProductID,
ROW_NUMBER() OVER (ORDER BY SalesOrderDetailID) AS RowNum
FROM Sales.SalesOrderDetail ) AS SOD
WHERE SOD.RowNum BETWEEN ((@PageNumber-1)*@RowsPerPage)+1
AND @RowsPerPage*(@PageNumber)
GO
--------------------------------------------------
-- SQL Server 2000
--------------------------------------------------
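-- Note: the variable form TOP (@RowsPerPage) used below requires SQL Server
-- 2005 or later; on an actual SQL Server 2000 instance, substitute literal
-- values (or use SET ROWCOUNT) for the TOP expressions.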
DECLARE @RowsPerPage INT
DECLARE @PageNumber INT
SET @RowsPerPage = 10
SET @PageNumber = 6
SELECT SalesOrderDetailID, SalesOrderID, ProductID
FROM
(
SELECT TOP (@RowsPerPage)
SalesOrderDetailID, SalesOrderID, ProductID
FROM
(
SELECT TOP ((@PageNumber)*@RowsPerPage)
SalesOrderDetailID, SalesOrderID, ProductID
FROM Sales.SalesOrderDetail
ORDER BY SalesOrderDetailID
) AS SOD
ORDER BY SalesOrderDetailID DESC
) AS SOD2
ORDER BY SalesOrderDetailID ASC
GO

The results of the above queries are identical to each other.

This blog post is based on an earlier blog post; a few errors from that post have been corrected here.

Reference: Pinal Dave (http://blog.sqlauthority.com)