SQL SERVER – Puzzle – Working with functions to Concatenate columns

Every now and then I get pulled into code review sessions, and those are some of the most fun times I ever have working with T-SQL of any sort. This blog is inspired by one such time, when at a customer location I saw a lengthy piece of code in which close to 15 fields were concatenated to form a single column in the output.

I generally don’t get into the reasoning behind such code, but what was interesting is that they were using ISNULL in front of every column because they were not sure which column values might be NULL, and a single NULL would cause the whole concatenated value to be returned as NULL. To give you an idea, here is a mini-repro of the same problem.

CREATE TABLE #temp (
emp_name NVARCHAR(200) NOT NULL,
emp_middlename NVARCHAR(200) NULL,
emp_lastname NVARCHAR(200) NOT NULL
);
INSERT INTO #temp VALUES( 'SQL', NULL, 'Authority' );
INSERT INTO #temp VALUES( 'SQL', 'Server', 'Authority' );
SELECT emp_name + emp_middlename + emp_lastname AS Result
FROM #temp;
DROP TABLE #temp;
GO

Now, you can see that the default output will look like this:

[Image: query output showing NULL in the first row]

As you can see, the problem is that the first row returns NULL, because concatenating any value with NULL yields NULL. Customers sometimes don’t want this behavior.

Question for the Day

What would you do in this situation? Would you add an ISNULL in front of every column because you are not sure which of these might be NULL? Can you think of a solution without using the ISNULL function?

Give me your solution in the comments. It is very easy – trust me. In my example, the customer was also trying to concatenate various data types into a single column.
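
For readers who want a hint before jumping to the comments, here is one possible direction (not necessarily the only answer): the CONCAT function, available from SQL Server 2012 onwards, treats NULL arguments as empty strings and implicitly converts non-string arguments, which also helps with the mixed data types mentioned above.

-- One possible solution: CONCAT treats NULL as an empty string
-- (run this before the DROP TABLE statement in the repro above)
SELECT CONCAT(emp_name, emp_middlename, emp_lastname) AS Result
FROM #temp;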

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Using Project Connections in SSIS – Notes from the Field #088

[Notes from Pinal]: SSIS is a very well-explored subject; however, there are so many interesting elements that whenever we read about it, we learn something new. One such concept is using project connections in SSIS.

Linchpin People are database coaches and wellness experts for a data driven world. In this 88th episode of the Notes from the Field series, database expert Tim Mitchell (partner at Linchpin People) shares a very interesting conversation about how to use project connections in SSIS.


In SQL Server Integration Services, connection managers are used as gateways for most any external read and write operation. Connection managers are type- and format-specific, and in the case of relational database connection managers, they are usually specific to vendor (Oracle, DB2, etc.) as well.

In most use cases, the same connection will be used across multiple packages in the same project. In pre-2012 versions of SSIS, each package would have its own connection manager for every connection used in that package. Creating and maintaining all those connection managers could be time-consuming as the number of packages grows. In SQL Server 2012, Microsoft added project connections to SSIS, allowing for the creation of connections that were accessible across all packages in a project. Instead of having to create a copy of each connection manager in every package, developers can now simply create the connection at the project level. Project connections will automatically show up in the connection manager tray for all packages in that project.

[Image: project connection shown in the connection manager tray]

As shown, any project connection automatically has the designation (project) prepended to the name to clearly indicate that it is a project connection. Those without this designation are package connections, and are only accessible from within that package.

Project connections will also appear in the Solution Explorer window, under the Connection Managers tab.

[Image: project connections under the Connection Managers node in Solution Explorer]

You can create a new project connection by right-clicking on the Connection Managers node shown above, and walking through the steps to build the connection. Similarly, you can edit or delete an existing project-level connection manager from this same window.

You can also promote an existing package connection to a project connection by right-clicking on the package connection and selecting Convert to Project Connection.

[Image: Convert to Project Connection option on the context menu]

Conversely, you can also convert a project connection back to a package connection through a similar process. Right-clicking on a project connection will expose an option to Convert to Package Connection. However, you have to be careful when choosing this option. If you convert a project connection to a package connection, that connection will then be visible only in the package in which you are currently working. If you have used the connection in any other packages in that project, those operations will fail because the connection is no longer visible at the project level. You will get a warning message when you attempt to convert a project connection to a package connection.

[Image: warning message shown when converting a project connection to a package connection]

Finally, if you are using project connections, you can still use dynamic properties such as expressions and parameters. Do be aware that, if you use parameters to configure a project connection, you must use project parameters rather than package parameters. The latter is not accessible beyond the scope of a single package, and therefore would not always be accessible for project connections. Fortunately, the UI for the expression builder limits you to only project parameters when configuring project connections.

[Image: expression builder limited to project parameters for a project connection]

In conclusion, the project connection in SSIS is an excellent tool for configuring connections at the project level to minimize the extra work required for sharing connections across multiple packages in a project.

If you want me to take a look at your server and its settings, or if your server is facing any issue we can Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – 2016 FIX: Install – Rule “Oracle JRE 7 Update 51 (64-bit) or higher is required” failed

I enjoy spending time with my daughter almost every single day. It keeps me busy at home and, most importantly, some quality time with her allows me to relax after a long day at work. But there is one thing that has caught my attention every single day I have been with her: she is curious about every single thing that she sees. With every new release of SQL Server, I get into the same mode or feeling. I become a child, as there is so much to explore and learn.

I heard about a lot of new features in SQL Server 2016. So I downloaded the media from the Microsoft site and tried installing it on my VM. I selected all the features and moved forward. This is common for me, because I don’t want to miss out on anything that gets introduced in the new release. Later, during the installation process, I hit an error and was not able to proceed.

Here is the Rule check failure.

[Image: rule check failure during SQL Server 2016 setup]

I was wondering what part of SQL Server required us to have the Oracle JRE. If I click the hyperlink on the Failed status, I get the below:

[Image: Rule Check Result dialog]

Here is the text of the message:

—————————
Rule Check Result
—————————
Rule “Oracle JRE 7 Update 51 (64-bit) or higher is required” failed.
This computer does not have the Oracle Java SE Runtime Environment Version 7 Update 51 (64-bit) or higher installed. The Oracle Java SE Runtime Environment is software provided by a third party. Microsoft grants you no rights for such third-party software. You are responsible for and must separately locate, read and accept applicable third-party license terms. To continue, download the Oracle SE Java Runtime Environment from http://go.microsoft.com/fwlink/?LinkId=526030.
—————————
OK
—————————

NOTE: On a little exploration, I found that you can skip this error if you don’t want to use the PolyBase feature. You can go back and uncheck the highlighted feature to avoid this check.

[Image: PolyBase highlighted in the feature selection screen]

If you want to use PolyBase, then to resolve the above error you can follow the steps below:

  1. Go to http://www.oracle.com/technetwork/java/javase/downloads/index.html
  2. Click the Download button under JRE, as highlighted below

[Image: JRE download button on the Oracle site]

  3. Then click on “Accept License Agreement” as shown below

[Image: Accept License Agreement option]

  4. Once accepted, download the one which says Windows x64, because my operating system is 64-bit Windows. Note that the version (8 update 45 at the time of writing) will change in the future, but the steps remain the same.

[Image: Windows x64 JRE download link]

  5. After installing it, I was able to move forward and install SQL Server 2016 (see the quick verification check below).
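
Once setup completes, you can confirm that the PolyBase feature actually made it onto the instance with a quick check; SERVERPROPERTY exposes this in SQL Server 2016:

-- Returns 1 if the PolyBase feature is installed on this instance
SELECT SERVERPROPERTY('IsPolyBaseInstalled') AS IsPolyBaseInstalled;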

Hope this helps those trying to install SQL Server 2016 with the PolyBase feature. I am sure you will be able to solve this.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Identifying Blocking Chain Using SQL Scripts

A number of blog posts have been written on this very topic, ranging from the basics to fairly advanced techniques.

If those were not enough, I wanted to play around with some SQL scripts for this basic blocking behavior. The most basic tools I have been using, and which many DBAs use, include Activity Monitor, sp_who2, sysprocesses, etc.

Recently I saw a script written by a DBA friend:

SELECT * FROM sys.dm_os_waiting_tasks
WHERE blocking_session_id IS NOT NULL
GO

This shows rows like the following:

[Image: sys.dm_os_waiting_tasks output showing blocked sessions]

This is so cool because I know who is waiting for whom. In the above image, session 53 is waiting for session 68, while session 79 is waiting for session 53. This was a cool way to look at things.
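
Along the same lines, sys.dm_exec_requests exposes the blocker for currently executing requests as well; here is a minimal variation if you prefer that DMV:

-- Sessions that are currently blocked, with their blocker and wait details
SELECT session_id, blocking_session_id, wait_type, wait_time
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;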

I wanted to show the same data in a slightly different way using T-SQL. So I wrote a Blocking Tree T-SQL script.

SET NOCOUNT ON
GO
-- Capture every session's SPID, its blocker and the batch text (newlines flattened)
SELECT SPID, BLOCKED,
REPLACE(REPLACE(T.TEXT, CHAR(10), ' '), CHAR(13), ' ') AS BATCH
INTO #T
FROM sys.sysprocesses R
CROSS APPLY sys.dm_exec_sql_text(R.SQL_HANDLE) T
GO
WITH BLOCKERS (SPID, BLOCKED, LEVEL, BATCH)
AS
(
-- Anchor: heads of the blocking chains - sessions that are not blocked
-- themselves but are blocking at least one other session
SELECT SPID,
BLOCKED,
CAST(REPLICATE('0', 4 - LEN(CAST(SPID AS VARCHAR))) + CAST(SPID AS VARCHAR) AS VARCHAR(1000)) AS LEVEL,
BATCH
FROM #T R
WHERE (BLOCKED = 0 OR BLOCKED = SPID)
AND EXISTS (SELECT * FROM #T R2 WHERE R2.BLOCKED = R.SPID AND R2.BLOCKED <> R2.SPID)
UNION ALL
-- Recursive part: sessions blocked by a member of the chain; LEVEL encodes
-- the path down the tree as a string of zero-padded SPIDs
SELECT R.SPID,
R.BLOCKED,
CAST(BLOCKERS.LEVEL + RIGHT(CAST((1000 + R.SPID) AS VARCHAR(100)), 4) AS VARCHAR(1000)) AS LEVEL,
R.BATCH
FROM #T AS R
INNER JOIN BLOCKERS ON R.BLOCKED = BLOCKERS.SPID
WHERE R.BLOCKED > 0 AND R.BLOCKED <> R.SPID
)
-- Render the tree: indentation depth is derived from the length of LEVEL
SELECT N'    ' + REPLICATE(N'|         ', LEN(LEVEL)/4 - 1) +
CASE WHEN (LEN(LEVEL)/4 - 1) = 0
THEN 'HEAD -  '
ELSE '|------  ' END
+ CAST(SPID AS NVARCHAR(10)) + N' ' + BATCH AS BLOCKING_TREE
FROM BLOCKERS
ORDER BY LEVEL ASC
GO
DROP TABLE #T
GO

The output would look like:

[Image: blocking tree output in SSMS]

This is a cool way to look at the same blocking data inside SSMS. I have kept the script simple, and I am sure it can be beautified further with your help. So if you use this and modify it, do post your version in the comments too. That is a great way to enhance the Blocking Tree script.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Creating Dataset Using VALUES Clause Without Creating A Table

You know that you can add data to a table using the VALUES clause. But did you know that you can use the VALUES clause to create a dataset that behaves like a table, without inserting it into another table?

Suppose you want to create a data set with two columns named a and b

SELECT *
FROM (VALUES (1,2)) AS t(a,b)

If you execute the above query, you get the following resultset

[Image: resultset with a single row and columns a and b]

If you want multiple rows, you can add rows separated by commas, as shown below:

SELECT *
FROM (VALUES (1,2),(3,4),(340,455)) AS t(a,b)

The result is

[Image: resultset with three rows]

This is very handy if you want to have a small lookup table that can be matched with another table, as in the sketch below.
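
Here is a minimal sketch of that lookup pattern; the dbo.Orders table and its StatusID column are hypothetical stand-ins:

-- Hypothetical example: translate status codes via a derived lookup table
SELECT o.OrderID, s.StatusName
FROM dbo.Orders AS o
INNER JOIN (VALUES (1, 'Open'),
(2, 'Shipped'),
(3, 'Closed')) AS s(StatusID, StatusName)
ON o.StatusID = s.StatusID;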

Please note that this method of using the VALUES clause works from SQL Server 2008 onwards.

Reference: Pinal Dave (http://blog.sqlauthority.com)

Interview Question of the Week #027 – Move TempDB from One Drive to Another Drive When Low Disk Space

I am often invited to attend various interviews, and once in a while I see practical questions discussed in the interview. Here is one of those important interview questions, related to TempDB.

Question – “What will you do when your log file of TempDB is full?”

Answer – The best solution when the log file of TempDB is full is to move it to a new drive. Here are more details about the same.

Make sure that TempDB is set to autogrow and do not set a maximum size for TempDB. If the current drive is too full to allow autogrow events, then arrange a bigger drive, or add files to TempDB on another device (using ALTER DATABASE, as described below) and allow those files to autogrow.

Move TempDB from one drive to another drive. There are two major reasons why TempDB may need to move from one drive to another:
1) TempDB grows big and the existing drive does not have enough space.
2) Moving TempDB files to a different physical drive helps to improve disk reads, as the files can be read simultaneously.
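
As an aside, if you take the add-a-file route mentioned above instead of moving TempDB, a minimal sketch would look like this (the path, size, and growth values here are illustrative):

-- Add an extra TempDB data file on another drive (illustrative values)
ALTER DATABASE TempDB
ADD FILE (NAME = tempdev2, FILENAME = 'd:\data\tempdb2.ndf',
SIZE = 16MB, FILEGROWTH = 10%);
GO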

Follow the directions below exactly to move the database and log from one drive (C:) to another drive (D: and E:).

Open Query Analyzer and connect to your server. Run this script to get the names of the files used for TempDB.

USE TempDB
GO
EXEC sp_helpfile
GO

Results will be something like:
name     fileid  filename                                                      filegroup  size
-------  ------  ------------------------------------------------------------  ---------  --------
tempdev  1       C:\Program Files\Microsoft SQL Server\MSSQL\data\tempdb.mdf   PRIMARY    16000 KB
templog  2       C:\Program Files\Microsoft SQL Server\MSSQL\data\templog.ldf  NULL       1024 KB

These will appear along with other information related to the database. The names of the files are usually tempdev and templog by default, and these names are used in the next statement. Run the following code to move the mdf and ldf files.

USE MASTER
GO
ALTER DATABASE TempDB MODIFY FILE
(NAME = tempdev, FILENAME = 'd:\data\tempdb.mdf')
GO
ALTER DATABASE TempDB MODIFY FILE
(NAME = templog, FILENAME = 'e:\data\templog.ldf')
GO

The definition of TempDB is now changed. However, no changes are made to TempDB until SQL Server restarts. Please stop and restart SQL Server; it will create the TempDB files in their new locations.
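
After the restart, you can verify the new file locations with a quick catalog query:

-- Confirm the new physical locations of the TempDB files
SELECT name, physical_name
FROM sys.master_files
WHERE database_id = DB_ID('tempdb');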

You can read more about this over here.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Secure by Default – The Mantra

Generally the talks I give at conferences revolve around performance and tuning. These are always the most sought-after topics, and I don’t disappoint the requests because they give me an opportunity to learn something new too. Last month, when I visited Sri Lanka for the first ever SQLSaturday event there, the response was the same. We had a fantastic Q&A discussion which involved some random questions from attendees on a variety of topics. I think every conference should have something like this “Ask the Experts” concept. It gets everyone together and we have a combined fun time, to say the least.

One of the topics an attendee asked about was security. “Hey Pinal, can you recollect some top 10 things we need to take care of as security practices when working with SQL Server?”. It was surely a loaded question, and I was not prepared, to say the least. I said, “Give me a breather, buddy. I can actually do a complete end-to-end session when I come next time if you are interested. (This gave me some seconds to gather my thoughts.) Now that you asked, let me throw some of them out for your reference.”

The session got over, and the gentleman who had asked the question sent me a great summary of the conversation during the Q&A session. I thought I would post it as a blog post so that we can elaborate on these points in the future. They are great starters for reference.

  • Application Layer Security: SQL Injection is a powerful method to damage or compromise a SQL Server. SQL Injection is the result of failing to validate user input at the application layer. So please validate all the user inputs every single time.
  • Be strict about what represents valid input and reject everything else.
  • Use parameterized queries and avoid string concatenation to build queries (see the sketch after this list).
  • Do not divulge too much information to the end user. The end user may be an attacker. Let your error message not reveal the exact problem.
  • Connect to the server using a least privilege account. Do not connect with System Admin privileges. I know many of you are doing this today, but please refrain henceforth.
  • Secure the communication channel between the client and SQL Server. SSL encryption should be mandated for client-server communication if this is critical. SSL is provided through the use of certificates issued by a Certificate Authority (CA). SSL encryption ensures the packets between the client and server are encrypted and mitigates the detection of user credentials or data by a network sniffer.
  • Protect the SQL Server files on the physical level. The directory structure for SQL Server should be restricted to the SQL Server service and relevant accounts.
  • Protect the permissions for the registry entries for SQL Server. They should be restricted to Administrators and the SQL Server account.
  • Audit Logins by default. Logins should be audited at the Windows and SQL Server level. At the least, failed login attempts should be logged as a flag to possible intrusion attempts.
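
To make the parameterized-query point concrete, here is a minimal sketch; the dbo.Users table and the input value are hypothetical:

-- Unsafe pattern (for contrast): concatenating user input into dynamic SQL
-- SET @sql = N'SELECT * FROM dbo.Users WHERE name = ''' + @userInput + '''';
-- Safer: parameterized dynamic SQL via sp_executesql
DECLARE @userInput NVARCHAR(100) = N'O''Brien'; -- hypothetical user input
DECLARE @sql NVARCHAR(MAX) = N'SELECT * FROM dbo.Users WHERE name = @name';
EXEC sp_executesql @sql, N'@name NVARCHAR(100)', @name = @userInput;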

Tips shared about SQL Account Management

  • Make the SQL Server accounts least privileged.
  • Delete or disable any unused user accounts.
  • Disable the Windows guest account (this should be automatic for a server).
  • Enforce strong password policies. This is normally controlled via Group Policy.
  • Restrict remote logins to the server.
  • Rename the administrator account to blunt attacks that target well-known account names. Accounts like “sa” must be disabled or renamed at a minimum.

Seeing the mail, I thought this was a great starting point for talking about security and SQL Server. I expect this blog to be the first of many on this topic, because I am sure a number of you will also be interested in it. Do let me know some of the things you do in your environments so that we can share and learn from each other.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Difference Between Read Committed Snapshot and Snapshot Isolation Level

It is quite common that when a new feature gets introduced, there is a lot of talk about it. Sometimes even older capabilities are lesser known because the terms get used interchangeably in conversations. I have seen many DBAs getting confused between these two terms because both of them have the word snapshot in them.

  1. Read Committed Snapshot
  2. Snapshot Isolation level

I must point out that the first one is NOT an isolation level. It is a behavior of the read committed isolation level, which gets activated only if we turn on a database-level property. Here are the commands to change them.

  1. ALTER DATABASE SQLAuthority SET READ_COMMITTED_SNAPSHOT ON
  2. ALTER DATABASE SQLAuthority SET ALLOW_SNAPSHOT_ISOLATION ON
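
To verify where a database currently stands on both settings, a quick catalog check works (using the SQLAuthority database from the demo below):

-- Check the current state of both snapshot-related settings
SELECT name, is_read_committed_snapshot_on, snapshot_isolation_state_desc
FROM sys.databases
WHERE name = 'SQLAuthority';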

If you are having problems with blocking between readers (SELECT) and writers (INSERT/UPDATE/DELETE), then you can enable the first property without changing anything in the application. This means the application would still run under read committed isolation and would still read only committed data.

Let us have a look at this with an example:

Data Preparation

SET NOCOUNT ON
GO
USE MASTER
GO
IF DB_ID('SQLAuthority') IS NOT NULL
BEGIN
ALTER DATABASE SQLAuthority SET SINGLE_USER WITH ROLLBACK IMMEDIATE
DROP DATABASE SQLAuthority
END
GO
CREATE DATABASE SQLAuthority
GO
USE SQLAuthority
GO
CREATE TABLE DemoTable(
i INT
,j VARCHAR(100)
)
GO
INSERT INTO DemoTable VALUES (1, 'ONE')
GO
INSERT INTO DemoTable VALUES (2, 'TWO')
GO
INSERT INTO DemoTable VALUES (3, 'THREE')
GO

Read Committed (Default Behavior)

Session 1

USE SQLAuthority
GO
BEGIN TRAN
UPDATE DemoTable
SET i = 4
WHERE i = 1

Session 2

USE SQLAuthority
GO
BEGIN TRAN
SELECT *
FROM DemoTable
WHERE i = 1

Result – Query in Session 2 is blocked.

Session 1

ROLLBACK

Result – The query in Session 2 is unblocked and shows results. It shows the last committed data; since we did a rollback in Session 1, we see the original values (1, ONE).

Read Committed Snapshot

Change the database property as below

ALTER DATABASE SQLAuthority
SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE
GO

Session 1

USE SQLAuthority
GO
BEGIN TRAN
UPDATE DemoTable
SET i = 4
WHERE i = 1

Session 2

USE SQLAuthority
GO
BEGIN TRAN
SELECT *
FROM DemoTable
WHERE i = 1

Result – The query in Session 2 shows the old value (1, ONE) because the current transaction is NOT committed. This is the way to avoid blocking while still reading committed data.

Session 1

COMMIT

Session 2

USE SQLAuthority
GO
SELECT *
FROM   DemoTable
WHERE i = 1

Result – The query in Session 2 shows no rows because the row was updated in Session 1. So again, we are seeing committed data.

Snapshot Isolation Level

This is a new isolation level, available from SQL Server 2005 onwards. To use this feature, a change is needed in the application, as it has to explicitly request the new isolation level.

Change the database setting using the command below. We need to make sure that there is no active transaction in the database.

ALTER DATABASE SQLAuthority SET ALLOW_SNAPSHOT_ISOLATION ON

Now, we also need to change the isolation level of the connection, as shown in Session 2 below.

Session 1

USE SQLAuthority
GO
BEGIN TRAN
UPDATE DemoTable
SET i = 10
WHERE i = 2

Session 2

SET TRANSACTION ISOLATION LEVEL SNAPSHOT
GO
USE SQLAuthority
GO
BEGIN TRAN
SELECT *
FROM DemoTable
WHERE i = 2

Result – Even though we have changed the value to 10, we will still see the old record in Session 2 (2, TWO).

Now, let’s commit the transaction in Session 1.

Session 1

COMMIT

Let’s come back to Session 2 and run the SELECT again.

Session 2

SELECT *
FROM   DemoTable
WHERE i = 2

We will still see the old record because Session 2 has started the transaction with snapshot isolation. Until we complete the transaction, we will not see the latest record.

Session 2

COMMIT
SELECT *
FROM DemoTable
WHERE i = 2

Now, we should not see the row, as it has already been updated.

In summary, from SQL Server 2005 onwards the engine provides one new isolation level plus an optimistic implementation of READ COMMITTED. SNAPSHOT is a genuinely new isolation level, whereas READ COMMITTED SNAPSHOT is the same isolation level as READ COMMITTED, just implemented optimistically using row versioning.

I would be glad and happy to know the scenarios you have used to play around with Snapshot isolation for your servers. Do let me know via comments below.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Why Haven’t You Disabled CPU Power Saving Yet?! – Notes from the Field #087

[Note from Pinal]: This is the 87th episode of the Notes from the Field series. Every week, I personally wait for the notes from the field from Mike because they contain lessons of life which directly impact DBAs’ and developers’ lives positively. Mike Walsh is an amazing person to meet and know. I am fortunate to be his friend. Every single time I meet him, I learn something as a human. He is always looking for an opportunity to improve the life of DBAs and developers.

Mike normally talks about something that impacts a client, impacts DBAs, or a professional development topic. However, in today’s blog post Mike has something very interesting and very helpful to every DBA in the world.

Read the entire story in his own words.


Today’s post is going to be a very short public service announcement that I shouldn’t have to make, but part of my job as a consultant at Linchpin People is going to customers and performing WellDBA exams. A WellDBA exam is basically an extensive health check of a SQL Server environment. We are looking for small and simple changes to help performance, reliability or manageability. We are looking for deeper changes that could be made to help in these areas, and we are helping clients see what sort of architecture and foundational decisions should be addressed to scale into the future.

We have a free checklist you can use to check your own servers also. It describes some of our most common findings and helps you get on the road to having a healthy and a scalable SQL Server environment. You can download it and review your own environment to make sure you are following best practices.

Power Saving?! Hasn’t the SQL Family Talked About That A Lot?!?

Yes! There are posts everywhere about it and why it isn’t great for SQL Servers. Even still – it is something we find on at least 70% of the WellDBA exams we provide to customers. These are customers of all sorts and sizes with all manner of DBA teams or accidental DBAs at the helm.

So… Yes. A reminder about this setting.

What’s This All About?

Processors today have a Power Saving setting on them. They basically allow themselves to be told to operate at a lower performance profile to cut down on power costs. When the CPU gets really busy, the performance may be ramped up; then it ramps back down.

This sounds great in theory, but SQL Server isn’t normally pegging the CPU, so the CPU performance may never ramp all the way up; and when it does, it often ramps up just as the demand period ends and the boost is no longer needed.

The default on most Windows Server installations is Balanced mode; this means that out of the box you are allowing Windows to underclock your CPUs and only increase their speed as/when/if needed.

Put another way: you’ve taken your SQL Server instance, which you license by CPU core nowadays, and forced it to run on lower-powered CPUs than you paid for when buying the server and licensing SQL Server.

What can you do?

Four simple steps really:

  1. See if you are running in Balanced mode. Go to Start, then Run, and type powercfg.cpl (or browse to Control Panel and then Power Options), and see whether you are running in Balanced or High Performance.
  2. If you are running in Balanced, change it to High Performance.
  3. Consider making a group policy for your SQL Server machines to ensure they always run in High Performance mode.
  4. Celebrate that you are getting the performance you expected and paid for.

That’s it. I could pile on more. Over at the Linchpin blog, I put up a blog post a year ago that shows the impact of this setting and how to check it. There are more details and words there – but the important truth is found in this article that you are reading.
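
If you would rather check the active plan without leaving Management Studio, one option is the sketch below; it assumes xp_cmdshell is enabled, which is off by default and has security implications, so treat it as illustrative only:

-- Illustrative only: requires xp_cmdshell, which is disabled by default
-- Shows the active Windows power scheme (e.g. Balanced or High performance)
EXEC xp_cmdshell 'powercfg /getactivescheme';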

My plea is that you will just check this setting yourself and fix it where needed. You won’t see queries suddenly take half the time, but you could get a boost in performance to some degree, sometimes even a 10-30% increase.

If you want to get started with performance analytics and Database Healthcheck of SQL Servers with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Steps to Generate Windows Cluster Log?

Every now and then I sit next to a DBA to learn some of their tricks. One such recent learning is what I plan to share here. It is tough to learn everything all by ourselves, so learning is best done by sharing. There are many incidents with a SQL Server cluster where we need to know what is happening inside the cluster, and the best way to see the history of cluster-related events is to generate a cluster.log file. The event log has events related to failover clustering, but it is not very verbose. Typically, when a SQL resource doesn’t come online, I look at the cluster log.

The command to generate the cluster log depends on the operating system. I am not an expert on clustering, but I generally use these commands to generate the log and check whether there is any glaring error message. In Windows Server 2003, the cluster log was written automatically, but from Windows Server 2008 onwards it needs to be generated on demand. I think it was a good decision by Microsoft not to write this log continuously. Generation of the cluster log is done using the cluster.exe command. Here are the steps:

  • From one of the nodes of the cluster, open a Command Prompt with Administrator rights (Right click > Run As Administrator)
  • The easiest command to generate the log is to type cluster log /g and hit Enter. The /g parameter stands for generate.
  • A file with name cluster.log will be generated and stored in the %windir%\Cluster\Reports directory on each node of the cluster.

Here is the output on my lab machine, which has a four-node cluster. If any of the nodes is not reachable, the command continues with the remaining nodes.

[Image: output of cluster log /g on a four-node cluster]

Here are the files generated in C:\Windows\Cluster\Reports

[Image: cluster.log files generated in C:\Windows\Cluster\Reports]

In later versions, cluster.exe is not available (unless you install the legacy components), and then you have to use the PowerShell method of generating the cluster log. The command required is Get-ClusterLog.

PS C:\Windows\system32> Get-ClusterLog
The term ‘get-clusterlog’ is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:15
+ get-clusterlog <<<<
+ CategoryInfo          : ObjectNotFound: (get-clusterlog:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException

[Image: Get-ClusterLog failing before the FailoverClusters module is imported]

If this is the error, then you need to import the failover cluster module using the “Import-Module FailoverClusters” command. Here is the screenshot. Since we have a four-node cluster, we see cluster.log generated four times (once per node).

[Image: Get-ClusterLog output after Import-Module FailoverClusters]

Here is the syntax of the command, from Get-Help Get-ClusterLog:

SYNTAX
Get-ClusterLog [-InputObject <psobject>] [[-Node] <StringCollection>] [-Cluster <string>] [-Destination <string>] [-TimeSpan <UInt32>] [<CommonParameters>]

You can customize the output based on your needs, but to be very honest, I have never used the additional parameters.

Have you come across a situation where the cluster log has helped you? I am sure more capabilities will be added in the future, and your experience in using it would be of great help to others too. So do let us know if you have done something similar on your servers.

Reference: Pinal Dave (http://blog.sqlauthority.com)