SQL SERVER – FIX – Property IsLocked is Not Available for Login

I play and experiment with SQL Server to the max. Sometimes I am not sure what I did, and I land in something undesired. Then the troubleshooting starts, and I get to learn something new. Sometimes these learnings are based on error messages that you folks send me. Recently I faced the error below while opening the properties of the sa account in SQL Server Management Studio.


Here is the text of the error message.

Property IsLocked is not available for Login ‘[sa]’. This property may not exist for this object, or may not be retrievable due to insufficient access rights. (Microsoft.SqlServer.Smo)

It was clear that something was wrong with the SQL login, but interestingly it was NOT happening with all other SQL login accounts. I ran Profiler to find out what was happening in the background to cause this error. I nailed it down to the fact that the query below was returning NULL instead of 0 or 1.

SELECT LOGINPROPERTY('sa', 'IsLocked');


Now, it was interesting to find out why it was returning NULL. Later I realized that it happens only with those SQL logins which have the “Enforce password policy” option enabled. Here is a query to find such accounts.

SELECT name
FROM   sys.sql_logins
WHERE  is_policy_checked = 1;
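
For a closer look at those accounts, the same LOGINPROPERTY function that SSMS relies on can report the related lock properties directly. This is a sketch using documented property names; on an affected login these may all come back NULL until the domain controller is reachable, just like IsLocked above:

```sql
-- Inspect lock-related properties of all policy-checked logins
SELECT name,
       LOGINPROPERTY(name, 'IsLocked')            AS IsLocked,
       LOGINPROPERTY(name, 'LockoutTime')         AS LockoutTime,
       LOGINPROPERTY(name, 'BadPasswordCount')    AS BadPasswordCount,
       LOGINPROPERTY(name, 'PasswordLastSetTime') AS PasswordLastSetTime
FROM sys.sql_logins
WHERE is_policy_checked = 1;
```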

A quick solution would be to disable the password policy check for such an account using the T-SQL below.

USE [master]
GO
ALTER LOGIN [sa] WITH CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
GO

A few things to check if you run into this issue regularly and don’t want to disable the password policy:

  1. Check whether the SQL Server service account (the account running the SQL Server service in services.msc) is locked out. To verify that, try logging in to some machine using the service account.
  2. Check whether password policy validation is failing due to a domain controller connectivity issue.

If it’s the first issue, you need to contact the domain administrator to unlock the service account. For the second issue, you can check the security ring buffers with the query below.

SELECT DATEADD(ms, dorf.[timestamp] - dosi.ms_ticks, GETDATE()) AS Date_Time,
       CAST(record AS XML).value('(//SPID)[1]', 'bigint') AS SPID,
       CAST(record AS XML).value('(//ErrorCode)[1]', 'varchar(255)') AS Error_Code,
       CAST(record AS XML).value('(//CallingAPIName)[1]', 'varchar(255)') AS CallingAPIName,
       CAST(record AS XML).value('(//APIName)[1]', 'varchar(255)') AS WindowsAPIName
FROM sys.dm_os_ring_buffers dorf
CROSS JOIN sys.dm_os_sys_info dosi
WHERE dorf.ring_buffer_type = 'RING_BUFFER_SECURITY_ERROR'
ORDER BY dorf.[timestamp] ASC;

In my case it was the first issue, so unlocking the account fixed it for me. Have you ever faced such an issue? What solution worked for you? Let me know in the comments.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – What are T-SQL Median? – Notes from the Field #090

[Note from Pinal]: In this episode of the Notes from the Field series database expert Kathi Kellenberger explains about T-SQL Median. Kathi is an amazing instructor; she was the SQL author I read in my early career. One reason I love SQL Server is that her writing instilled a love for this technology in me. Today she brings a relatively interesting topic for database experts. Read the experience of Kathi in her own words.


SQL Server has a wealth of useful T-SQL functions that let you do just about anything you need to do. One function that is sorely missing is a MEDIAN function. A median is the middle value in a set of values. For example, 3 is the median of the set 1, 3, 5. If the set has an even number of values, the median is calculated, or “interpolated,” by averaging the two middle values. The median of the set 1, 3, 5, 7 is 4.

You will see median values used frequently even if you do not work with statistics yourself. For example, you could see a news story listing the median household income of a city by neighborhood, the median salary for developers by language, or the median age of a population.

As I mentioned, there is no MEDIAN function in T-SQL. If, however, you are fortunate enough to be running SQL Server 2012 or 2014, there is an easy workaround. In 2012, Microsoft introduced a new function called PERCENTILE_CONT, one of the T-SQL window functions. Given a percent rank and a set of values, PERCENTILE_CONT will return the value ranked at that percent. If there is not an exact value found at the rank, PERCENTILE_CONT will interpolate the answer instead. If you supply 0.5, meaning 50%, PERCENTILE_CONT will return the median value. Here is an example that calculates the median sales by month for each year:

WITH Totals AS (
    SELECT YEAR(OrderDate) AS OrderYear,
           MONTH(OrderDate) AS OrderMonth,
           SUM(TotalDue) AS TotalSales
    FROM Sales.SalesOrderHeader
    GROUP BY YEAR(OrderDate), MONTH(OrderDate)
)
SELECT OrderYear, OrderMonth, TotalSales,
       PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY TotalSales)
           OVER (PARTITION BY Totals.OrderYear) AS MedianSales
FROM Totals
ORDER BY OrderYear, TotalSales;

Before we take a look at the results, let’s review the query. The query has a common table expression (CTE) that calculates the sales for each month and year. In the outer query, the PERCENTILE_CONT function is used with an argument of 0.5. PERCENTILE_CONT is a T-SQL window function, but its syntax, and that of the similar function PERCENTILE_DISC, is different from the other window functions. The function name is followed by the WITHIN GROUP clause. This is where the ORDER BY expression goes, not in the OVER clause. The ORDER BY expression is used by the function to line up the values so that the ranks can be compared to the percentile rank value provided as an argument. The OVER clause supports PARTITION BY, and in this case we would like to get the median within each year, so the PARTITION BY expression is OrderYear.

In 2005, there were orders placed between July and December: six months. Six is even, so the function calculates the average of the two middle months, as shown in Figure 1.


Figure 1: The Median Sales for 2005

Every year in the data has an even number of months except for 2008. In 2008, there are seven months with sales, so the function can find an exact median value. Figure 2 shows the 2008 sales:


Figure 2: The Median Sales for 2008

You may be wondering what the difference is between PERCENTILE_CONT and PERCENTILE_DISC. PERCENTILE_DISC always returns an exact value found in the set, never a calculation. For calculating the median, PERCENTILE_CONT is the function to use.
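
A minimal sketch of the difference (my own illustrative query, not from the article): for the even-sized set 1, 3, 5, 7, PERCENTILE_CONT interpolates between the two middle values while PERCENTILE_DISC picks an actual member of the set.

```sql
SELECT DISTINCT
    PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY n) OVER () AS MedianCont, -- interpolated (4.0)
    PERCENTILE_DISC(0.5) WITHIN GROUP (ORDER BY n) OVER () AS MedianDisc  -- an existing value (3)
FROM (VALUES (1), (3), (5), (7)) AS t(n);
```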

If you would like to learn more about T-SQL window functions, be sure to check out my latest book Expert T-SQL Window Functions in SQL Server or my Pluralsight course T-SQL Window Functions.

If you want to get started with SQL Server with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Generating Meaningful Test Data with dbForge Data Generator for SQL Server

Recently I was faced with the necessity of generating meaningful SQL data for testing purposes. During my search for a proper application for the task, I found that Devart, a recognized developer of database solutions, had released dbForge Data Generator for SQL Server, a GUI tool for fast generation of meaningful SQL data for development and testing purposes. Devart stated that with the tool I would be able to:

  • Populate SQL Server tables with millions of rows of test data
  • Generate meaningful data that looks just like real data
  • Use 100+ predefined generators with sensible configuration options
  • Customize built-in generators
  • Create user-defined generators
  • Preview generation results
  • Save data population script for future reference
  • Automate data generation with the command-line interface

It sounded great, so I decided to give the tool a quick test. Moreover, Devart offers a 30-day free trial for most of its products, including Data Generator for SQL Server.

The first thing I noticed after running Data Generator was its clear, user-friendly interface. The start page of the tool includes quick access to the main features and lets you quickly open recent projects, if any.


After starting a new data generation, I was prompted to select a connection and database as well as to set default data generation options. I decided to generate meaningful data for a few columns of the AdventureWorksTest database.


On the Options page, I chose to generate 2,000 rows and cleared the Include Null values option.


After that, Data Generator showed the generation document. I selected the following columns of the Person.Contact table for data population:

  • ContactID
  • Title
  • FirstName
  • LastName
  • EmailAddress

Now, the time has come for selecting generators for each column. In dbForge Data Generator for SQL Server, all generators are split into several categories for easier look-up.


I was really impressed with the collection of generators the application offers. The tool automatically detects and offers appropriate generators for a given column. Also, Data Generator for SQL Server includes multiple ways to customize the generated data. For instance, you can:

  • Set the range of generated values for different SQL data types
  • Define the percentage of NULLs to be generated
  • Set the value distribution mode
  • Immediately preview data that will be generated

Data Generator suggested the following generators:

  • IDs for the ContactID column. Additionally, I selected the Random by timestamp value distribution mode to get more complex numbers.
  • Honorific Prefix (Title) for the Title column
  • First Name (male) for the FirstName column
  • Last Name for the LastName column
  • Email for the EmailAddress column. This generator is based on the regular expression \c{3,8}(\d{0,3}|.\c{3,10})@nowhere.com, which can be customized right in the Fill settings section of the Column generation settings. For instance, you can modify the regular expression so that the domain name would be “mailservice.com” instead of “nowhere.com”.


Data Generator also lets you modify existing generators and create your own, so in the future you won’t need to modify the same generator over and over again. I opened the Email.xml file, located at C:\Users\Public\Documents\Devart\dbForge Data Generator for SQL Server\Data Generators, modified the regular expression so that the domain name would consist of a combination of letters, updated the description and name, and saved the file as MyEmail.xml:


The following screenshot shows the data generation document with the result of my manipulations:

I was quite satisfied with the results shown in Preview, so I was ready to populate the target database with data. Data Generator offers several ways to do that, namely:

  • Open the data population script in the internal editor
  • Save the SQL script to a file
  • Run the script automatically


I chose the first option, and in a few seconds I got the following population script:


To sum up, I would like to say that the tool has done a fantastic job for me. In my opinion, the strongest points of Devart’s Data Generator are:

  1. Simple workflow in a convenient interface. With dbForge Data Generator, the process takes just a couple of minutes and is basically simplified to three steps:
  • Selecting the required tables
  • Selecting and customizing a required generator
  • Populating the tables with test data
  2. Impressive collection of generators. The tool includes over 100 generators that let you populate tables with realistic-looking test data.
  3. Flexible customization options. The developers have put a lot of effort into making the data customizable to suit specific needs.

I strongly recommend giving dbForge Data Generator for SQL Server a try and evaluating its functionality on your own by downloading the 30-day free trial.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – FIX – Server principal ‘Login Name’ has granted one or more permission(s). Revoke the permission(s) before dropping the server principal

There are parts of SQL Server where I always like to take expert advice. Sometimes a quick call to a friend can save you tons of time. When it comes to topics around AlwaysOn, though I know the basic steps for building and working with it, hardcore troubleshooting is not my forte per se. But when I encounter questions around it, I put on my learning hat to explore the possible reasons. There was a mail from one of the readers which had been in my Inbox for a couple of weeks, and I wanted to find the answer. His question goes like this:

While working with an AlwaysOn availability group and cleaning it up, I am encountering the below error while dropping a login.

DROP LOGIN Pinal
GO

Msg 15173, Level 16, State 1, Line 1
Server principal ‘Pinal’ has granted one or more permission(s). Revoke the permission(s) before dropping the server principal.


Trying from the UI also gives the same error:


The next task was to find what was causing the error, so I wrote the queries below, which show the exact dependent objects that are making our DROP command fail.

SELECT class_desc, *
FROM sys.server_permissions
WHERE grantor_principal_id = (
    SELECT principal_id
    FROM sys.server_principals
    WHERE name = N'Pinal');

SELECT name, type_desc
FROM sys.server_principals
WHERE principal_id IN (
    SELECT grantee_principal_id
    FROM sys.server_permissions
    WHERE grantor_principal_id = (
        SELECT principal_id
        FROM sys.server_principals
        WHERE name = N'Pinal'));


Based on the output, there is an endpoint on which Pinal has granted permission. The second result shows that I had used my account, i.e. Pinal, to create the endpoint and grant permission to the AppUser account using the command below:

CREATE ENDPOINT [hadr_endpoint]
    STATE = STARTED
    AS TCP (LISTENER_PORT = 5023, LISTENER_IP = ALL)
    FOR DATA_MIRRORING (ROLE = ALL,
        AUTHENTICATION = WINDOWS NEGOTIATE,
        ENCRYPTION = REQUIRED ALGORITHM AES)
GO
GRANT CONNECT ON ENDPOINT::[hadr_endpoint] TO AppUser
GO

As the error message suggested, I revoked permission using the below command:

REVOKE CONNECT ON ENDPOINT::[hadr_endpoint] TO AppUser
GO

After the above step, I was able to drop the login. Yes, this was a great learning for me today. Do let me know if you have ever encountered similar errors in your environments.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Fix – Error: Msg 468, Level 16, State 9, Line 1

Ever since I started writing about error messages, a number of you ping me from time to time to understand why you are getting a particular error. As I always say, error messages are a great way to learn concepts, and we become better if we understand the reason behind every error we get. As I scan through almost every single mail that comes in, some of them catch my attention, and I tend to write the experience back as a blog post.

Last week, a fresher who was working for a European client wrote to me with the error mentioned below:

Msg 468, Level 16, State 9, Line 15
Cannot resolve the collation conflict between “SQL_Latin1_General_CP1_CI_AS” and “French_CS_AI” in the equal to operation.

Though the error looked simple and self-explanatory, this kid wanted to mimic the error in his environment. He also added that he was using temp tables when this error surfaced. This statement got me interested, and I wanted to show him with a simple script how this error can be reproduced.

Let me start by checking the server collation:

-- Server Collation
SELECT SERVERPROPERTY('Collation')

--------------------------------

SQL_Latin1_General_CP1_CI_AS

As you can see, I am using the default collation on my server at the moment. Now let us create a database with a different collation, French for example. We will also create two tables after that: the first will be a normal table, while the second will be a temp table.

-- Create database with different collation
CREATE DATABASE [DB_Not_Contained]
CONTAINMENT
= NONE
COLLATE French_CS_AI
GO

-- Create 2 tables with 1 being a temp table so it goes to tempdb and uses the server collation and not the database collation
USE [DB_Not_Contained]
GO
CREATE TABLE [DemoCollation]
(DemoCollationNM VARCHAR(100))
GO
CREATE TABLE #DemoCollation
(DemoCollationNM VARCHAR(100))

-- Insert records into both tables
INSERT dbo.DemoCollation
(DemoCollationNM)
VALUES ('Test Join');
INSERT #DemoCollation
(DemoCollationNM)
VALUES ('Test Join');

Now that the basic data setup is done, let us make a simple join between these two tables.

-- Now query and try and join both tables having 2 different collations
SELECT p.DemoCollationNM
FROM dbo.DemoCollation p
INNER JOIN #DemoCollation d ON
p.DemoCollationNM = d.DemoCollationNM

Woot! We receive the error message:

Msg 468, Level 16, State 9, Line 35
Cannot resolve the collation conflict between “SQL_Latin1_General_CP1_CI_AS” and “French_CS_AI” in the equal to operation.

Whenever you encounter this error message, go ahead and check the collations of the tables in question to confirm whether they differ.
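
A quick sketch of how to compare them using the catalog views. Temp tables live in tempdb, so the temp table's column has to be looked up there:

```sql
-- Collation of the permanent table's column
SELECT c.name, c.collation_name
FROM sys.columns c
WHERE c.object_id = OBJECT_ID('dbo.DemoCollation');

-- Collation of the temp table's column (resolved via tempdb)
SELECT c.name, c.collation_name
FROM tempdb.sys.columns c
WHERE c.object_id = OBJECT_ID('tempdb..#DemoCollation');
```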


So what are the learnings from these simple steps?

  • We cannot join tables or columns when there is a collation conflict between the objects
  • A temp table uses the same collation as our server (tempdb) by default, and so do the objects created in it

I know I have given you a hint on how you might potentially mitigate this, but let me see if anyone can give me the solution to this problem. Have you ever encountered this error, and how did you solve it?

Reference: Pinal Dave (http://blog.sqlauthority.com)

Interview Question of the Week #028 – Few SQL Best Practices

Just last week, I moderated a SQL Server interview and heard a very interesting conversation.

Question – What best practices are you following for SQL Server?

When I heard this question, I thought the candidate would answer with many different things, but he was stunned and had little to add to the conversation. That reminded me of one of my older articles, which I believe still adds quite a lot of value to this conversation. Here is the article as an answer to the question.

Answer – 

1. Know your data and business application well.
Familiarize yourself with these sources; you must be aware of the data volume and distribution in your database.

2. Test your queries with realistic data.
A SQL statement tested with unrealistic data may behave differently when used in production. To ensure rigorous testing, the data distribution in the test environment must also closely resemble that in the production environment.

3. Write identical SQL statements in your applications.
Take full advantage of stored procedures and functions wherever possible. The benefit is a performance gain, as they are precompiled.

4. Use indexes on the tables carefully.
Be sure to create all the necessary indexes on the tables. However, too many of them can degrade performance.

5. Make an indexed path available.
To take advantage of indexes, write your SQL in such a manner that an indexed path is available to it. Using SQL hints is one of the ways to ensure the index is used.

6. Understand the Optimizer.
Understand how the optimizer uses indexes and handles the WHERE, ORDER BY, and HAVING clauses.

7. Think globally when acting locally.
Any changes you make in the database to tune one SQL statement may affect the performance of other statements used by applications and users.

8. The WHERE clause is crucial.
The following WHERE clauses would not use the index access path even if an index is available.
For example: Table1Col1 (comparison operator such as >, >=, <=) Table1Col2; Table1Col1 IS (NOT) NULL; Table1Col1 NOT IN (value1, value2); Table1Col1 != expression; Table1Col1 LIKE '%pattern%'; NOT EXISTS subquery.
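
As a sketch of the idea (hypothetical table and column names), rewriting a predicate into a sargable form can make an index on the column usable again:

```sql
-- Not sargable: a leading wildcard forces a scan of the index or table
SELECT * FROM dbo.Table1 WHERE Col1 LIKE '%pattern%';

-- Sargable: a trailing wildcard lets the optimizer seek on an index on Col1
SELECT * FROM dbo.Table1 WHERE Col1 LIKE 'pattern%';
```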

9. Use WHERE instead of HAVING for record filtering.
Avoid using the HAVING clause along with GROUP BY on an indexed column.
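
For example, these two queries return the same result, but the WHERE version filters rows before grouping (AdventureWorks-style names used for illustration):

```sql
-- Prefer this: rows are filtered before the GROUP BY
SELECT CustomerID, SUM(TotalDue) AS Total
FROM Sales.SalesOrderHeader
WHERE CustomerID = 11000
GROUP BY CustomerID;

-- Avoid this: all groups are built first, then filtered
SELECT CustomerID, SUM(TotalDue) AS Total
FROM Sales.SalesOrderHeader
GROUP BY CustomerID
HAVING CustomerID = 11000;
```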

10. Specify the leading index columns in WHERE clauses.
For a composite index, the query would use the index as long as the leading column of the index is specified in the WHERE clause.
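
A sketch with a hypothetical composite index on (LastName, FirstName):

```sql
-- Can seek on the composite index: the leading column LastName is filtered
SELECT * FROM dbo.Contacts WHERE LastName = 'Smith' AND FirstName = 'John';
SELECT * FROM dbo.Contacts WHERE LastName = 'Smith';

-- Cannot seek on the composite index: the leading column is missing
SELECT * FROM dbo.Contacts WHERE FirstName = 'John';
```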

11. Evaluate index scan vs. full table scan. (Index Only Searches Vs Large Table Scan, Minimize Table Passes)
If selecting more than 15 percent of the rows from a table, full table scan is usually faster than an index access path. An index is also not used if SQL Server has to perform implicit data conversion. When the percentage of table rows accessed is 15 percent or less, an index scan will work better because it results in multiple logical reads per row accessed, whereas a full table scan can read all the rows in a block in one logical read.

12. Use ORDER BY for index scan.
The SQL Server optimizer will use an index scan if the ORDER BY clause is on an indexed column. The following query illustrates this point.
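
The query referenced here did not survive in this copy of the article; here is a minimal sketch of the idea, assuming an index exists on OrderDate:

```sql
-- With an index on OrderDate, the optimizer can return rows
-- in index order instead of performing a separate sort step
SELECT SalesOrderID, OrderDate
FROM Sales.SalesOrderHeader
ORDER BY OrderDate;
```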

13. Minimize table passes.
Usually, reducing the number of table passes in a SQL query results in better performance. Queries with fewer table passes mean faster queries.

14. Join tables in the proper order.
Always perform the most restrictive search first to filter out the maximum number of rows in the early phases of a multiple table join. This way, the optimizer will have to work with fewer rows in the subsequent phases of join, improving performance.

15. Redundancy is good in the WHERE condition.
Provide as much information as possible in the WHERE clause. It will help the optimizer infer the conditions clearly.

16. Keep it simple, stupid.
Very complex SQL statements can overwhelm the optimizer; sometimes writing multiple, simpler SQL will yield better performance than a single complex SQL statement.

17. You can reach the same destination in different ways.
Each SQL may use a different access path and may perform differently.

18. Reduce network traffic and increase throughput.
Using T-SQL blocks instead of multiple individual SQL statements can achieve better performance as well as reduce network traffic. Stored procedures are better than T-SQL blocks, as they are stored in SQL Server and precompiled.
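
As a sketch, a parameterized stored procedure (hypothetical names) replaces repeated ad-hoc statements sent over the network with one short call:

```sql
CREATE PROCEDURE dbo.GetOrdersByCustomer
    @CustomerID INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT SalesOrderID, OrderDate, TotalDue
    FROM Sales.SalesOrderHeader
    WHERE CustomerID = @CustomerID;
END
GO

-- One short call instead of resending the whole statement text:
EXEC dbo.GetOrdersByCustomer @CustomerID = 11000;
```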

19. Better Hardware.
Better hardware always helps performance. SCSI drives, a RAID 10 array, multiprocessor CPUs, and a 64-bit operating system improve performance by a great amount.

20. Avoid Cursors.
Using SQL Server cursors can result in some performance degradation in comparison with SELECT statements. Try to use a correlated subquery or derived tables if you need to perform row-by-row operations.
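
For instance, a single set-based UPDATE (hypothetical table and column names) replaces a cursor that loops row by row:

```sql
-- Cursor-free, set-based equivalent of a row-by-row price update
UPDATE p
SET p.Price = p.Price * 1.10
FROM dbo.Products AS p
WHERE p.CategoryID = 1;
```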

Reference: Pinal Dave (http://blog.sqlauthority.com)

PowerShell: How to Find Logical and Physical Processors on SQL Server?

Working on PowerShell scripts has been an interesting journey. I have written a couple of posts before on using WMI events. I was pondering on this thought, and this blog is inspired by combining WMI with PowerShell. So what are we trying to solve?

When SQL Server changed its whole licensing model from processor-based to core-based licensing, a lot of us were confused about how this has to be calculated. Recently, one of my DBA friends asked: how can I find the number of logical processors (cores) on my SQL Server machine? Now this question got me thinking, and the first place I always go to search is SQLAuthority – trust me on this. Once I figured out there isn’t much on this topic, I thought I would write a simple script which I can refer to in the future.

$Computer = 'localhost'
$ErrorActionPreference = 'SilentlyContinue'
$Error.Clear()
# Query WMI for per-socket core and logical processor counts
$ProcessorConfig = Get-WmiObject -Class Win32_Processor -ComputerName $Computer -Namespace root\CIMV2 |
    Select-Object PSComputerName, Name, NumberOfCores, NumberOfLogicalProcessors
Write-Debug $Error.Count
If ($Error.Count -gt 0)
{
    # Build a placeholder object when the machine cannot be reached
    $ProcessorConfig = New-Object psobject
    $ProcessorConfig | Add-Member -Type NoteProperty -Name ComputerName -Value "$Computer - failed to connect"
    $ProcessorConfig | Add-Member -Type NoteProperty -Name Name -Value 'Unable to get ProcessorInfo'
    $ProcessorConfig | Add-Member -Type NoteProperty -Name NumberOfCores -Value $null
    $ProcessorConfig | Add-Member -Type NoteProperty -Name NumberOfLogicalProcessors -Value $null
    Write-Debug "Failed to connect to $Computer"
}
$ErrorActionPreference = 'Continue'
$ProcessorConfig | Format-Table * -AutoSize

When you run the above script inside PowerShell, you will see output like below:


As you can see, I have a 4-core processor with HT (Hyper-Threading) enabled, which brings the number of logical processors to 8. I personally felt this was a quick way to learn PowerShell when you have a typical problem at hand.
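
If you prefer to stay inside SQL Server, a similar sketch can be written in T-SQL against sys.dm_os_sys_info (column availability varies by version; hyperthread_ratio is the number of logical CPUs per physical socket):

```sql
SELECT cpu_count                     AS LogicalProcessors,
       hyperthread_ratio             AS LogicalPerSocket,
       cpu_count / hyperthread_ratio AS PhysicalSockets
FROM sys.dm_os_sys_info;
```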

If you have read this far, please let me know how you have used WMI or PowerShell to solve interesting problems. Let me know over the comments and I will surely try to publish a few more like these in the future too based on your inputs.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Puzzle – DISTINCT Values in NVARCHAR

The technique used in this blog is explained in an earlier blog, SQL SERVER – Creating Dataset Using VALUES Clause Without Creating Table. To read more about collation and Unicode, refer to MSDN. This topic in general is so deep and complicated that you get to learn something new every single time.

While playing with an SSIS package, I was getting a unique constraint error while data was being inserted into SQL Server. Later I dug further and found that it was due to German characters. Here is a simplified version of the problem. Will you be able to crack the puzzle?

You can use SQL Server Management Studio and run the three queries below:

SET NOCOUNT ON
SELECT DISTINCT
columny COLLATE SQL_Latin1_General_CP1_CI_AS AS 'First'
FROM (VALUES ('ß'),('SS') ) tablex(columny) ;
SELECT DISTINCT columny COLLATE Latin1_General_CI_AS AS 'Second'
FROM (VALUES ('ß'),('SS') ) tablex(columny) ;
SELECT DISTINCT CAST(columny AS NVARCHAR) COLLATE SQL_Latin1_General_CP1_CI_AS AS 'Third'
FROM (VALUES (N'ß'),(N'SS') ) tablex(columny) ;

If you don’t have access to a SQL Server handy, here is the sample output:


Do you know the reason why ‘ß’ and ‘SS’ are the same in NVARCHAR? Please comment and let me know. I will share a surprise learning voucher with you.

Hope we will learn something together.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Enhancing Reference Data in Master Data Services – Notes from the Field #089

[Note from Pinal]: This is the 89th episode of the Notes from the Field series. Master Data Services is one of the most important, but very little explored, features of SQL Server. If you have been reading this blog, you know that when it is about BI, I always take help from LinchPin People, who are BI experts. I requested Reeves from LinchPin People to help me answer this unique question.

In this episode of the Notes from the Field series database expert Reeves Smith explains why one should add reference data in Master Data Services to the enterprise. Read the experience of Reeves in his own words.



In a previous post, I talked about adding reference data to Master Data Services and how it can enhance your Enterprise Data Warehouse. Through the Master Data Services add-in for Excel, I added reference data into a Master Data Services model. This model was created with data types that were inferred from the given values that were imported into the model. All of the data types were of basic types like strings or numeric.

In this post, I’m going to update the data model we created in the previous posts and add domain values to the column’s data types. Domain values are a way to restrict a column’s values to a defined list. This also provides a drop down for the column that restricts any free form text.

The steps are:

  • Create two entities that will be used for the domain values
  • Update the Customers entity to use the newly created entity

All of these steps will be performed within Excel using the Master Data Services add-in for Excel.

Using the Demo model, created in the last post, we will update the MaritalStatus column with domain values.

Step 1 – Create Additional Entities

We will start with the MaritalStatus column of the Customers entity and create an additional entity called MaritalStatus. This entity will be used for the domain values in the MaritalStatus column of the Customers entity.

  1. Create an Excel worksheet with the reference data for MaritalStatus and provide a header for each of the columns in the data set. In this example we use Code and Name.


Figure 1 – MaritalStatus Excel worksheet data

  2. Select the data including the headers and click the Create Entity button from the Master Data tab in Excel.
  3. Enter the data for the following items:
    1. My data has headers: Checked
    2. Model: Demo
    3. Version: VERSION_1
    4. New entity name: MaritalStatus
    5. Code: Code
    6. Name (optional): Name


Figure 2 – Create Entity Dialog

  4. Click OK.

Step 2 – Update the Customers Entity

  1. Connect to the Customers Entity.

If you need help connecting to the Customer entity follow the below steps:

  1. From the Master Data tab click the Connect button and connect to the MDS instance with the Demo model.


Figure 3 – Connecting to MDS instance

  2. Select the Demo model and double click on the Customers entity.


Figure 4 – The Master Data Explorer dialog

  2. Select any cell in the MaritalStatus column of the Customers entity.
  3. Click the Attribute Properties button.
  4. Change the Attribute type to Constrained list (Domain-based) and select the MaritalStatus entity for Populate the Attribute with values from:


Figure 5 – Attribute Properties Dialog

Notice that the column looks different as a constrained list: when a cell is selected, a drop-down with the available values is now present. This keeps users from adding values outside of the list.


Figure 6 – MaritalStatus as a Constrained List

Creating Additional Entities

Adding domain values to the other columns would require the same process.

Note: In the current dataset, Education and Occupation do not have code values. One way to solve this is to create an entity where the code and name contain the same values; see Figure 7.


Figure 7 – Education Entity
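To build such a code-equals-name list, the distinct values can be pulled straight from the source data. A sketch, assuming the source rows live in a hypothetical table named dbo.CustomerSource:

```sql
-- Each distinct Education value serves as both Code and Name
SELECT DISTINCT Education AS Code, Education AS Name
FROM dbo.CustomerSource
ORDER BY Code;
```

The result set can be pasted into an Excel worksheet and turned into an entity with the same Create Entity steps used for MaritalStatus.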

Summary

In the previous articles on Master Data Services we added reference data to a Master Data Services model. This data can be maintained through the web interface and the Excel add-in by subject matter experts (SMEs). By creating a robust model, the SMEs are better able to create and maintain quality data.

Hope this helps.

If you want to get started with BIML with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Process ID X was killed by hostname ABC, host process ID Y

Error logs are an awesome place to learn something new every time. They spring up new errors which you might never have seen before. I am a big believer in the idea that one needs to monitor and look at the logs from time to time; it is the best way to mitigate problems before they occur. Recently I noticed the below error in the SQL Server ERRORLOG. I wanted to know the exact meaning of each and every placeholder in the error message, and since I found it, I am sharing it with you.

Error number: 18100

SELECT TEXT, severity, is_event_logged
FROM  sys.messages
WHERE message_id = 18100
AND   language_id = 1033


Process ID %d was killed by hostname %.*ls, host process ID %d.
In this message, %d is a placeholder for a number and %.*ls is a placeholder for a string.

Whenever a KILL command is issued in SQL Server, it is logged into SQL Server Errorlog.

  1. The first placeholder is the SPID which was killed.
  2. The second placeholder is the machine name from where the KILL command was issued. Note that the host name is not guaranteed to be the exact name of the machine; an application can pass its own host name while making a connection.
  3. The third placeholder is the client process ID which issued the KILL command. This can be seen via Task Manager.

If we look at Errorlog, we can also find the SPID which has issued the kill command.

2015-07-14 05:00:00.290 spid52       Process ID 51 was killed by hostname BIGPINAL, host process ID 10044.
Killed by : SPID52
Killed : 51
Host Name – BIGPINAL
PID in Task Manager – 10044
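The message above can be reproduced with two connections. A sketch, assuming the SPID values shown here (your actual SPIDs will differ); note that xp_readerrorlog is undocumented, and you can instead open the log from Management Studio's Log File Viewer.

```sql
-- Session 1: find your own SPID (assume it returns 51)
SELECT @@SPID AS MySpid;

-- Session 2: kill session 1, then search the current error log for the entry
KILL 51;
EXEC sys.xp_readerrorlog 0, 1, N'was killed by hostname';
```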


I have seen a few DBAs put a scheduled job in place to monitor blocking and kill the blocking sessions. In that case you would see the process ID of the SQL Agent. One interesting question which I was not able to answer was asked by a blog reader on http://blog.sqlauthority.com/2007/06/26/sql-server-explanation-and-example-four-part-name/

I am getting messages in errorlog as Process ID 400 was killed by hostname , host process ID 0.
there is no hostname , and process ID 0, when i checked for process id in task manager , 0 is for system idle process.

This made me think, and I spoke to a few local SQL Server experts. They asked me for the exact message in the ERRORLOG, and I got the below as a reply.

2015-07-01 01:22:45.600 spid21s      Process ID 51 was killed by hostname , host process ID 0.

This means that system SPID 21s has killed the process 51 and since this is a system SPID, we would not see host name and host process.
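One way to see which sessions are system SPIDs is to query sys.dm_exec_sessions. A quick sketch: sessions flagged as system processes carry no client host name or host process ID, which matches the empty values in the log entry above.

```sql
-- System sessions have is_user_process = 0 and blank host information
SELECT session_id, login_name, host_name, host_process_id, is_user_process
FROM sys.dm_exec_sessions
WHERE is_user_process = 0;
```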


Can you think of a way to reproduce this behavior and get a similar message in the SQL ERRORLOG where the hostname is empty and the host process ID is zero? If you can share it in the comments, it would be of great help to others too.

Reference: Pinal Dave (http://blog.sqlauthority.com)