SQL SERVER – What is is_not_trusted in sys.foreign_keys?

My greatest source of learning is my blog. Many of you take the time to ask your toughest questions, and sometimes basic questions that make me think. This interaction creates a source of exploration for me and ultimately translates into my knowledge sharing. Not all questions asked by DBAs are straightforward, and sometimes I have to do some research before answering them. Here is another interesting question that I received via email recently:

Hi Pinal, What is the meaning of is_not_trusted column in sys.foreign_keys?

I would normally point him to Books Online, but if we look at Books Online for sys.foreign_keys, the column description says "FOREIGN KEY constraint has not been verified by the system." That is neither very clear nor easy to understand, so no wonder he asked me this question. Let us learn this using an example:

For simplicity, let us assume we have three tables: EmpMaster, AddressMaster, and EmpAddress, created using the schema below:

CREATE DATABASE SQLAuthDB
GO
USE SQLAuthDB
GO
CREATE TABLE EmpMaster(Emp_id INT PRIMARY KEY, FName VARCHAR(10), LName VARCHAR(10))
GO
CREATE TABLE AddressMaster(AddressId INT PRIMARY KEY, Address VARCHAR(100), Status CHAR(1))
GO
CREATE TABLE EmpAddress(Emp_id INT, AddressId INT)
GO

From the name you might have guessed that EmpAddress is the mapping table, which can hold only values that exist in the master tables. So, to enforce referential integrity, we create foreign keys to prevent violations of the parent-child relationship.

ALTER TABLE dbo.EmpAddress ADD CONSTRAINT
FK_EmpLocation_EmpMaster FOREIGN KEY
(
Emp_id
) REFERENCES dbo.EmpMaster
GO
ALTER TABLE dbo.EmpAddress ADD CONSTRAINT
FK_EmpLocation_AddressMaster FOREIGN KEY
(
AddressId
) REFERENCES dbo.AddressMaster
GO

We now have the basic structure in place. Let us insert one row in each master table, as below.

INSERT INTO EmpMaster VALUES (1, 'Pinal','Dave')
GO
INSERT INTO AddressMaster VALUES (1, 'Bangalore, India', 'A')
GO

Now, if we try to insert the values (1, 2) into the child table, we get an error:

INSERT INTO EmpAddress (Emp_id, AddressId) VALUES (1, 2);

Msg 547, Level 16, State 0, Line 33

The INSERT statement conflicted with the FOREIGN KEY constraint “FK_EmpLocation_AddressMaster”. The conflict occurred in database “SQLAuthDB”, table “dbo.AddressMaster”, column ‘AddressId’.

The statement has been terminated.

This error is expected because 2 is not a valid value for AddressId; it does not exist in the parent table, AddressMaster.

Let’s look at the metadata for the foreign keys in the catalog view sys.foreign_keys:

SELECT name, is_not_trusted FROM sys.foreign_keys

is_not_trusted is 0 by default, which means SQL Server has verified that the child table has no data that is missing from the parent. Now, let's disable the constraint using the command below.

ALTER TABLE [dbo].[EmpAddress] NOCHECK CONSTRAINT [FK_EmpLocation_AddressMaster]

And now insert the value.

INSERT INTO EmpAddress (Emp_id,AddressId) VALUES (1, 2)

This means we now have an inconsistent value in the child table. Executing the query against sys.foreign_keys again confirms this:

SELECT name, is_not_trusted FROM sys.foreign_keys

As highlighted above, a value of 1 in the column means that the system has not verified the consistency of the data. Now, let's try to enable the constraint again.

ALTER TABLE [dbo].[EmpAddress] WITH CHECK CHECK CONSTRAINT [FK_EmpLocation_AddressMaster]
GO

You might say the above looks incorrect, but "CHECK" twice is intentional: WITH CHECK tells SQL Server to validate the existing data, and CHECK CONSTRAINT enables the constraint. That is the correct syntax to enable the constraint. Here is the error one would receive while enabling it, because the inconsistent row is still there:

Msg 547, Level 16, State 0, Line 59

The ALTER TABLE statement conflicted with the FOREIGN KEY constraint “FK_EmpLocation_AddressMaster”. The conflict occurred in database “SQLAuthDB”, table “dbo.AddressMaster”, column ‘AddressId’.

Fix the inconsistent row by modifying the value as below.

UPDATE EmpAddress
SET AddressId = 1
WHERE AddressId = 2

And now we can re-execute the ALTER TABLE command and it should work.
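For completeness, here is the re-run together with a quick check on the catalog view; after the data is fixed, the ALTER TABLE succeeds and is_not_trusted should go back to 0. This is just a small sketch reusing the commands already shown above.

ALTER TABLE [dbo].[EmpAddress] WITH CHECK CHECK CONSTRAINT [FK_EmpLocation_AddressMaster]
GO
-- Verify: is_not_trusted should now be 0 again for FK_EmpLocation_AddressMaster
SELECT name, is_not_trusted FROM sys.foreign_keys
GO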

Once you are done with the above script, you can clean up the database by executing following script:

USE MASTER;
DROP DATABASE SQLAuthDB;

So this was a quick overview of the is_not_trusted column in the sys.foreign_keys catalog view and how it functions. Do let me know if you have ever used this anywhere.
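As a practical takeaway, here is a small utility sketch (my own addition, not part of the original question) that lists every enabled but untrusted foreign key in the current database, so you know which constraints still need to be re-validated with WITH CHECK:

-- List enabled foreign keys that SQL Server has not verified (untrusted)
SELECT OBJECT_SCHEMA_NAME(fk.parent_object_id) AS SchemaName,
OBJECT_NAME(fk.parent_object_id) AS TableName,
fk.name AS ConstraintName
FROM sys.foreign_keys AS fk
WHERE fk.is_not_trusted = 1
AND fk.is_disabled = 0;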

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Iterating over Metadata with BimlScript – Notes from the Field #061

[Note from Pinal]: This is the 36th episode of the Notes from the Field series. One of the common questions I receive every other day is: I have learned BI, but I feel that there is no automation in the BI field; what should I do? How do I go beyond learning BI? How can I fill the gap between BIDS and SSDT? If you have been reading this blog, you know that when it comes to BI I always take help from LinchPin People, who are BI experts. I requested Reeves from LinchPin People to help me answer this unique question.

In this episode of the Notes from the Field series, database expert Reeves Smith explains what Biml is and how it can help you with SQL Server BI. Read the experience of Reeves in his own words.


In previous articles (What is Biml and The Why of Biml), I’ve discussed Business Intelligence Markup Language (Biml) and given some examples of projects that would benefit from its use. I also demonstrated a quick way to get introduced to Biml without learning a lot of C#. As you grow in your understanding of Biml and BimlScript you will want to make your Biml files more dynamic with a programmatic metadata-driven approach. This article will build on the previous articles and add looping with metadata to build more dynamic Biml projects.

Overview

Starting where the previous article left off, I’ll demonstrate how to add a little C# to make a more automated method of metadata retrieval. If you did not walk through the other articles, I would suggest that you review them before you read this one. This article builds on the Biml in Listing 1 and lets you create multiple integration services packages from one Biml file.

Listing 1 – Semi-Automated Common Source Package Biml

Example – Automated Source Packages

This example shows you how to connect to a database and select the metadata to iterate over. Then you can create multiple SQL Server Integration Services (SSIS) packages from one Biml file. The example in the previous article manually updated the metadata and reran each Biml file. This is an effective approach, but you can build on it to create a more robust solution. Take a look at the C# code in Listing 2.

Listing 2 – C# Replacement Code

Replace the first line of code from Listing 1 with the code in Listing 2, which is an excerpt from Listing 3.

  • The first line will add the namespace to the Biml file and enable the calling of functions without a fully qualified name.
  • The second line creates a string variable that contains a connection string of the database with the metadata.
  • The third line creates a connection object (actually an AstOleDbConnection node) that will be used by the ImportDB function.
  • The fourth line creates a result set with the metadata from the connection object. This object contains all of the metadata from the AdventureWorks2012 database. The ImportDB function has three parameters. The first parameter is a schema filter; in the example, the filter is on the Person schema. The second parameter is a table filter and is blank in the example, so it is ignored. The last parameter is a list of ImportOptions, and views have been excluded. To add more import options, use the "|" followed by the additional options. Example of multiple options:

ImportOptions.ExcludeForeignKey | ImportOptions.ExcludeViews
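If you want a feel for the table-level metadata that ImportDB brings back for the Person schema, you can approximate it with plain T-SQL against the same database. This is only an illustration of the metadata involved, not the Biml API itself:

-- Rough T-SQL equivalent of the metadata ImportDB gathers for the Person schema
SELECT s.name AS SchemaName,
t.name AS TableName
FROM sys.tables AS t
INNER JOIN sys.schemas AS s
ON t.schema_id = s.schema_id
WHERE s.name = 'Person';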

The example includes an object that contains all of the metadata that you will loop over. You have to place the loop within the <Packages> node because a Biml file can only have one Packages collection. The foreach loop will create multiple <Package> nodes.

A tableName string was built below the foreach loop to reduce the code that you have to alter from the example in the previous article.

Listing 3 – Complete Code

Note: The code in listing 3 has the ConnectionString truncated to save space.

If you receive the error shown in Figure 1, it means that the tables do not exist in the stage database. To check this, add the Person filter to the table parameter in the ImportDB function and return metadata for only one table. The prior articles used this table, so it should exist.

Figure 1 – Error Message

Summary

This article added some more C# to help you refine your Biml files. I hope you can start to see the power of Biml and how it can help remove the repetitive work from your next Integration Services project.

If you want to get started with BIML with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Identify and Filter In-Memory Optimized Tables – SQL in Sixty Seconds #079

Earlier I had written a blog post, SQL SERVER – Beginning In-Memory OLTP with Sample Example, which covers the basics of working with In-Memory OLTP. Though that post gets you started, one of my colleagues asked me if there was an easier way to identify In-Memory tables when working with SQL Server Management Studio. As a follow-up, I wrote another blog post where I demonstrate the same with images and queries: SQL SERVER – Filter In-Memory OLTP Tables in SSMS.

I have converted the same blog post into a SQL in Sixty Seconds video over here.

Let me know your opinion about it.

Here are a few other blog posts related to this concept, which I have written earlier.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Schema Change Reports – SQL in Sixty Seconds #078

Earlier, I wrote a blog post about Schema Change Reports and Finding Tables Created Last Week – DBA Tip. I received quite a few emails about it; it seems to be quite a popular topic. As per a few requests, I have created a very short 60-second video on this subject.

Let me know your opinion about it.

Here are a few other blog posts related to this concept, which I have written earlier.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Finding Tables Created Last Week – DBA Tip

My conversations with my DBA friends are always interesting. They have unique ways of solving problems, and organizations often take them for granted. Most DBAs want to be in control of their environment, and there is no second thought about it when it comes to delivering from their side.

In a recent conversation, one of my friends asked me if there was a way to identify the tables that were created in a database in the past week. His security team had asked him for a report on when objects are created on the production server; they want to make sure no unnecessary objects get created in the database without prior approval.

Though my first thought was to build an audit for this, I instead sent him a simple script that can be used to identify the same:

-- Objects created or modified in the last 7 days, with schema, type, and dates
SELECT o.name AS [Object_Name],
s.name AS [Schema_Name],
o.type_desc AS [Description],
o.create_date AS [Creation_Date],
o.modify_date AS [Modified_Date]
FROM sys.all_objects o
LEFT OUTER JOIN sys.schemas s
ON o.schema_id = s.schema_id
WHERE o.create_date > (GETDATE() - 7)
OR o.modify_date > (GETDATE() - 7)

A sample output is shown below:

As you can see, the query can be added to a SQL Server Agent job, and the output can be emailed to the DBA or security team automatically. The other option I gave my friend was to build an SSRS report and schedule its output via email to the security team.
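If the security team only cares about newly created user tables, a slightly narrower version of the query can be scheduled instead. The type and is_ms_shipped filters below are my additions to the original script, shown here as a sketch:

-- Only user tables created in the last 7 days, excluding Microsoft-shipped objects
SELECT s.name AS [Schema_Name],
o.name AS [Table_Name],
o.create_date AS [Creation_Date]
FROM sys.objects o
LEFT OUTER JOIN sys.schemas s
ON o.schema_id = s.schema_id
WHERE o.type = 'U'
AND o.is_ms_shipped = 0
AND o.create_date > (GETDATE() - 7)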

If you are not doing such hi-fi stuff as sending scheduled reports, there is a much easier way to find this information: the "Schema Change History" report available inside SQL Server Management Studio.

A typical output is shown above. It is similar to the query output we discussed before, but comes from a built-in, out-of-the-box report. Most of these SQL Server Management Studio tips are simple and hidden in plain sight in some shape or form.

Do let me know if you have ever had such a requirement. What did you do in that case? I would love to hear about your experience and implementation.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Filter In-Memory OLTP Tables in SSMS

Earlier I had written a blog post, SQL SERVER – Beginning In-Memory OLTP with Sample Example, which covers the basics of working with In-Memory OLTP. Though that post gets you started, one of my colleagues asked me if there was an easier way to identify In-Memory tables when working with SQL Server Management Studio. My friend also said their database had thousands of tables, so it would be impossible to rely on a subtle icon difference or anything else that is hard to spot.

This statement got me thinking, and I wanted to give him a few options. On exploring SSMS, you will be surprised that all of these exist right in front of our eyes.

T-SQL Way

The simplest solution is to query the standard catalog view for just the In-Memory tables. The query below outputs only the In-Memory tables present in the database where it is run.

SELECT * FROM sys.tables
WHERE is_memory_optimized = 1
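If you also want to know which schema each table belongs to and how it is persisted, sys.tables exposes a durability description as well. Here is a small extension of the query above (the extra columns are my addition):

-- In-Memory tables with their schema and durability (SCHEMA_AND_DATA or SCHEMA_ONLY)
SELECT SCHEMA_NAME(t.schema_id) AS SchemaName,
t.name AS TableName,
t.durability_desc
FROM sys.tables AS t
WHERE t.is_memory_optimized = 1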

Though these age-old methods exist, they still don't help much when it comes to visualizing the same in SSMS.

Using Object Explorer

The second method is to use Object Explorer -> Filter Settings. In SQL Server Management Studio for SQL Server 2014, we get a nice little option to filter based on In-Memory tables. Right-click the Tables node and select Filter Settings. In this dialog we have additional settings that are useful.

Using Object Explorer Details

The third and simplest way is to use Object Explorer Details (F7) inside SQL Server 2014 SSMS. Right-click the column header to add the "Memory Optimized" column, which shows True or False values. Sort on this column and all the In-Memory optimized tables appear at the top.

I thought these were simple yet powerful tips for working with these special tables inside SQL Server 2014. Do let me know if you have any other method to filter In-Memory optimized tables inside SSMS.

Reference: Pinal Dave (http://blog.sqlauthority.com)

Interview Question of the Week #001 – Script to List Foreign Key Relationships and Constraint Name

Two years ago I wrote the book SQL Server Interview Questions and Answers with my very close friend Vinod Kumar. It is a very popular book of mine and has sold many thousands of copies so far. Every single day I still get quite a few different interview questions in email, and users want me to answer them. I have been collecting them for quite a while, and in this new year I am starting this new series, which will take one interview question sent by readers and try to answer it in short and simple words. The goal of this series is to build knowledge on the subject. If you want to pass any interview, you should know much more than these questions and answers; you should have real-world experience. Read this blog post over here before you continue reading this story.

Well, let us start this series. In this week’s episode we will try to answer the following question:

Question: Write a script to display foreign key relationships and the name of the constraint for each table in the database, for SQL Server 2005 and later versions.

Answer: Here is a script that displays the foreign key relationships and the name of the constraint for each table in the database.

SQL Server 2005 and later versions

SELECT f.name AS ForeignKey,
OBJECT_NAME(f.parent_object_id) AS TableName,
COL_NAME(fc.parent_object_id,
fc.parent_column_id) AS ColumnName,
OBJECT_NAME (f.referenced_object_id) AS ReferenceTableName,
COL_NAME(fc.referenced_object_id,
fc.referenced_column_id) AS ReferenceColumnName
FROM sys.foreign_keys AS f
INNER JOIN sys.foreign_key_columns AS fc
ON f.OBJECT_ID = fc.constraint_object_id

Reference Link

If you execute the above query against the AdventureWorks sample database, it will give the following result set.
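If your database spreads tables across multiple schemas, it helps to include the schema names too. Here is a small variation of the script above; the two schema columns are my addition:

SELECT OBJECT_SCHEMA_NAME(f.parent_object_id) AS TableSchemaName,
f.name AS ForeignKey,
OBJECT_NAME(f.parent_object_id) AS TableName,
COL_NAME(fc.parent_object_id, fc.parent_column_id) AS ColumnName,
OBJECT_SCHEMA_NAME(f.referenced_object_id) AS ReferenceSchemaName,
OBJECT_NAME(f.referenced_object_id) AS ReferenceTableName,
COL_NAME(fc.referenced_object_id, fc.referenced_column_id) AS ReferenceColumnName
FROM sys.foreign_keys AS f
INNER JOIN sys.foreign_key_columns AS fc
ON f.object_id = fc.constraint_object_id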

SQL Server 2000

SELECT
FK_Table = FK.TABLE_NAME,
FK_Column = CU.COLUMN_NAME,
PK_Table = PK.TABLE_NAME,
PK_Column = PT.COLUMN_NAME,
Constraint_Name = C.CONSTRAINT_NAME
FROM INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS C
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS FK ON C.CONSTRAINT_NAME = FK.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS PK ON C.UNIQUE_CONSTRAINT_NAME = PK.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE CU ON C.CONSTRAINT_NAME = CU.CONSTRAINT_NAME
INNER JOIN (
SELECT i1.TABLE_NAME, i2.COLUMN_NAME
FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS i1
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE i2 ON i1.CONSTRAINT_NAME = i2.CONSTRAINT_NAME
WHERE i1.CONSTRAINT_TYPE = 'PRIMARY KEY'
) PT ON PT.TABLE_NAME = PK.TABLE_NAME
---- optional:
ORDER BY
1,2,3,4

Reference Link

If you have a better answer, please leave a comment and I will include the answer with due credit.

Reference: Pinal Dave (http://blog.sqlauthority.com)