SQL SERVER – The Basics of the SSIS Script Task – Notes from the Field #065

[Note from Pinal]: This is a new episode of the Notes from the Field series. SQL Server Integration Services (SSIS) is one of the most essential parts of the entire Business Intelligence (BI) story. It is a platform for data integration and workflow applications.

In this episode of the Notes from the Field series I asked SSIS expert Andy Leonard a very crucial question – what are the basics of the SSIS Script Task, and where do we start with it? Andy was very kind to answer the question and provide plenty of information about how a novice developer can learn SSIS from the beginning and become an expert in the technology.


Learning how to configure and use the SQL Server Integration Services (SSIS) Script Task gives you a great starting point for learning how to use SSIS Control Flow tasks. You’ll find that SSIS Control Flow tasks provide powerful development capabilities that may surprise you. Make no mistake: SSIS is a software development platform. With “SQL Server” included in the name, it is easy for people to mistake SSIS for a database tool or accessory, but Control Flow tasks put that confusion to rest.

SSIS provides several Control Flow tasks, and I use some of them far more often than others.

This article will focus on a basic example of configuring the SSIS Script Task, shown in Figure 1. People often think the SSIS Script Task is just for generating log messages, but let’s look at some of its more powerful capabilities.

Figure 1: SSIS Script Task

The Script Task supports Microsoft Visual C# and Microsoft Visual Basic languages. You can choose which language you will use prior to clicking the Edit Script button. Once you’ve clicked the Edit Script button, you cannot change the Script Language property.

At the heart of the Script Task is the ability to read a variable value from the SSIS package into the script and then write a message out of the Script Task. Because the Script Task can interact with SSIS variables, you can use .Net code to manipulate and respond to variable values.

For the purpose of our demonstration, we will add code to the Script Task to create a message for SSIS logging.

Messaging is a key component of enterprise architecture. Messages are generated (published, raised, produced, etc.) by Senders and received (subscribed, consumed, requested, etc.) by Listeners. Messaging is a complex topic. Entire books are dedicated to the topic.

Add variables to the ReadOnlyVariables and ReadWriteVariables properties by clicking the ellipsis inside the value textbox for each property. When you click the ellipsis, a Select Variables dialog appears, as you see in Figure 2.

Figure 2: Adding Read Only Variables to the Script Task

SSIS variables use a two-part naming convention: <Namespace>::<VariableName>. I added the SSIS variables System::TaskName and System::PackageName. The selected variables are in the System namespace and contain the name of the task and package, respectively.

Click the Edit Script button to open the script editing window as shown in Figure 3:

Figure 3: The Script Task Script Editor

Add the following code beneath the “Add your code here” comment:

Dim sPackageName As String = Dts.Variables("System::PackageName").Value.ToString()
Dim sMsg As String = "I am " & sPackageName
Dts.Events.FireInformation(1001, sPackageName, sMsg, "", 0, True)

Listing 1: VB.Net Code

The first line of the code in Listing 1 creates a VB.Net variable named sPackageName and initializes the value of this variable, setting it to the value of the SSIS Variable named System::PackageName. This is one way to read SSIS Variable values into a Script Task’s script.

The second line of the code in Listing 1 creates a VB.Net variable named sMsg and initializes the value of this variable with the text, “I am” plus the value contained in the sPackageName VB.Net variable.

The third line of the code shown in Listing 1 calls the FireInformation method on the Dts.Events object. FireInformation accepts six arguments:

  1. InformationCode – an integer you can use to identify custom messages sent from the SSIS package.
  2. SubComponent – a string you can use to populate the source of the message.
  3. Description – a string containing the message you wish to send.
  4. HelpFile – a string containing the location of a Help file (should a help file exist).
  5. HelpContext – an integer pointing to the help context (should a help context exist).
  6. FireAgain – a Boolean value indicating whether the event should fire again during the current execution.

My VB code includes a reference to the FireAgain argument. Note: This MSDN post states: “Because firing of an event may be expensive, the run-time engine provides a mechanism for suppressing events that you are not interested in. Every event firing method has a FireAgain parameter. If the value of this variable is false, after the method returns, the caller will not fire this event again for the duration of the current execution.” I understand this to mean that setting FireAgain to False would suppress future FireInformation events in my Script Task, but testing indicates this is not the case. This is important to know because you may opt to use the FireAgain argument to manage when you raise events. I do not use the FireAgain argument for this purpose; I typically manage the decision to raise events with conditional logic surrounding the event-raising code.

A test execution of the package reveals an Information message recorded on the Progress / Execution Results tab as shown in Figure 4.

Figure 4: the Information Message on the Progress / Execution Results Tab

Information messages generated from Script Tasks will appear in SSIS logs and the SSIS Catalog. With this information, you can monitor the progress of SSIS execution while the package is running, or view the historical status of variable values or package messages after the package execution has completed.
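When a package runs from the SSIS Catalog, these informational messages land in the catalog.operation_messages view of the SSISDB database. As a sketch (assuming the default SSISDB catalog database and the documented message_type value of 70 for Information messages), a query like the following retrieves them:

SELECT operation_id, message_time, message
FROM SSISDB.catalog.operation_messages
WHERE message_type = 70 -- 70 = Information
ORDER BY message_time DESC;

The message column will contain the “I am <package name>” text raised by the FireInformation call in Listing 1.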

The SSIS Script Task can accomplish much more than generating log messages. This brief introduction and basic example have demonstrated how to get started configuring and using the SSIS Script Task. As you can see, SSIS Script Tasks give you development capabilities you may not have been aware of.

If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – The DBA goes to MARS – Maintenance, Architecture, Research and Support – Notes from the Field #063

[Note from Pinal]: This is the 63rd episode of the Notes from the Field series. Storytelling is an art very few have mastered. When I requested Stuart to share the ideas behind his famous blog post about MARS, I really had not expected the response I received in this episode of Notes from the Field. It is very critical for any DBA to be proactive in their operations, but are we proactive enough? Well, you need to read this entire episode of Notes from the Field to grasp the concept.

In this episode of the Notes from the Field series database expert Stuart Ainsworth explains about MARS in this very captivating story.


Database administration can be a complex job, particularly in large-scale environments; DBAs are responsible for a variety of tasks, and there’s a wealth of knowledge on how to administer and support SQL Server.  For this post, however, I wanted to focus not on a specific code-related or configuration tip, but rather on a broader perspective of work in general.  As a manager and a consultant, I see a common problem in most SQL Server shops: we focus too much on reacting to problems and not enough on preventing them.

In the past, I’ve blogged about what I call the MARS paradigm: Maintenance, Architecture, Research and Support.  Most system administration tasks can be broadly categorized into one of these four areas, and each of these four areas can be conceptually defined in terms of Actions (Proactive/Reactive) and Results (Delayed/Immediate):

Think about any given day and how much time you spend working in each of the four areas; which ones are given priority?  If you’re like most shops, you probably spend the most time working support cases (“my query is slow”, or “can you show me where to find this information in the database?”) instead of focusing on maintenance or architecture.  A healthy IT shop will find ways to balance the need for immediate reaction against the need for proactive, long-term results; however, the question that needs to be answered is: why do we react, and how do we become more proactive?

Why Database Administration Is Usually Reactive:

In part, the reactive nature of most IT departments is a cultural thing.  Fighting fires can be fun; people who solve problems are usually lauded as heroes, while people who prevent problems often go unnoticed.  Think about it this way: if a database is corrupted and a restore is required, who gets the credit for saving the day?  The DBA who did the restore, or the DBA who set up the backup plan years ago (assuming they’re not the same DBA)?  Administrators who are quick to act are viewed as responsive, even though their fixes are usually built on a foundation that was established invisibly some time ago.

Furthermore, most database administrators are thorough in their work; support issues are usually an opportunity for them to dig into a problem and find the best solution.  Maintenance issues, on the other hand, are addressing problems that don’t yet exist, and therefore don’t yet need a solution.  However, constantly providing support pulls resources away from building a better foundation.  If a DBA is spending time constantly fighting fires, they’re not spending time on monitoring the environment to avoid future problems.   So how do you move from a reactive environment to a proactive environment?

Moving From a Reactive Approach to a Proactive Approach:

First, track your work efforts for some period of time (30 days).  Find a way to categorize your work into one of the four broad areas defined above:

  • Maintenance: efforts needed to maintain system performance and stability
  • Architecture: efforts to add new features, functionality, or capability
  • Research: efforts to learn “something new”
  • Support: efforts to sustain existing functionality or respond to customer requests

Ideally, your work is being tracked in some form of service desk software, but if not, then I’d suggest you start with something simple (like a whiteboard with sticky notes).

Second, figure out a metric for work effort; it could be time spent on an issue, or the number of issues, or some other measure of complexity.  The goal is to have a reasonable method of comparison between categories.  Based on your collection efforts, you should be able to figure out where you’re spending most of your time; if more than 50% of your time is being spent in Support or Research, it’s time to consider becoming more proactive.

Becoming more proactive can be a never-ending journey, but I have a few suggestions on where to start:

  1. Prioritize accordingly. Not every support case deserves to be treated as your top priority; make sure that maintenance and architecture issues are included in your escalation plans.
  2. Set it, but don’t forget it. SQL Server offers a lot of tools for automating maintenance; make sure that you allocate time to review successful executions as well as failures.  Understanding when a system is performing outside of spec is nearly as critical as being notified when it fails.
  3. Communicate with interested parties. Let your manager and your peers know about your maintenance and architecture efforts; the best metric would be a reduction in support calls and cases.

If you want to get started with SQL Server with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – The Easy Functions of SQL Server – Notes from the Field #062

[Note from Pinal]: In this episode of the Notes from the Field series database expert Kathi Kellenberger explains some easy and amazing functions of SQL Server. Kathi is an amazing instructor; she was the SQL author I read in my early career. One reason I love SQL Server is that her writing instilled a love for this technology in me. Today she brings a relatively unknown topic for database experts. Read the experience of Kathi in her own words.


Have you ever tried to format a datetime value in T-SQL? I have seen some pretty crazy code that looked something like this:

DECLARE @date DATETIME = GETDATE();
SELECT CAST(YEAR(@date) AS CHAR(4)) + '/' +
RIGHT('0' + CAST(MONTH(@date) AS VARCHAR(2)),2) + '/' +
RIGHT('0' + CAST(DAY(@date) AS VARCHAR(2)),2) + ' ' +
RIGHT('0' + CAST(DATEPART(HOUR,@date) AS VARCHAR(2)),2) + ':' +
RIGHT('0' + CAST(DATEPART(MINUTE,@date) AS VARCHAR(2)),2);

Starting with SQL Server 2012, there is an even easier method: the FORMAT function. The FORMAT function can do the same thing as the previous code, but it is so much easier to write. The FORMAT function is one of five functions added with SQL Server 2012 that I like to call The Easy Functions. The Easy Functions are listed in the table below.

The Easy Functions

  • Logical – CHOOSE: selects one expression from a list
  • Logical – IIF: inline IF; evaluates an expression and substitutes one of two expressions for true or false
  • String – CONCAT: adds strings together
  • String – FORMAT: formats a value, including “C” for currency and “P” for percent
  • Date – EOMONTH: returns the last day of the month
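For comparison, the date-formatting code shown at the top of this post collapses to a single FORMAT call. This is a sketch; the format string mirrors the yyyy/MM/dd HH:mm layout the CAST-and-RIGHT version produced:

DECLARE @date DATETIME = GETDATE();
SELECT FORMAT(@date, 'yyyy/MM/dd HH:mm');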

The following script demonstrates the logical functions.

USE AdventureWorks2014; --Or 2012
GO
SELECT  CustomerID, COUNT(*) AS OrderCount,
CHOOSE(COUNT(*), 'Bronze','Silver','Gold','Platinum') AS MemberLevel,
AVG(TotalDue) AS AvgOrder,
IIF(AVG(TotalDue) > 1000, 'High','Low') AS Priority
FROM Sales.SalesOrderHeader
WHERE CustomerID IN (11060, 11062, 11065, 11068, 11070, 12165)
GROUP BY CustomerID;

The query returns the count of orders for several customers. A MemberLevel is assigned based on the number of orders the customer has placed, and a Priority is assigned based on the average order. If the average order is over $1000, then the customer has a high priority.

Take a look at the CHOOSE expression. The first argument is an integer, the count of the orders. After the order count, there is a list of values. The list of values is actually a 1-based array, and the first argument is an index into the array.

The IIF function requires three arguments. The first argument is an expression to check, the average order. The second argument is a value to return if the expression is TRUE. The third argument is a value to return for FALSE.
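A few standalone calls make the CHOOSE and IIF semantics easy to verify on any SQL Server 2012 or later instance:

SELECT CHOOSE(2, 'Bronze', 'Silver', 'Gold', 'Platinum'); -- Silver
SELECT CHOOSE(9, 'Bronze', 'Silver', 'Gold', 'Platinum'); -- NULL: index past the end of the list
SELECT IIF(1500 > 1000, 'High', 'Low');                   -- High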

The next example demonstrates the string functions.

SELECT C.CustomerID, P.FirstName, P.MiddleName, P.LastName,
CONCAT(P.FirstName, ' ' + P.MiddleName, ' ', P.LastName) AS FullName,
FORMAT(SUM(TotalDue),'C') AS TotalSales,
FORMAT(SUM(TotalDue)/SUM(SUM(TotalDue)) OVER(),'P') AS PercentOfSales,
FORMAT(MIN(OrderDate),'yyyy.MM.dd') AS FirstOrderDate
FROM Sales.SalesOrderHeader AS SOH
JOIN Sales.Customer AS C ON SOH.CustomerID = C.CustomerID
JOIN Person.Person AS P ON P.BusinessEntityID = C.PersonID
WHERE C.CustomerID IN (11060, 11062, 11065, 11068, 11070, 12165)
GROUP BY C.CustomerID, P.FirstName, P.MiddleName, P.LastName;

The query uses the CONCAT function to build a FullName column. The CONCAT function ignores NULL values. Notice that rows 3 and 4 returned the FullName even though the MiddleName values are NULL. To make sure an extra space doesn’t show up when the MiddleName is missing, the space is combined with MiddleName as one of the arguments. NULL added to a space is NULL, which will be ignored.
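To see the NULL-handling trick in isolation:

SELECT CONCAT('Kathi', ' ' + NULL, ' ', 'Kellenberger');
-- Returns 'Kathi Kellenberger': the ' ' + NULL argument evaluates to NULL,
-- and CONCAT treats NULL arguments as empty strings.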

There are three examples of FORMAT. The first example formats TotalSales as currency, in this case USD. The next example, PercentOfSales, formats a complex expression comparing each customer’s sales to the overall total. It uses a window aggregate function to calculate the total sales for the query. (Window aggregates is a great topic for another post!) The final example formats the FirstOrderDate in a custom format.
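FORMAT also accepts an optional third argument naming a culture, which controls the currency symbol and separators. A quick sketch (the exact output depends on the .NET culture data on your server; when the culture is omitted, the session language determines the result):

SELECT FORMAT(1234.5, 'C', 'en-US'); -- $1,234.50
SELECT FORMAT(1234.5, 'C', 'de-DE'); -- 1.234,50 €
SELECT FORMAT(0.755, 'P', 'en-US');  -- 75.50 % (spacing varies by .NET version)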

Here is one last easy function: EOMONTH.

SELECT CustomerID, MIN(OrderDate) AS FirstOrderDate,
EOMONTH(MIN(OrderDate)) AS LastDayOfMonth,
EOMONTH(MIN(OrderDate),1) AS LastDayOfNextMonth
FROM Sales.SalesOrderHeader
WHERE CustomerID IN (11060, 11062, 11065, 11068, 11070, 12165)
GROUP BY CustomerID;

The EOMONTH function returns the last day of the month for the supplied date argument. The second example, LastDayOfNextMonth, uses the optional parameter that adds another month.
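A couple of standalone calls show both forms of EOMONTH, including a negative offset for a prior month:

SELECT EOMONTH('2014-02-10');     -- 2014-02-28
SELECT EOMONTH('2014-02-10', 1);  -- 2014-03-31
SELECT EOMONTH('2014-02-10', -1); -- 2014-01-31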

I hope that using The Easy Functions will make your life easier!

If you want to get started with SQL Server with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Iterating over Metadata with BimlScript – Notes from the Field #061

[Note from Pinal]: This is the 61st episode of the Notes from the Field series. One of the common questions I receive every other day is – I have learned BI, but I feel that there is no automation in the BI field; what should I do? How do I go beyond learning BI? How can I fill the gap between BIDS and SSDT? If you have been reading this blog, you know that when it is about BI, I always take help from LinchPin People, who are BI experts. I requested Reeves from LinchPin People to help me answer this unique question.

In this episode of the Notes from the Field series database expert Reeves Smith explains what Biml is and how it can help you with SQL Server BI. Read the experience of Reeves in his own words.


In previous articles (What is Biml and The Why of Biml), I’ve discussed Business Intelligence Markup Language (Biml) and given some examples of projects that would benefit from its use. I also demonstrated a quick way to get introduced to Biml without learning a lot of C#. As you grow in your understanding of Biml and BimlScript you will want to make your Biml files more dynamic with a programmatic metadata-driven approach. This article will build on the previous articles and add looping with metadata to build more dynamic Biml projects.

Overview

Starting where the previous article left off, I’ll demonstrate how to add a little C# to make a more automated method of metadata retrieval. If you did not walk through the other articles, I would suggest that you review them before you read this one. This article builds on the Biml in Listing 1 and lets you create multiple integration services packages from one Biml file.

Listing 1 – Semi-Automated Common Source Package Biml

Example – Automated Source Packages

This example shows you how to connect to a database and select the metadata to iterate over. Then you can create multiple SQL Server Integration Services (SSIS) packages from one Biml file. The example in the previous article manually updated the metadata and reran each Biml file. This is an effective approach, but you can build on it to create a more robust solution. Take a look at the C# code in Listing 2.

Listing 2 – C# Replacement Code

Replace the first line of code from Listing 1 with the code in Listing 2, which is an excerpt from Listing 3.

  • The first line will add the namespace to the Biml file and enable the calling of functions without a fully qualified name.
  • The second line creates a string variable that contains a connection string of the database with the metadata.
  • The third line creates a connection object (actually an AstOleDbConnection node) that will be used by the ImportDB function.
  • The fourth line creates a result set with the metadata from the connection object. This object contains all of the metadata from the AdventureWorks2012 database. The ImportDB function has three parameters. The first parameter is a schema filter; in the example, the filter is on the Person schema. The second parameter is a table filter and is blank in the example, so it will be ignored. The last parameter is a list of ImportOptions, and the views have been excluded. To add more import options, separate them with a “|”. Example of multiple options:

ImportOptions.ExcludeForeignKey | ImportOptions.ExcludeViews

The example includes an object that contains all of the metadata that you will loop over. You have to place the loop within the <Packages> node because a Biml file can only have one Packages collection. The foreach loop will create multiple <Package> nodes.
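Before wiring up the loop, it can help to preview the metadata the ImportDB call will return. A query along these lines (run against the AdventureWorks database that holds the metadata) lists the tables the foreach loop would generate packages for:

SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = 'Person'
  AND TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_NAME;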

A tableName string was built below the foreach loop to reduce the code that you have to alter from the example in the previous article.

Listing 3 – Complete Code

Note: The code in listing 3 has the ConnectionString truncated to save space.

If you receive the error shown in Figure 1, it means that the tables do not exist in the stage database. To check this, add the Person filter to the table parameter in the ImportDB function and return metadata for only one table. The prior articles used this table, so it should exist.

Figure 1 – Error Message

Summary

This article added some more C# to help you refine your Biml files. I hope you can start to see the power of Biml and how it can help remove the repetitive work from your next Integration Services project.

If you want to get started with BIML with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Year 2014 Reflections by Tim Radney – Notes from the Field #060

[Note from Pinal]: New Year’s resolutions are a funny thing. Everybody makes them on January 1st but breaks them within a few days. I used to make resolutions too, but I stopped the practice a while ago. Tim Radney is a great friend, and I asked him what could replace resolutions for the new year. He immediately said – reflections on previous years. I really liked how promptly he answered me. If you think about it, it does make sense to look back and see how much we have accomplished in the past year. We can build the next year’s plan based on what we have done and what we should have done. I think it is a great way to start a new year. I requested him to give an example of reflections on previous years.

On this new year, I also would like to congratulate Tim for joining SQLskills and beginning a new chapter in his career. I will repeat one of the statements which I told my friend – Tim is not like my brother, he is my brother. 


With the beginning of every year, many of us reflect back on the previous year and, while doing so, set goals or resolutions for the upcoming new year. I am hopeful that all of you reading this get to reflect back on a prosperous year full of good health and knowledge gained.

With me being fairly active on social media such as Facebook and Twitter, each year I see resolutions people make for themselves, ranging from getting in better health, writing a book, reading more books, blogging more, learning a new skill, or finding a better job. Whatever goals or resolutions you may have for yourself, I recommend establishing smaller milestones to help you measure your progress toward reaching your goal. I have found success by treating these goals as a bunch of smaller tasks. That makes them much more manageable.

The year 2014 has been very good for me. I have made many new friends in the SQL community, I was awarded SQL Server MVP by Microsoft, released my third book, spoke at numerous SQL conferences including the PASS Summit, attended the Microsoft MVP Summit, was awarded D&B MVP, ran a half marathon, and decided to change employers for the first time in 17 years.

None of those things would have been possible without the SQL community and without setting goals for myself. I am one LUCKY guy, but at the same time I work hard and seize the opportunities that are given to me. I have been lucky and blessed that others have invested their time in me. Others have seen things in me that they could train, encourage and mentor, and they have helped push me to become stronger in my skills. I encourage you all to do the same for others.

Let 2015 be the year that you find a mentor of your own if you don’t have one, and start mentoring others if you aren’t already. We have a saying among SQL speakers: if you want to learn something really well, get up and speak about it. The same can be said about helping others grow. If you want to really grow in your career, help others grow in theirs as well. The rewards are endless.

I don’t know what all 2015 will hold for me, but I will continue to study, read and learn as much as possible. As I learn new tips and tricks I will be blogging and tweeting to share my finds. I encourage you to do the same and help continue to make our SQL community great.

A special thank you to my very good friend Pinal Dave for helping to make 2014 a stellar year for me. I value our friendship a great deal and it was great to finally meet the rest of your family in Seattle.

If you want me to take a look at your server and its settings, or if your server is facing any issue we can Fix Your SQL Server.

Note: Tim has also written an excellent book on SQL Backup and Recovery, a must have for everyone.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Performance Tuning – Is It Really A Top Skills for a SQL Server Consultant? – Notes from the Field #059

[Note from Pinal]: This is the 59th episode of the Notes from the Field series. I get over 100 emails every day, and nearly 90 of them are related to performance tuning. The emails I get are from various IT professionals. There are quite a few application developers (working with .NET, Java, PHP, etc.) who do application development every day but are not always experts in SQL Server. When I read those emails I feel that the world indeed needs plenty of SQL Server performance tuning consultants. I asked these questions to SQL Server guru Brian Moran. He is a unique blend of a person who knows the consultant’s business from both a technology and a business perspective.

Today is Christmas Day and we have received a wonderful gift from Brian!

In this episode of the Notes from the Field series database expert Brian Moran explains in very simple words the reality of the SQL Server Consultant world. Read the experience of Brian in his own words.


December and January are times of the year when many people reflect on personal and career goals. It would be silly and naïve for me to pretend that I know what’s best for your career so take this post with a grain of salt. Maybe you want to be a fireman or an astronaut? I don’t have many words of wisdom if that is your hope.

But, I do have a lot of knowledge for data professionals who aspire to be independent consultants.

I focus more on the business side of running my consulting business these days so I’m not as technical as I used to be. But I was a SQL MVP for many years and was a pretty good tuning architect. I’ve been on the PASS Board twice and have been involved in the SQL Server space for almost 25 years. Over the years I’ve been part of the leadership team that built two successful consulting companies and I’ve coached and mentored dozens of technologists over the past two decades as they got their start in consulting.

This post is for folks who love the engine side of being a database pro and want to be an independent consultant. I’m not talking to the BI folks or the other myriad of specialties that exist in #SQLFamily. But do you love the engine? I mean really love working with the core engine? Do you want to be a consultant one day?

World-class, amazing performance tuning skills are consistently on the short list of skills that I’ve seen successful independent SQL Server consultants have over the past 20 years. Are there other skills out there that allow a consultant to have a great career? Of course. You can do well with any specialty that is hard to master and that companies absolutely must have access to in order to be successful. HA/DR, consolidation, virtualization, security, and a wide variety of other topics come to mind. But here is one element of performance tuning that tends to create opportunities for consultants to maximize their rates and have a successful career.

Top performance tuning consultants can have a pretty quick impact on a problem and companies often need access to the skill on short notice for critical needs. Sometimes tuners can solve a problem in minutes or hours. And long term performance tuning work is often measured in days or weeks rather than weeks or months that some other types of projects require.

Here is some basic math to think about. Let’s say you want to bill $250/hr. Is that expensive? Well, like any SQL question the answer is it depends. Let’s look at it this way.

  • Assume you can solve a critical problem for a customer in 20 hours. $250/hr for 20 hours is $5,000.
  • Imagine you are competing against someone who says they can solve the problem for $125/hr.
  • 50% less sounds like a big savings for the customer, right? But, it’s really only a net savings of $2,500.

Do you think many companies care about a difference of $2,500 when they desperately need to solve an urgent problem impacting critical business systems? Probably not. They probably care more about picking the person they believe is the best fit and most likely to solve the problem.

But now let’s imagine that you are talking to a potential customer about a project that might take six months to solve working full time. That’s about 1,000 billable hours. Let’s make the same assumptions: you are charging $250/hr and someone else is going to offer $125/hr. Now the 50% rate differential yields a net savings of $125,000.

Will many companies care about saving $125,000 if they can? Yep. They sure do.

I find that many folks never fully think through this side of the consulting equation early in their career. That is, within reason, companies mostly don’t care about the rate for performance tuners as long as you are awesome and can get the job done. Don’t get me wrong. It’s quite possible to charge premium rates for long-term projects, and you can certainly be a successful consultant while focusing on some other aspect of SQL Server. But almost 25 years in the business have taught me that loving performance tuning and wanting to be an independent consultant can be a match made in heaven.

If you want to get started with performance tuning and database security with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Creating the SSIS Catalog – Notes from the Field #058

[Note from Pinal]: SSIS is a very well-explored subject; however, there are so many interesting elements that as we read, we learn something new. One such concept is creating the SSIS Catalog.

Linchpin People are database coaches and wellness experts for a data-driven world. In this 58th episode of the Notes from the Field series, database expert Tim Mitchell (partner at Linchpin People) explains how to create the SSIS Catalog.


With the release of SQL Server 2012, Integration Services (SSIS) received a significant overhaul.  The most notable change was the addition of the SSIS catalog, a new deployment and storage mechanism that helps to centralize and standardize the storage, execution, and logging of SSIS packages.  In this post, I will briefly illustrate how to set up the SSIS catalog on a new installation of SQL Server 2012 or 2014.

The SSIS Catalog

When connecting to an instance of SQL Server using SSMS 2012 or 2014, you’ll notice that there is a new node in the tree, entitled Integration Services Catalogs.  By default, this node will be empty as shown below.

Before we can begin deploying SSIS packages to the catalog on this instance, we need to create the catalog.  It is worth noting here that, although the plural node name of Integration Services Catalogs implies that we may have more than one catalog, we can only create one SSIS catalog per SQL Server instance.

Creating the SSIS Catalog

To create the SSIS catalog, simply right click the Integration Services Catalogs node and choose Create Catalog.

Next, you’ll be presented with the Create Catalog dialog, which allows you to set the options for creating the catalog.  There are really only two things that you need to configure here:

  • Enable automatic execution of Integration Services stored procedure at SQL Server startup. This option will enable the automatic execution of a cleanup stored procedure when SQL Server starts.  This stored procedure will check for and update any packages whose status in the catalog was left in limbo due to an unexpected shutdown of SQL Server.  I don’t know of any downsides to enabling this, so I recommend that you leave this box checked.
  • Encryption password. Because you can store sensitive information (such as database or FTP passwords) in the SSIS catalog, this database requires an encryption password to properly secure those assets.

You’ll also notice that the name of the catalog database (SSISDB) is presented in this window, but it is not editable.  In the current versions (2012/2014) of SSIS, the name of the catalog database must be set to SSISDB.  Also, the SSIS catalog requires enabling CLR integration, so this option is shown but is also not editable.

Once you’ve configured the encryption password and selected whether to allow the cleanup stored procedure to run, you’re ready to create the catalog.  Simply click the OK button in this window to create the catalog.  After doing so, you can browse to the Object Explorer in SSMS and see that we now have the SSISDB catalog listed under the Integration Services Catalogs node.  The catalog is now ready for use!

One final note on the SSIS catalog: If you browse the list of databases after creating the catalog, you’ll notice that there is a new user database named SSISDB in the list of databases.  The SSIS catalog is contained in its own database on the SQL Server instance on which it is installed.  Unlike prior versions of SSIS which stored the SSIS assets in MSDB, the SSIS catalog uses a dedicated user database to better isolate the SSIS storage.
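Once the catalog exists, its configuration can be inspected from T-SQL. The catalog.catalog_properties view in SSISDB exposes settings such as the encryption algorithm and the operation-log retention window:

SELECT property_name, property_value
FROM SSISDB.catalog.catalog_properties;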

Conclusion

With a new installation of SQL Server 2012 or 2014, the SSIS catalog is not installed by default.  This brief post demonstrated how to easily and quickly set up the SSIS catalog on a new installation of SQL Server.

If you want me to take a look at your server and its settings, or if your server is facing any issue we can Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)