SQL SERVER – The Basics of the Execute Package Task – Notes from the Field #067

[Note from Pinal]: This is a new episode of the Notes from the Field series. SQL Server Integration Services (SSIS) is one of the most essential parts of the entire Business Intelligence (BI) story. It is a platform for data integration and workflow applications.

In this episode of the Notes from the Field series I asked SSIS Expert Andy Leonard a very crucial question – what are the basics of the Execute Package Task and where do we start with it? Andy was very kind to answer the question and provide plenty of information about how a novice developer can learn SSIS from the beginning and become an expert in the technology.


Learning how to configure and use the SQL Server Integration Services (SSIS) Execute Package Task gives you a great starting point for understanding SSIS package orchestration. I advocate writing small, functional SSIS packages that perform a unit of work. That’s a great idea for many reasons, but it raises the question: “How do I string together the execution of these packages?” Remember: SSIS is a software development platform. With “SQL Server” included in the name, it is easy for people to mistake SSIS for a database tool or accessory, but Control Flow Tasks put that confusion to rest.

SSIS provides several Control Flow tasks. Here is a list that provides a good approximation of which tasks I use most, from most-used to least-used:

  • Execute SQL Task
  • Data Flow Task
  • Script Task
  • Execute Package Task
  • File System Task
  • Execute Process Task

In this article I provide a basic example of configuring the SSIS Execute Package Task, shown in Figure 1:


Figure 1: SSIS Execute Package Task

The Execute Package Task provides one way to implement an SSIS Design Pattern for SSIS package execution known as the Parent-Child pattern. When an SSIS package uses the Execute Package Task to start another SSIS package, the package with the Execute Package Task is called the Parent and the package started by the Execute Package Task is called the Child.

The Execute Package Task changed between SQL Server 2008 R2 Integration Services and SQL Server 2012 Integration Services. The changes support the SSIS 2012 (and 2014) Catalog. There is a new property called ReferenceType which defaults to “Project Reference” in packages executing in Project Deployment Mode (the default mode for building SSIS 2012 and SSIS 2014 SSIS packages), as shown in Figure 2:


Figure 2: Options for the Execute Package Task ReferenceType Property

Project Reference is used to execute an SSIS package – a Child package – in the same SSIS project as the package that contains the Execute Package Task – the Parent package. When Project Reference is selected, the next property in the property grid is PackageNameFromProjectReference, a dropdown containing a list of all the SSIS packages in the SSIS project.

Setting the ReferenceType property to External Reference is a way to execute SSIS packages that are stored in the file system or the msdb database. When External Reference is selected the next properties in the property grid change to reflect this backwards-compatible functionality, as shown in Figure 3:


Figure 3: Setting the ReferenceType Property to External Reference

The Location and Connection properties are used to specify an OLE DB Connection (to the msdb database) or a File Connection (to the location of the dtsx file). SQL Server locations also require the name of the SSIS package; the File System option does not because the SSIS package is the only thing in the file. The External Reference ReferenceType setting is useful for importing SSIS solutions that contain earlier versions of the Execute Package Task because External Reference behaves exactly like previous versions of the Execute Package Task.
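If you are not sure which legacy packages are stored in msdb, a quick query can list them. The following is a hedged sketch that is not part of the original walk-through; it reads the msdb.dbo.sysssispackages table, which holds packages deployed to msdb in SSIS 2008 and later:

-- List SSIS packages stored in the msdb database (legacy package deployment)
SELECT p.[name],
       p.[description],
       p.createdate
FROM msdb.dbo.sysssispackages AS p
ORDER BY p.[name];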

The remaining properties in the property grid – Password and ExecuteOutOfProcess – are shared between ReferenceType options. Password is used if the SSIS package is password-protected (if the ProtectionLevel property of the SSIS package is set to either EncryptSensitiveWithPassword or EncryptAllWithPassword). ExecuteOutOfProcess is a setting that determines whether the package will be executed as part of the current process or a new process.

When executing SSIS packages in Project Deployment Mode (using the Project Reference ReferenceType), Parameter Bindings are enabled and allow values to be passed from the Parent package to the Child package, as shown in Figure 4:


Figure 4: Binding a Parent Package Parameter to a Child Package Parameter

Parameter Bindings are disabled when the ReferenceType property is set to External Reference.
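A related mechanism (shown here only as a hedged aside, not as part of the Execute Package Task itself) is passing parameter values when starting a Catalog execution from T-SQL. The folder, project, package, and parameter names below are hypothetical:

DECLARE @execution_id BIGINT;

-- Create a Catalog execution for the Parent package (names are hypothetical)
EXEC SSISDB.catalog.create_execution
     @folder_name = N'MyFolder',
     @project_name = N'MyProject',
     @package_name = N'Parent.dtsx',
     @use32bitruntime = 0,
     @execution_id = @execution_id OUTPUT;

-- Supply a package parameter value (@object_type 30 = package parameter)
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id,
     @object_type = 30,
     @parameter_name = N'ChildLoadDate',
     @parameter_value = N'2015-01-01';

-- Start the execution
EXEC SSISDB.catalog.start_execution @execution_id;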

Once configuration is complete, click the OK button to close the Execute Package Task Editor. You can test execution by pressing the F5 key or selecting “Start Debugging” from the SSIS dropdown menu. A successfully-executed Execute Package Task will appear as shown in Figure 5:


Figure 5: A Successful Execution!

The Execute Package Task drives a powerful data integration architecture pattern: Parent-Child execution. Using SSIS Precedence Constraints and Sequence Containers with the Execute Package Task, a data integration developer can develop SSIS “driver” packages that call Child SSIS packages in any combination of parallel and serial orders of execution.
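Once such a driver solution is deployed to the SSIS Catalog, the Parent executions can also be reviewed from T-SQL. The query below is a hedged sketch, assuming the project has been deployed to an SSIS 2012 or 2014 Catalog:

-- Recent Catalog executions, newest first
SELECT TOP (20)
       e.execution_id,
       e.folder_name,
       e.project_name,
       e.package_name,  -- the entry-point (Parent) package
       e.status,        -- 1 = created, 2 = running, 4 = failed, 7 = succeeded
       e.start_time,
       e.end_time
FROM SSISDB.catalog.executions AS e
ORDER BY e.execution_id DESC;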

If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – The Basics of the SSIS Script Task – Notes from the Field #065

[Note from Pinal]: This is a new episode of the Notes from the Field series. SQL Server Integration Services (SSIS) is one of the most essential parts of the entire Business Intelligence (BI) story. It is a platform for data integration and workflow applications.

In this episode of the Notes from the Field series I asked SSIS Expert Andy Leonard a very crucial question – what are the basics of the SSIS Script Task and where do we start with it? Andy was very kind to answer the question and provide plenty of information about how a novice developer can learn SSIS from the beginning and become an expert in the technology.


Learning how to configure and use the SQL Server Integration Services (SSIS) Script Task gives you a great starting point for learning how to use SSIS Control Flow tasks. You’ll find that SSIS Control Flow tasks provide powerful development capabilities that may surprise you. Let there be no mistake: SSIS is a software development platform. With “SQL Server” included in the name, it is easy for people to mistake SSIS for a database tool or accessory, but Control Flow Tasks put that confusion to rest.

SSIS provides several Control Flow tasks. Here is a list that provides a good approximation of which tasks I use most, from most-used to least-used:

  • Execute SQL Task
  • Data Flow Task
  • Script Task
  • Execute Package Task
  • File System Task
  • Execute Process Task

This article will focus on a basic example of configuring the SSIS Script Task, shown in Figure 1. People often think the SSIS Script Task is just for generating log messages, but let’s look at some of its more powerful capabilities.

Figure 1: SSIS Script Task

The Script Task supports Microsoft Visual C# and Microsoft Visual Basic languages. You can choose which language you will use prior to clicking the Edit Script button. Once you’ve clicked the Edit Script button, you cannot change the Script Language property.

At the heart of the Script Task is the ability to read a variable value from the SSIS package into the script and then write a message out of the Script Task. Because the Script Task can interact with SSIS variables, you can use .Net code to manipulate and respond to variable values.

For the purpose of our demonstration, we will add code to the Script Task to create a message for SSIS logging.

Messaging is a key component of enterprise architecture. Messages are generated (published, raised, produced, etc.) by Senders and received (subscribed, consumed, requested, etc.) by Listeners. Messaging is a complex topic. Entire books are dedicated to the topic.

Add variables to the ReadOnlyVariables and ReadWriteVariables properties by clicking the ellipsis inside the value textbox for each property. When you click the ellipsis, a Select Variables dialog appears, as you see in Figure 2.

Figure 2: Adding Read Only Variables to the Script Task

SSIS variables use a two-part naming convention: <Namespace>::<VariableName>. I added the SSIS variables System::TaskName and System::PackageName. The selected variables are in the System namespace and contain the name of the task and package, respectively.

Click the Edit Script button to open the script editing window as shown in Figure 3:

Figure 3: The Script Task Script Editor

Add the following code beneath the “Add your code here” instruction:

Dim sPackageName As String = Dts.Variables("System::PackageName").Value.ToString()
Dim sMsg As String = "I am " & sPackageName
Dts.Events.FireInformation(1001, sPackageName, sMsg, "", 0, True)

Listing 1: VB.Net Code

The first line of the code in Listing 1 creates a VB.Net variable named sPackageName and initializes the value of this variable, setting it to the value of the SSIS Variable named System::PackageName. This is one way to read SSIS Variable values into a Script Task’s script.

The second line of the code in Listing 1 creates a VB.Net variable named sMsg and initializes the value of this variable with the text, “I am” plus the value contained in the sPackageName VB.Net variable.

The third line of the code shown in Listing 1 calls the FireInformation method of the Dts.Events object. FireInformation accepts six arguments:

  1. InformationCode – an integer you can use to identify custom messages sent from the SSIS package.
  2. SubComponent – a string you can use to populate the source of the message.
  3. Description – a string containing the message you wish to send.
  4. HelpFile – a string containing the location of a Help file (should a help file exist).
  5. HelpContext – an integer pointing to the help context (should a help context exist).
  6. FireAgain – a Boolean value indicating whether the event should subsequently fire.

My VB code includes a reference to the FireAgain argument. Note: This MSDN post states: “Because firing of an event may be expensive, the run-time engine provides a mechanism for suppressing events that you are not interested in. Every event firing method has a FireAgain parameter. If the value of this variable is false, after the method returns, the caller will not fire this event again for the duration of the current execution.” I understand this to mean setting FireAgain to False would suppress future FireInformation events in my Script Task, but testing indicates this is not correct. This is important for you to know because you may opt to use the FireAgain argument to manage when you raise events. I do not use the FireAgain argument for this purpose. I typically manage the decision to raise events in conditional logic surrounding the Event-raising code.

A test execution of the package reveals an Information message recorded on the Progress / Execution Results tab as shown in Figure 4.

Figure 4: the Information Message on the Progress / Execution Results Tab

Information messages generated from Script Tasks will appear in SSIS logs and the SSIS Catalog. With this information, you can monitor the progress of SSIS execution while the package is running, or view the historical status of variable values or package messages after the package execution has completed.
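For packages deployed to the SSIS Catalog, one place to find these messages is the catalog.event_messages view. The query below is a hedged sketch, not taken from the original post:

-- Information events raised by packages, newest first
SELECT TOP (50)
       em.message_time,
       em.package_name,
       em.message_source_name,  -- the source of the message (task or component)
       em.message
FROM SSISDB.catalog.event_messages AS em
WHERE em.event_name = N'OnInformation'
ORDER BY em.message_time DESC;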

The SSIS Script Task can accomplish much more than generating log messages. This brief introduction and basic example have demonstrated how to get started configuring and using the SSIS Script Task. As you can see, SSIS Script Tasks give you development capabilities you may not have been aware of.

If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – How to Know Backup History of Current Database?

Some blog posts and scripts are evergreen references. One such blog post has been Get Database Backup History for a Single Database. A number of you have pinged me with various ideas for amendments to that script. Since that post was written more than four years ago, I thought it would be worthwhile to bring it in line with the additions in SQL Server 2014. What are the changes made to the previous script? The script will answer a few questions like:

  • What were the different backups taken on the current DB? Remove the condition mentioned below and it can be generic enough for all DBs on your server.
  • Which user was involved in taking the backup?
  • What is the type of backup we are talking about?
  • Recovery Model and Database compatibility level of the DB at the time the backup was taken. I remember using this for a user who used to change the DB compatibility level in a script before a backup; it was basically a copy-paste problem from an internet script, and this data helped there.
  • Size of backup – Both compressed and uncompressed.
  • Whether the backup was password protected.
  • Finally, when the backups were taken.

So let us look at the script next. Feel free to modify it as you wish.

-- Recent backup history for the current DB
SELECT s.database_name 'Database',
       s.recovery_model 'Recovery Model',
       s.compatibility_level,
       s.user_name 'Backup by Username',
       CASE s.type
            WHEN 'D' THEN 'Full'
            WHEN 'I' THEN 'Diff'
            WHEN 'L' THEN 'Log'
       END 'Backup Type',
       CONVERT(VARCHAR(20), s.backup_finish_date, 13) 'Backup Completed',
       CAST(mf.physical_device_name AS VARCHAR(100)) 'Physical device name',
       DATEDIFF(MINUTE, s.backup_start_date, s.backup_finish_date) 'Duration Min',
       CAST(ROUND(s.backup_size * 1.0 / ( 1024 * 1024 ), 2) AS NUMERIC(10, 2)) 'Size in MB',
       CAST(ROUND(s.compressed_backup_size * 1.0 / ( 1024 * 1024 ), 2) AS NUMERIC(10, 2)) 'Compressed Size in MB',
       CASE WHEN LEFT(mf.physical_device_name, 1) = '{' THEN 'SQL VSS Writer'
            WHEN LEFT(mf.physical_device_name, 3) LIKE '[A-Za-z]:\%' THEN 'SQL Backup'
            WHEN LEFT(mf.physical_device_name, 2) LIKE '\\' THEN 'SQL Backup'
            ELSE mf.physical_device_name
       END 'Backup tool',
       s.is_copy_only,
       s.is_password_protected,
       s.is_force_offline /* for WITH NORECOVERY option */
FROM   msdb.dbo.backupset s
       INNER JOIN msdb.dbo.backupmediafamily mf ON s.media_set_id = mf.media_set_id
WHERE  s.database_name = DB_NAME() -- remove this condition if you want all DBs
       AND s.backup_finish_date > DATEADD(MONTH, -3, GETDATE()) -- Get data for past 3 months
ORDER BY s.backup_finish_date DESC;

A sample output would look like:

More fields can be added to this script, such as whether encryption is enabled, collation information, LSN information, whether the backup has a checksum, etc. Do let me know which additional information you use in your environment to know your backups better.
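As a hedged illustration (not part of the original script), the msdb.dbo.backupset columns below cover most of those additions; verify them against your version, since some (such as key_algorithm and encryptor_type) exist only in SQL Server 2014 and later:

-- Additional backupset columns that can be appended to the SELECT list above
SELECT s.database_name,
       s.collation_name,        -- collation of the database at backup time
       s.first_lsn,
       s.last_lsn,
       s.checkpoint_lsn,
       s.database_backup_lsn,
       s.has_backup_checksums,  -- 1 if the backup was taken WITH CHECKSUM
       s.key_algorithm,         -- SQL Server 2014+: backup encryption algorithm
       s.encryptor_type         -- SQL Server 2014+: certificate or asymmetric key
FROM msdb.dbo.backupset s
WHERE s.database_name = DB_NAME()
ORDER BY s.backup_finish_date DESC;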

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Can Database Primary File Have Any Other Extension Than MDF

Here is an interesting question I received yesterday. I personally think it is one of the most interesting questions I have received this month.

“Can Database Primary File Have Any Other Extention Than MDF?”

Before you read the answer – think for a moment.

Now read the answer in the next line.

Yes, of course. The MDF extension for the primary data file of a database is just a common convention; essentially, you can use any other file extension as well.

Here is an example where I am able to attach a file with a PDF extension as the database's primary data file. Please note that this file was created by SQL Server earlier and was always used as a primary data file.

USE [master]
GO
CREATE DATABASE [tests] ON
( FILENAME = N'D:\data\tests.pdf' ),
( FILENAME = N'D:\data\tests_log.ldf' )
FOR ATTACH
GO

As you can see, files with other extensions are successfully attached to the database. If you want to practice this example, you can download it by clicking over here.
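As a further hedged illustration that the extension is only a convention, a brand-new database can also be created with non-standard extensions; the database name and file paths below are hypothetical:

USE [master]
GO
-- Hypothetical example: the primary data file uses a .pdf extension from the start
CREATE DATABASE [tests2] ON PRIMARY
( NAME = N'tests2_data', FILENAME = N'D:\data\tests2.pdf' )
LOG ON
( NAME = N'tests2_log', FILENAME = N'D:\data\tests2_log.ldf' )
GO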

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – The Basics of the SSIS Data Flow Task – Notes from the Field #057

[Note from Pinal]: This is a new episode of the Notes from the Field series. SQL Server Integration Services (SSIS) is one of the most essential parts of the entire Business Intelligence (BI) story. It is a platform for data integration and workflow applications. As Wikipedia says, it features a fast and flexible data warehousing tool used for data extraction, transformation, and loading (ETL). The tool may also be used to automate maintenance of SQL Server databases and updates to multidimensional cube data.

In this episode of the Notes from the Field series I asked SSIS Expert Andy Leonard a very crucial question – how do you learn the SSIS Data Flow Task? Andy was very kind to answer the question and provide plenty of information about how a novice developer can learn SSIS from the beginning and become an expert in the technology.


If you know SQL Server, you’re likely aware of SQL Server Integration Services (SSIS). What you might not realize is that SSIS is a development platform that allows you to create and perform some interesting Control Flow Tasks. In the first blog post in this series, I showed how to use the Execute SQL Task. Now, let’s look at the Data Flow Task. When developing solutions with SSIS, I use a handful of Control Flow tasks:

  • Execute SQL Task
  • Data Flow Task
  • Script Task
  • Execute Package Task
  • File System Task
  • Execute Process Task

This list is a good approximation of which tasks I use most, too – from most-used to least-used. In this article I provide a basic example of configuring the SSIS Data Flow Task, shown in Figure 1:


Figure 1: SSIS Data Flow Task

The SSIS Data Flow Task is a very special task. It is the only task to have its own tab in the Integrated Development Environment (IDE) as shown in Figure 2:


Figure 2: The Data Flow Tab

If you click on the tab, you will note a new SSIS Toolbox containing Data Flow-specific components, as shown in Figure 3:


Figure 3: Data Flow SSIS Toolbox

SSIS Data Flows are typically used to move data from one location to another. The data flow accomplishes data movement by first reading data into Data Flow Buffers. Think of a buffer as a region of memory SSIS uses to hold data rows as the rows are processed by the data flow. In Figure 4, I have configured an OLE DB Source Adapter to pump data rows into the data flow:


Figure 4: Configuring an OLE DB Source Adapter

The data is often transformed while being moved from one location to another. The SSIS data flow components that perform transforming operations are called Transformations, and they are joined to other data flow components by Data Flow Paths. An example of a transformation is the Derived Column Transformation, as shown in Figure 5:


Figure 5: Adding a Derived Column Transformation and a Data Flow Path

You can use transformations to perform many operations (e.g., you can manipulate values of columns in rows, you can remove or redirect rows based on column values, etc.) on the data as it flows through the data flow task. For example, the Derived Column Transformation permits you to manipulate (transform) existing data or to combine existing data to create new columns, as shown in Figure 6:


Figure 6: Creating a New Column with the Derived Column Transformation

I created a new column named “UpperCaseName” in the Derived Column Transformation. I used SSIS Expression Language to define the transform – “UPPER([Name])” in this case.

Note: SSIS Expression Language is very powerful and very difficult to learn. For more information, please see Linchpin People’s SSIS Expression Language series.

Now I need to land these rows into a table. I’ll use an OLE DB Destination Adapter – connected from the Derived Column Transformation via data flow path – to accomplish loading our transformed rows into a table, as shown in Figure 7:


Figure 7: Configuring an OLE DB Destination Adapter
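To make the data movement concrete, the sketch below is the set-based T-SQL equivalent of this Source, Derived Column, and Destination flow. It is only an illustration with hypothetical table names (dbo.Person and dbo.PersonUpper); it is not what SSIS executes internally:

-- Hypothetical destination table with a column for the derived value
CREATE TABLE dbo.PersonUpper
(
    [Name]        NVARCHAR(50) NOT NULL,
    UpperCaseName NVARCHAR(50) NOT NULL
);

-- Equivalent of OLE DB Source -> Derived Column (UPPER([Name])) -> OLE DB Destination
INSERT INTO dbo.PersonUpper ([Name], UpperCaseName)
SELECT p.[Name],
       UPPER(p.[Name]) AS UpperCaseName
FROM dbo.Person AS p;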

Once the OLE DB Destination Adapter is configured, you can execute either the package or the Data Flow Task as shown in Figure 8:


Figure 8: Test Execution Successful!

In this article, I shared an introduction to the SSIS Data Flow Task and some of its functionality. Although I barely scratched the surface of Data Flow capabilities, you should now be able to compose and test your first SSIS Data Flow!

:{>

If you want to get started with SSIS with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Proof of Concepts with Windows Azure – Notes from the Field #056

[Note from Pinal]: This is the 56th episode of the Notes from the Field series. If you are a DBA or Developer, there is always a situation when you want to prove yourself by building a small proof of concept of your idea. However, most of the time it is way more complicated than we think. Building a proof of concept requires many different resources and skills. Above all, there are chances that what we have built is not up to the mark and we have to rebuild the example one more time. Trust me, tasks that look simple at the beginning are rarely that simple.

In this episode of the Notes from the Field series database expert John Sterrett (Group Principal at Linchpin People) explains a very common issue DBAs and Developers face in their careers – how to build proofs of concept and how to maximize the power of Azure. Linchpin People are database coaches and wellness experts for a data driven world. Read the experience of John in his own words.


Whether you know it or not, cloud services are here and they are changing the way we provide information technology services. For example, in many information technology shops it can take weeks if not months to get an instance of SQL Server up and running. Here are some minimal action items that must be completed before DBAs get access to a server to install SQL Server. You have to order a physical server, your procurement team must approve the order, and the server has to be shipped. Once the server is received it must be racked in the data center, cables must be connected, and the data center team needs to document their changes. Then the operations team needs to install and configure Windows. I could keep going, but there are a lot of things that must be done to a server before the DBA team gets its hands on it. What are you going to do if you’re a DBA and you need an instance up in 30 minutes for a proof of concept? It’s becoming more common that the cloud is the answer.

Every time I need a server for a proof of concept I jump to Windows Azure.  I can quickly build a Windows Azure Machine with SQL Server provided within 30 minutes.  In this tip, I am going to walk through the steps to create your first Windows Azure Machine.

1. Get a Windows Azure account. If you don’t have one, you can get a free trial. At the time of writing this tip, the free trial includes $220 of credit.
http://azure.microsoft.com/en-us/pricing/free-trial/

Are you an MSDN Subscriber who isn’t leveraging Microsoft’s cloud service? If so, you can sign up for free monthly credit.

2. Log In to Azure Portal
https://manage.windowsazure.com

3. Create New Virtual Machine

On the left-hand side, click on Virtual Machines and then the Add button at the bottom left of the screen. This will load the wizard for creating our first virtual machine.

Now that the wizard is loaded, as you can see below, we can select Virtual Machine and create it from the gallery. In the gallery we will be able to select one of many images that already include SQL Server baked in.

Looking at the SQL Server images you will see you can access Enterprise Edition, Standard Edition and Web Edition for SQL 2014 down to SQL 2008 R2.

 

Next, you can customize your image by release date. This allows you to pick different service packs or CUs. You can also select between two different tiers and sizes. You will have to create a user name and password, and you will want to keep these credentials as this will be your first account.

 

Next, you will be able to select more machine configuration options, including where the Azure Virtual Machine is located. Below you will see I am using my MSDN subscription.

 

Finally, you can configure additional extensions to help automate or secure your virtual machine.

Then you will see your server being provisioned. What once used to take weeks or months can now be done in the cloud in minutes.

Are your servers running at optimal speed or are you facing any SQL Server Performance Problems? If you want to get started with the help of experts read more over here: Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – FIX – The database ‘model’ is marked RESTORING and is in a state that does not allow recovery to be run

The best way to learn SQL Server is by trying things out on your own, and I am no different. I am constantly trying to explore the various options one can use when working with SQL Server. In that context, while I was playing around with backup and restore commands, I made a mistake and unfortunately restarted SQL Server. After that I was unable to start the SQL Server service. If I started the service, it didn’t give any error but stopped automatically.

Whenever I have any weird problems with SQL Server, I always look at the ERRORLOG files for that instance. If you don’t know the location of the ERRORLOG, you should refer to Balmukund’s blog (Help : Where is SQL Server ErrorLog?)

This is what I found in the ERRORLOG just before the stop.

2014-10-28 00:20:39.02 spid9s      Starting up database 'model'.
2014-10-28 00:20:40.01 spid9s      The database 'model' is marked RESTORING and is in a state that does not allow recovery to be run.
2014-10-28 00:20:40.04 spid9s      Error: 927, Severity: 14, State: 2.
2014-10-28 00:20:40.04 spid9s      Database 'model' cannot be opened. It is in the middle of a restore.

The error and behavior I am seeing make sense because, to start SQL Server, we need the master, model, and tempdb databases. You might think that msdb is also a system database and would be needed by the SQL engine? Well, you might have been tricked. msdb is needed by the SQL Server Agent service, not the SQL Server service. So my master is fine, but model has a problem. Every new database, including tempdb, is created using model, so the SQL Server service refuses to start. Since the model database has not been recovered successfully, SQL Server cannot create the tempdb database, and understandably the instance of SQL Server does not start.
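As a hedged aside that is not part of the original post: once you are able to connect to the instance (for example after starting it with the trace flag described below), the state of the model database can be confirmed with a simple catalog query:

-- Check the current state of the model database
SELECT d.[name],
       d.state_desc  -- expected to show RESTORING in this scenario
FROM sys.databases AS d
WHERE d.[name] = N'model';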

So I called up Balmukund – these are the perks of having a good friend to rely on. He never says “no,” but he also doesn’t give the complete solution to the problem. He gives a hint and asks me to research further. This time, too, the magical words were – “use trace flag 3608 and restore model with recovery”.

I followed his advice and performed the steps below.

1. Start SQL Server with trace flag 3608 using net start command

Net Start MSSQL$SQL2014 /T3608

In my case, SQL2014 is the name of the instance. If you have a default instance, then the service name would be MSSQLServer. For a named instance, it is MSSQL$InstanceNameHere.

2. After starting with trace flag 3608, I verified the same from the ERRORLOG as well.

Further, I also found the below message in the ERRORLOG.

Recovering only master database because traceflag 3608 was specified. This is an informational message only. No user action is required.

3. Connected to the SQL instance using SQLCMD with the below command.

SQLCMD -S .\SQL2014 -E

You can read about the parameters of SQLCMD in Books Online here.

“1>” means we are connected to the SQL instance. I then executed the below command (hit Enter at the end of each line):

RESTORE DATABASE Model WITH RECOVERY
GO

4. Once the command is executed successfully, we come back to the “1>” prompt again. We can type exit to come out of SQLCMD.

5. Now stop SQL Service

Net Stop MSSQL$SQL2014

6. And start again without trace flag.

Net Start MSSQL$SQL2014

Now my SQL instance came up happily and I was unblocked. After some time I got a call from Balmukund asking if SQL Server had started, and I told him that my next blog on the same topic was ready. He finally asked, how did that happen? And my answer was – I ran the wrong command. My command was:

BACKUP DATABASE model TO DISK = 'Full.bak'
GO
BACKUP LOG model TO DISK = 'Log.trn' WITH NORECOVERY
GO

My request to my readers: please do NOT run the above commands on your SQL instance and restart SQL Server, or else you will need to follow these steps on a production server. Learning never stops when working with SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)