SQL SERVER – FIX – Export Error – 'Microsoft.ACE.OLEDB.12.0' provider is not registered on the local machine

Working with Excel files is always of great interest. I have not seen a reporting solution in the world that doesn't give an option to export to Excel. And when you are dealing with two separate solutions between which you want to export and import data, such options can be of great use. In this blog, I was trying to import data from an Excel sheet into a SQL Server instance by using the Import Wizard available in SQL Server Management Studio. I am sure all of us have had this requirement once in a while and have surely tried to get it working.


After selecting the option, I selected the Data Source as "Microsoft Excel" and selected the file I wanted to import. The Excel file was created with the latest version of Excel, so I selected "Microsoft Excel 2007".


As soon as I hit Next, I got the below error:

TITLE: SQL Server Import and Export Wizard
——————————
The operation could not be completed.
——————————
ADDITIONAL INFORMATION:
The 'Microsoft.ACE.OLEDB.12.0' provider is not registered on the local machine. (System.Data)
——————————
BUTTONS:
OK
——————————


I checked Management Studio and, as the message says, the provider was not listed there. We can also use the query below to find the details of the providers installed on the machine:

EXECUTE master.dbo.xp_enum_oledb_providers


A few blogs suggest downloading https://www.microsoft.com/en-us/download/details.aspx?id=23734 (2007 Office System Driver: Data Connectivity Components), but that didn't work. So I tried installing http://www.microsoft.com/en-us/download/details.aspx?id=13255 (Microsoft Access Database Engine 2010 Redistributable), but the 64-bit installer also failed because I am using the 32-bit version of Office on my 64-bit PC.

Note: If you are running the 32-bit version of SQL Server, installing the 32-bit package above should solve the issue (a query to confirm your instance's bitness follows the steps below). On my PC, I had to take the approach below.

  1. Uninstall the 32-bit version of Microsoft Office.
  2. Install the 64-bit version of MS Office (make sure you run setup.exe from the "x64" folder so that the 64-bit version of Office is installed).
  3. Install the 64-bit version of "Microsoft.ACE.OLEDB.12.0" (download AccessDatabaseEngine_x64.exe from the second link provided earlier).
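
Before deciding which package applies, it may help to confirm whether the SQL Server instance itself is 32-bit or 64-bit. Here is a quick check you can run from Management Studio:

SELECT SERVERPROPERTY('Edition') AS Edition,
    @@VERSION AS VersionDetails;
-- A 64-bit instance reports (64-bit) / (X64);
-- a 32-bit instance reports (32-bit) / (X86).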

Here is the location of the DLL "ACEOLEDB.DLL":

  1. On a 32-bit operating system:
     C:\Program Files\Common Files\Microsoft Shared\OFFICE14\ACEOLEDB.DLL
  2. On a 64-bit operating system:
     • If 32-bit Office is installed: C:\Program Files (x86)\Common Files\Microsoft Shared\OFFICE14\ACEOLEDB.DLL
     • If 64-bit Office is installed: C:\Program Files\Common Files\Microsoft Shared\OFFICE14\ACEOLEDB.DLL

A production server would not normally have the complete Office suite installed, so there it is simply a matter of installing the correct version of the driver.
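
Once the provider is registered, you can also verify it end to end by querying the workbook directly with OPENROWSET. Here is a minimal sketch, assuming a hypothetical file C:\Temp\Data.xlsx and that ad hoc distributed queries are enabled on the instance:

-- Enable ad hoc distributed queries (one-time instance configuration)
EXECUTE sp_configure 'show advanced options', 1;
RECONFIGURE;
EXECUTE sp_configure 'ad hoc distributed queries', 1;
RECONFIGURE;
GO
-- Read Sheet1 of the workbook through the ACE provider
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\Temp\Data.xlsx;HDR=YES',
    'SELECT * FROM [Sheet1$]');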

Have you ever encountered this error in your environment? What did you do to solve it? Let me know.

Reference : Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – PowerShell to Count Number of VLFs in SQL Server

If you are a seasoned DBA, then monitoring VLFs is something you are already doing. To learn the basics of VLFs, check my earlier blog: SQL SERVER – Detect Virtual Log Files (VLF) in LDF. Having a large number of VLFs has a performance impact, and a number of blogs discuss this. In this blog I thought of showing a simple script that helps identify the number of VLFs across one or more SQL Server instances.
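
The PowerShell function below is essentially a wrapper around DBCC LOGINFO, which returns one row per VLF. If you only need a one-off count for the current database, a T-SQL sketch like this works too (the column layout assumes SQL Server 2012 or later):

-- Capture DBCC LOGINFO output; each row represents one VLF
DECLARE @VLF TABLE (RecoveryUnitId INT, FileId INT, FileSize BIGINT, StartOffset BIGINT,
    FSeqNo BIGINT, [Status] INT, Parity INT, CreateLSN NUMERIC(38));
INSERT INTO @VLF EXEC ('DBCC LOGINFO');
SELECT COUNT(*) AS VLFCount FROM @VLF;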

As part of my PowerShell learning, here is a simple script that you can use in your environment to take corrective action if you have a large number of VLFs on your servers.

<#
.EXAMPLE
   Get-VLFCount -SQLInstance InstanceName
.EXAMPLE
   Get-Content D:\ServerList.txt | Get-VLFCount

   If you are using Windows Authentication and have a list of instances, you can also pass several names:
   Get-VLFCount -SQLInstance InstanceName1, InstanceName2
#>
function Get-VLFCount
{
    [CmdletBinding()]
    Param
    (
        # SQLInstance is the name(s) of the SQL Instances to scan
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$True,
                   ValueFromPipelineByPropertyName=$true,
                   Position=0)]
        $SQLInstance
    )
    Begin
    {
        #Load the Assembly to connect to the SQL Instance.
        [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
        [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.ConnectionInfo') | out-null

    }
    Process
    {
        ForEach ($Instance in $SQLInstance)
        {
            $SrvConn = new-object Microsoft.SqlServer.Management.Common.ServerConnection
            $SrvConn.ServerInstance=$Instance
            #Use Integrated Authentication
            $SrvConn.LoginSecure = $true
            $SrvConn.ConnectTimeout = 5
            Write-Debug "Attempting to check $Instance for VLF Counts"
            $srv = new-object Microsoft.SqlServer.Management.SMO.Server($SrvConn)
            $dbs = $srv.Databases
            try
            {
                ForEach ($db in $dbs)
                {
                    Write-Debug "Getting VLFInfo for $db"
                    if ($db.IsAccessible)
                    {

                        $VLFs = $db.ExecuteWithResults("DBCC LOGINFO")
                        $NumVLFs = $VLFs.Tables[0].Rows.count

                        $VLFInfo = $db | Select-Object @{Name='Instance Name'; Expression={$Instance}},
                            @{Name='Database Name'; Expression={$_.Name}},
                            @{Name='VLFCount'; Expression={$NumVLFs}}

                    }
                    else
                    {
                        $VLFInfo = New-Object psobject
                        $VLFInfo | Add-Member -type NoteProperty -name InstanceName ($Instance)
                        $VLFInfo | Add-Member -type NoteProperty -name DBName ("$DB is Inaccessible")
                        $VLFInfo | Add-Member -type NoteProperty -name VLFCount 0
                    }
                    Write-Output $VLFinfo
                }

            }
            catch
            {
                $ex = $_.Exception
                Write-Debug "$ex.Message"
                Write-Error "Could not pull SQL DB Info on $Instance"
            }

        }
    }
}

# Check the local instance; if we had our instances in a file, we could instead use:
# Get-Content D:\ServerList.txt | Get-VLFCount
Get-VLFCount 'localhost'

The above script, when run on my server, returns the instance name, database name, and VLF count for each database.


As you can see in our example output, the AdventureWorks database has a large number of VLFs. In case you have hundreds of VLFs in your environment, look at SQL SERVER – Reduce the Virtual Log Files (VLFs) from LDF file to reduce the number of VLFs for your databases.
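
The usual remediation, covered in that blog, is to shrink the log file and grow it back in one large step so that it is rebuilt with fewer, larger VLFs. A minimal sketch, assuming a logical log file name of AdventureWorks_Log and a 4 GB target size:

USE AdventureWorks
GO
-- Shrink the log to its minimum size (log space must be reusable at this point)
DBCC SHRINKFILE (AdventureWorks_Log, 1);
GO
-- Grow it back in a single step so only a handful of new VLFs are created
ALTER DATABASE AdventureWorks
MODIFY FILE (NAME = AdventureWorks_Log, SIZE = 4096MB);
GO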

Just out of curiosity, I would love to know: what is the highest number of VLFs in your databases? How long have those databases been running in your systems? It would be great if you could share this in the comments, so it would be a great learning for me too.

Reference : Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – Evaluation Period Has Expired – How to Activate SQL Server?

Sometimes we are in such a hurry that we forget pending activities. A typical example is walking into a room and then trying to remember why we went there in the first place. Isn't that common? Or maybe age is catching up with me for sure. One of the classic mistakes I have seen people make is to bring up an environment in a hurry and choose the Evaluation Edition because they don't have the license key handy at that point in time. They plan to add it later and then forget. One fine day SQL Server starts to error out, and on inspection you hit the problem and all work comes to a stop.

If you have been running SQL Server Evaluation Edition and it has expired, you need to activate it. The steps are a little different compared to Windows activation, and the edition terminology is also a little different from the Windows world. In SQL Server, the evaluation edition is actually the Enterprise edition with a 180-day time limit – which is why it is called the Enterprise Evaluation Edition.
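
Since the 180-day clock starts at installation, you can estimate when the evaluation will expire. A commonly used trick (an assumption worth validating in your environment) is to read the creation date of the NT AUTHORITY\SYSTEM login, which is created during setup:

-- Estimate evaluation expiry as install date + 180 days
SELECT create_date AS InstallDate,
    DATEADD(DAY, 180, create_date) AS EstimatedExpiryDate
FROM sys.server_principals
WHERE name = N'NT AUTHORITY\SYSTEM';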

If the SQL Server engine has expired, the SQL services will not start, and you will find the below error in the SQL Server ERRORLOG and the event viewer. Here is a blog on finding the ERRORLOG file location:

SQL SERVER – Where is ERRORLOG? Various Ways to Find its Location

Here is the sample ERRORLOG. Note two things below – the current edition near the top and the error message at the end. The same message appears in the event log as well.

2015-03-25 10:34:06.84 Server      Microsoft SQL Server 2012 (SP1) - 11.0.3128.0 (X64)
Dec 28 2012 20:23:12
Copyright (c) Microsoft Corporation
Enterprise Evaluation Edition (64-bit) on Windows NT 6.2 <X64> (Build 9200: ) (Hypervisor)
2015-03-25 10:34:06.84 Server      (c) Microsoft Corporation.
2015-03-25 10:34:06.84 Server      All rights reserved.
2015-03-25 10:34:06.84 Server      Server process ID is 6584.
2015-03-25 10:34:06.84 Server      Authentication mode is MIXED.
2015-03-25 10:34:06.84 Server      Logging SQL Server messages in file 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Log\ERRORLOG'.
2015-03-25 10:34:06.84 Server      The service account is 'NT Service\MSSQL$MSSQLSERVER'. This is an informational message; no user action is required.
2015-03-25 10:34:06.84 Server      Registry startup parameters:
-d C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\master.mdf
-e C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Log\ERRORLOG
-l C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\mastlog.ldf
2015-03-25 10:34:06.84 Server      Command Line Startup Parameters:
-s "MSSQLSERVER"
2015-03-25 10:34:07.15 Server      Error: 17051, Severity: 16, State: 1.
2015-03-25 10:34:07.15 Server      SQL Server evaluation period has expired.

On the other hand, if the client tools have expired, we would have trouble opening SQL Server Management Studio and the other client tools. Here is the pop-up that would be received:

Evaluation period has expired. For information on how to upgrade your evaluation software, please go to http://www.microsoft.com/sql/howtobuy


In such situations, we need to perform an activation of SQL Server. This is called an edition upgrade in SQL Server installation terminology. To upgrade, you need the installation media of the destination edition. The SQL Server Evaluation edition can be upgraded to the Standard, Developer, or Enterprise editions. The complete upgrade matrix can be found in SQL Server Books Online: https://msdn.microsoft.com/en-us/library/ms143393(v=sql.120).aspx

Here are the steps:

  1. From the SQL Server installation media, launch setup.exe.
  2. From the SQL Server Installation Center, click Maintenance, and then select Edition Upgrade.
  3. The wizard launches; keep pressing Next and finally click "Upgrade". Make sure the edition shown on the license terms screen is the one you purchased.
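
If you prefer the command line, the same edition upgrade can be scripted with the documented setup switches. A sketch (verify the switches against Books Online for your version; <your product key> is the key of the edition you purchased):

setup.exe /q /ACTION=EditionUpgrade /INSTANCENAME=MSSQLSERVER /PID=<your product key> /IACCEPTSQLSERVERLICENSETERMS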

Hope this would help.

Reference : Pinal Dave (http://blog.SQLAuthority.com)

Interview Question of the Week #025 – How to Implement Paging in SQL Server?

Here is a very popular question when a SQL Server interview includes live coding examples.

Question: How to implement paging in SQL Server? Explain with the help of script.

Answer: Here is a script to implement paging in SQL Server 2005 and later.

USE AdventureWorks
GO
DECLARE @StartRow INT
DECLARE @EndRow INT
SET @StartRow = 120
SET @EndRow = 140
SELECT FirstName, LastName, EmailAddress
FROM (
    SELECT PC.FirstName, PC.LastName, PC.EmailAddress,
        ROW_NUMBER() OVER (ORDER BY PC.FirstName, PC.LastName, PC.ContactID) AS RowNumber
    FROM Person.Contact PC) PersonContact
WHERE RowNumber > @StartRow
    AND RowNumber < @EndRow
ORDER BY FirstName, LastName, EmailAddress
GO

The above query uses the ROW_NUMBER() function in a derived table, but you can also use a CTE to accomplish the same thing. Here is the script:

USE AdventureWorks
GO
DECLARE @StartRow INT
DECLARE @EndRow INT
SET @StartRow = 120;
SET @EndRow = 140;
WITH PersonContact AS
(
    SELECT PC.FirstName, PC.LastName, PC.EmailAddress,
        ROW_NUMBER() OVER (ORDER BY PC.FirstName, PC.LastName, PC.ContactID) AS RowNumber
    FROM Person.Contact PC
)
SELECT FirstName, LastName, EmailAddress
FROM PersonContact
WHERE RowNumber > @StartRow
    AND RowNumber < @EndRow
ORDER BY FirstName, LastName, EmailAddress
GO
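
On SQL Server 2012 and later, the OFFSET ... FETCH clause of ORDER BY makes paging much simpler. Here is a sketch that returns the same window of 19 rows (rows 121 through 139):

USE AdventureWorks
GO
DECLARE @StartRow INT = 120
DECLARE @PageSize INT = 19
SELECT PC.FirstName, PC.LastName, PC.EmailAddress
FROM Person.Contact PC
ORDER BY PC.FirstName, PC.LastName, PC.ContactID
OFFSET @StartRow ROWS FETCH NEXT @PageSize ROWS ONLY
GO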


Reference : Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – Using PowerShell and Native Client to run queries in SQL Server

I have been getting my hands dirty with PowerShell commands for a fun learning experience, and a few of them have made it to this blog. The after-effect is that I have been pinged with a few queries about them too. Recently I got a call from one of my friends who wanted to use PowerShell to connect to SQL Server and run some queries. Initially I thought he was talking about SQL PowerShell, but his intention was to connect to SQL using the SQL Server Native Client and run the query. That was strange, but I was ready and up for the challenge. The curiosity was just too much; I couldn't sleep till I found a solution to the challenge he put to me.

I wrote a small sample and sharing here for your benefit. First, we need to create a sample database and a table which would be used by PowerShell.

CREATE DATABASE SQLAuthority
GO
USE SQLAuthority
GO
CREATE TABLE EmployeeMaster (
    EmployeeID INT,
    FirstName VARCHAR(10),
    LastName VARCHAR(10)
)
GO

Now, our task is to use PowerShell to insert data into EmployeeMaster. Here is the step-by-step PowerShell script.

# Build the connection string to the SQL Server instance and the database
$con = New-Object System.Data.SqlClient.SqlConnection("Data Source=.; Initial Catalog=SQLAuthority; Integrated Security=SSPI")
# Define variables and the values to be inserted
$EmployeeID = 1
$FirstName = 'Pinal'
$LastName = 'Dave'
# Open database connection
$con.Open()
#Create SQL Insert Statement for table EmployeeMaster with the values
$stmt_insert = "INSERT INTO [EmployeeMaster]
([EmployeeID],[FirstName],[LastName])
VALUES($EmployeeID ,'$FirstName' ,'$LastName')"
# Create command to execute to SQL connection
$cmd = $con.CreateCommand()
$cmd.CommandText = $stmt_insert
# Invoke the Insert statement
$cmd.ExecuteNonQuery()

Once we execute the above, the data is inserted into the table. Now, to fetch the data, I am going to use the Invoke-Sqlcmd cmdlet:

Add-PSSnapin SqlServerCmdletSnapin100
$DataSet = Invoke-Sqlcmd -Database "SQLAuthority" -Query "SELECT * FROM [EmployeeMaster]" -ServerInstance .
foreach ($element in $DataSet)
{
$element
}
# Close database Connection
$con.Close()

We can run all the statements together in one PowerShell script and use Windows PowerShell ISE to execute it.


You might get a few errors while using Invoke-Sqlcmd, which I have explained here:

SQL SERVER – FIX – The term ‘Invoke-Sqlcmd’ is not recognized as the name of a cmdlet

Hope this gives you a head start on using PowerShell to connect to SQL Server and run queries. This was a cool experiment for me, and I thought it was worth sharing. Have you ever had such requirements? Do share your experience in the comments below.

Reference : Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – Database Snapshots and Replica Causing Write Delays

For folks who have not worked with the SQL Server database snapshot feature, please refer to the blog I wrote some time back (SQL SERVER – 2008 – Introduction to Snapshot Database – Restore From Snapshot). I am giving this reference because some might not be aware of this powerful capability of SQL Server.

In the above article, one of the readers asked whether there is any performance overhead to having multiple database snapshots on SQL Server. Well, database snapshots are built on the principle of COW (Copy On Write): a database page is copied into each snapshot the first time it is modified in the primary database after the snapshot is created. That is the whole idea of database snapshots. Hence, if you have multiple database snapshots configured, each write to the primary DB can indirectly trigger up to "n" more writes, one per snapshot. It is as simple as that.
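
For reference, here is the syntax that creates a snapshot; a minimal sketch assuming the AdventureWorks database with a logical data file named AdventureWorks_Data and a hypothetical path for the sparse file:

-- Create a point-in-time snapshot of AdventureWorks
CREATE DATABASE AdventureWorks_Snapshot
ON (NAME = AdventureWorks_Data,
    FILENAME = 'C:\Temp\AdventureWorks_Snapshot.ss')
AS SNAPSHOT OF AdventureWorks;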

At one of my customer locations, they were seeing heavy performance issues on writes, and when I got an option to inspect, I found close to six database snapshots created on that database. When I asked, they said the snapshots were some sort of backup copies so they could go back to those versions. Being a heavy transactional system, this showed up easily when the load was high.

The immediate question was: how can we identify that far too many snapshots have been created on the databases? I suggested a simple script they could run from time to time:

IF EXISTS (SELECT source_database_id
           FROM sys.databases
           WHERE source_database_id IS NOT NULL
           GROUP BY source_database_id
           HAVING COUNT(*) > 1)
   AND EXISTS (SELECT waiting_tasks_count
               FROM sys.dm_os_wait_stats
               WHERE wait_type = 'REPLICA_WRITES' AND waiting_tasks_count > 0)
BEGIN
    SELECT DB_NAME(source_database_id) AS 'SourceDB',
        COUNT(*) AS 'NumSnapshots'
    FROM sys.databases
    WHERE source_database_id IS NOT NULL
    GROUP BY DB_NAME(source_database_id)
    HAVING COUNT(*) > 1

    SELECT *
    FROM sys.dm_os_wait_stats
    WHERE wait_type = 'REPLICA_WRITES' AND waiting_tasks_count > 0
END

In their environment this showed up clearly: six snapshots on their main transactional database. They were able to delete some of them based on age.

Our database systems are sensitive to such mistakes, which can cause huge performance overhead. The development team thought the system was deteriorating over time and were absorbing this performance overhead without analyzing what the problem could be. I am a firm believer that one should always know the root cause of a problem rather than just solving it superficially.

Do let me know if you use snapshots actively in your environments, and whether you have faced such issues on your SQL Server box. One of the key places where I have seen people use database snapshots is when mirroring is configured and we want to query the mirror server. Are there other places where you have relied on database snapshots? Let me know via your comments.

Reference : Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – Converting Standard SSMS File Results to Delimited – Notes from the Field #085

[Note from Pinal]: This is the 85th episode of the Notes from the Field series. SSMS is my favorite IDE. I work with many different databases and IDEs, but I am most comfortable working in SSMS. That I like SSMS does not mean it is the best tool for every task I want to accomplish. Here is one such task – converting file results to comma-delimited results. My friend Kevin came to the rescue and helped me out in this situation.

Let us read how we can achieve this task of converting standard SSMS file results to a delimited result set.



Now and again, clients send me files to load into the database that they’ve generated using the Results To File feature in SQL Server Management Studio (SSMS). If you’re unfamiliar with this feature, let me explain it. As shown in Figure 1, after right-clicking in a query window in SSMS, you may select the Results to File option from the context menu. With this option enabled, the next time the query is run, a dialog box will appear allowing you to choose the generated file’s location.


Figure 1 – An easy way to send the results of a query in SSMS to a file.

By default, the file that's emitted will be column-aligned, otherwise known as fixed-width columns. The first row will contain the column names, and a second row will have groups of hyphen characters acting as a sort of text-based underscoring. At the end of the file, a count of the emitted rows will also be included. All of these features can be seen in Figure 2, where I've hidden the middle rows so you can see the beginning and the end of the file.


Figure 2 – A standard, column-aligned result file with trailing row count.

When my clients ask me to load these types of files using SQL Server Integration Services (SSIS), there are a few problems to overcome. First of all, fixed-width data is not simple to load. The column specification of the flat-file source has to be updated manually to include the exact width of each column which can be time-consuming and prone to errors. Secondly, the row count at the end of the file isn’t data that can be loaded so it will cause an exception or load incorrectly, forcing us to add special handling logic in the package for that. Lastly, the literal word NULL appearing throughout the file whenever there is a missing value doesn’t really convey the absence of data. Those will also have to be handled with custom code.

To address the first problem, I sometimes ask my clients to re-run their queries setting the output format to comma or tab delimited. In SSMS, this can be done by right-clicking in the query window, selecting the Query Options… item from the context menu and choosing a different Output format in the Results / Text section of the dialog box. The second problem can be solved by adding the SET NOCOUNT ON directive to the start of the query that generated the file. I may also ask my clients to include that change. The third problem, where SSMS emits literal NULL strings whenever values are missing, can also be handled with the use of ISNULL or COALESCE to do NULL detection in the original query. This is yet one more change I must ask my clients to make.
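
To make that concrete, here is a sketch of what such a generating query might look like, with hypothetical table and column names. SET NOCOUNT ON suppresses the trailing row count, and ISNULL replaces missing values before SSMS ever writes the file:

SET NOCOUNT ON;  -- no trailing "(n row(s) affected)" line in the output file
SELECT CustomerID,
    ISNULL(MiddleName, '') AS MiddleName,                          -- blank instead of NULL
    ISNULL(CONVERT(VARCHAR(10), BirthDate, 120), '') AS BirthDate  -- same for dates
FROM dbo.Customer;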

As an experienced consultant, I understand that asking my clients to make so many modifications to their workflow is often not worth their time and trouble. That’s what they hire me for. It’s best to find ways to make whatever my clients provide work for them, especially if it can be automated. To that end, I wrote a C# function called ConvertColumnAlignedFileToDelimitedFile that helps to overcome all of these problems.


// Requires: using System; using System.IO; using System.Text;
public static int ConvertColumnAlignedFileToDelimitedFile(
    string sourcePath, string targetPath, string delimiter = "\t")
{
    int lineNdx = 0;
    using (StreamWriter writer = File.CreateText(targetPath))
    {
        string header = null;
        int[] columnWidths = null;
        foreach (string line in File.ReadLines(sourcePath, Encoding.UTF8))
        {
            if (lineNdx == 0)
                header = line; // save the header for subsequent processing
            else if (lineNdx == 1)
                columnWidths = ProcessSeparatorRow(line, header,
                    delimiter, writer);
            else
            {
                // stop processing on an empty line
                if (line.Length == 0) break;
                ProcessDataRow(line, columnWidths, delimiter, writer);
            }
            lineNdx++;
        }
    }
    return lineNdx - 2; // subtract the header and separator rows
}

Figure 3 – A function that converts a column-aligned file to a delimited file.

The function takes the name of a source file, a target file and the delimiter that will be inserted between values. There’s no magic here but there is an interesting trick that takes advantage of some metadata lurking in the column-aligned output file. Look at Figure 2 again. That second row containing hyphen characters actually uncovers a key constraint that we otherwise wouldn’t know: the maximum length of each column. Each block of hyphens is separated by a space so if we count the length of each hyphen group, we’ll know how to parse the entire file. I’ve provided a helper function called ProcessSeparatorRow that reads the metadata buried in the hyphen groups, writes out the header row and returns the column widths to the calling function.


private static int[] ProcessSeparatorRow(string line,
    string header, string delimiter, StreamWriter writer)
{
    string[] columns = line.Split(' ');
    int[] columnWidths = new int[columns.Length];
    for (int columnNdx = 0, startPosition = 0;
        columnNdx < columnWidths.Length; columnNdx++)
    {
        columnWidths[columnNdx] = columns[columnNdx].Length;
        // clip the final column name in case the header row is shorter
        int length =
            (startPosition + columnWidths[columnNdx] <= header.Length)
            ? columnWidths[columnNdx]
            : header.Length - startPosition;
        string name = header.Substring(startPosition, length).Trim();
        startPosition += columnWidths[columnNdx] + 1; // skip the separating space
        writer.Write(name);
        writer.Write((columnNdx < columnWidths.Length - 1)
            ? delimiter : Environment.NewLine);
    }
    return columnWidths;
}

Figure 4 – A function that processes the metadata in the separator row.

For the data rows beyond the header and the separator, when an empty line is encountered, the processing just stops. That will handle the trailing row count problem discussed earlier. Writing out the non-empty rows in the remainder of the file is straightforward with the helper function called ProcessDataRow shown in Figure 5.


private static void ProcessDataRow(string line,
    int[] columnWidths, string delimiter, StreamWriter writer)
{
    for (int columnNdx = 0, startPosition = 0;
        columnNdx < columnWidths.Length; columnNdx++)
    {
        // clip the final value in case the line is shorter than the full width
        int length =
            (startPosition + columnWidths[columnNdx] <= line.Length)
            ? columnWidths[columnNdx]
            : line.Length - startPosition;
        string value = line.Substring(startPosition, length).Trim();
        // convert literal NULL markers into empty strings
        if (value.Equals("NULL", StringComparison.InvariantCultureIgnoreCase))
            value = String.Empty;
        startPosition += columnWidths[columnNdx] + 1;
        writer.Write(value);
        writer.Write((columnNdx < columnWidths.Length - 1)
            ? delimiter : Environment.NewLine);
    }
}

Figure 5 – Process a normal row of data by writing it out using the specified delimiter.

Running the conversion function with the output.rpt sample file shown in Figure 2 produces the tab-delimited file shown in Figure 6. The tab characters appear as orange-colored arrows in the output. Notice also that wherever the word NULL appeared in the original file, it has been converted to an empty string in the output. A couple of lines of C# code in the ProcessDataRow function handled that quite nicely.


Figure 6 – The converted file with tab delimiters and detected NULLs.

In closing, let me say that while the C# code shown in this article is somewhat stand-alone, it can easily be put into a Script Task or a Script Component in SSIS. I often use code like this within SSIS to do file preparation or validation before loading the data into staging tables. However, having a command-line utility to convert SSMS results files into delimited files with NULL detection is also quite handy. I’ll even admit that I’ve invoked such a utility as a job step in SQL Agent from time to time. Remember: the job of a good consultant is to make the difficult seem both easy and repeatable. Hopefully, you’ll find this code useful in a variety of situations, too.

If you want to get started with SQL Server with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Stress free life with Spotlight for SQL Server DBA

Staying on the cutting edge of technology is something we need to be prepared for as IT professionals. New capabilities get added to the software we work with almost every single day, and it is difficult to keep track of them all. I am a firm believer that we need to innovate and keep learning every single day. That is the best way to stay competitive and make our lives easier.

I get opportunities to speak at multiple conferences, with presentations ranging from short tips to full-fledged sessions lasting hours. More than the presentation time, it is the process of preparing and adding a new concept to every session that I cherish. Most of my writing is based on what people ask me from time to time.

A recent conversation with a junior DBA taught me something interesting that I thought was worth sharing. The DBA had emailed me a screenshot of the error he was getting.


He was complaining that he was not able to take a transaction log backup because it kept encountering an error. I said that most of the time it is important to read the error message carefully, because in the recent past I have seen error messages that give us the solution too. These errors are simple, and a little search on MSDN can give the answer. Having said that, I did talk about other things around the error. To get started, I would like to list all the backup types we can take with SQL Server:

  • Full backups
  • Differential backups
  • File backups
  • Filegroup backups
  • Partial backups
  • Copy-Only backups
  • Mirror backups
  • Transaction log backups

We are not going to look at each of the types in this blog, but this was important to mention all of them for a future reference.

Find when the last backup was done

I wrote back to the DBA stating there was a bigger problem at hand. I always recommend that, as a seasoned DBA, you need to know when the last backup was taken on the server. This becomes essential when you need to restore in case of a disaster:

SELECT sdb.name AS DatabaseName,
    COALESCE(CONVERT(VARCHAR(12), MAX(bus.backup_finish_date), 101), '-') AS LastBackUpTime
FROM sys.sysdatabases sdb
LEFT OUTER JOIN msdb.dbo.backupset bus
    ON bus.database_name = sdb.name
GROUP BY sdb.name

As a contingency plan, we always need to be aware of when the last backup was taken. It is critical that, as a DBA, you monitor all critical production databases from time to time.

The above error occurs because, even in the FULL recovery model, a transaction log backup cannot be taken before the first FULL backup exists. To mitigate this, we need to take a FULL backup before the log backups can be initiated. The concept is simple, right?
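
A minimal sketch of establishing the log chain, assuming a database named SQLAuthority and a hypothetical local backup folder:

-- The first FULL backup starts the log backup chain
BACKUP DATABASE SQLAuthority
TO DISK = 'D:\Backup\SQLAuthority_Full.bak';
GO
-- Only after that will transaction log backups succeed
BACKUP LOG SQLAuthority
TO DISK = 'D:\Backup\SQLAuthority_Log.trn';
GO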

SQL Server Dashboard with Spotlight

Ask a naval officer how a command center looks and how critical it is to them: a better view of the battlefield and the strategies for the field are always decided in the command center. In a similar way, a DBA can use the Spotlight dashboard to get a high-level understanding of what is happening on his servers.


In our example, you can see that our servers have not had a backup taken to date. If a FULL backup has not been taken for our databases, this alert is raised. The same information is shown in visual form in the tiles view.

If you ever get a chance to play around with the Spotlight dashboard, I would highly recommend watching the out-of-the-box warnings it raises and taking corrective action immediately.

Note: A database, even in the FULL recovery model, will act like the SIMPLE recovery model until the first FULL backup is taken.

So this error/behavior can get even a seasoned DBA into trouble. The dashboard is a big time-saver and can save your job by catching what would otherwise go unnoticed. If you want to lead a stress-free DBA life, such early warnings can save you big time.

Final Words

As we wrap up: sometimes we miss some of the basic principles that we assume the system takes care of. Having something that reminds us of these basics can get us out of trouble in many ways. Most of the tools we work with give us early warnings and recommendations; it is up to the DBA to look at them and take the necessary actions. If you would like to try out these tools, you can use the SQL Server Evaluation Edition to do so.

Click here for a free 30 day trial of Spotlight on SQL Server

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – SPARSE Property Doesn’t Carry Forward Using SELECT INTO

In a recent post on the SQLBangalore UG Facebook page, one of the members asked an interestingly simple question: why is it that when you use SELECT INTO to create a new table, the triggers are not carried forward as part of the definition? A number of folks answered, and it was a great learning experience for me personally. This blog was inspired by that post, and I thought I would add a different spin to it.

Though the basics about triggers were covered on the FB page, let me ask you a simple question: while SELECT INTO brings forward most of the column attributes during table creation, it is not always true that every part of the definition moves forward.

One example I found recently is columns defined as SPARSE. Since I have been writing a lot of blogs around SPARSE columns, one person asked whether that property is carried forward when doing a SELECT INTO. The best way to learn is by testing in our own environment; here is the script I used:

-- Create a table and insert a row
CREATE TABLE tbl_host (Age SMALLINT SPARSE, Name VARCHAR(25));
INSERT INTO tbl_host VALUES (24, 'SQLAuthority');
-- Create a new table from the original
SELECT * INTO tbl_Copy FROM tbl_host;
-- See if the sparse property was carried over
SELECT OBJECT_NAME(object_id) AS tabname,
    name AS colname,
    column_id,
    is_sparse
FROM sys.columns
WHERE object_id = OBJECT_ID('tbl_host') OR object_id = OBJECT_ID('tbl_Copy');
-- Clean up time
DROP TABLE tbl_host
DROP TABLE tbl_Copy

Running the script shows is_sparse = 1 for the Age column of tbl_host, but 0 for the same column in tbl_Copy.


From this it is evident that the SPARSE property is not carried forward when we use SELECT INTO to create a new table. If you need the property on the copy, you can reinstate it afterwards, as shown below. I would love to learn from you folks about other things that don't carry forward when we use SELECT INTO. Tell me about the scenarios where you were surprised and learned something new while using SELECT INTO. Use the comments section to share your knowledge.
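
A minimal sketch of reinstating the property on the copy created above (the column must be nullable, which Age already is):

-- Re-apply the SPARSE property that SELECT INTO dropped
ALTER TABLE tbl_Copy ALTER COLUMN Age ADD SPARSE;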

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Startup Problem – FIX – Error 5118, Severity 16, State 1

I have been noticing that the error messages SQL Server throws in recent versions are mostly self-explanatory. If we read the English carefully, half the problem is solved, and that makes troubleshooting easier and quicker for an administrator. This blog was inspired by one such error, which one of my readers found difficult to understand because he was just hired out of college.

My blog reader contacted me via the comments and said he was not able to start the SQL Server service. Without any delay, I asked him to share the error message. He said he was starting it from Services.msc and getting this error:

—————————
Services
—————————
Windows could not start the SQL Server (SQL2014) on Local Computer. For more information, review the System Event Log. If this is a non-Microsoft service, contact the service vendor, and refer to service-specific error code 3417.
—————————
OK
—————————

I told him this error alone does not help much, and asked him to share the SQL Server ERRORLOG. Since he was not a SQL person, he was not aware of that file, so I shared the blog below:

SQL SERVER – Where is ERRORLOG? Various Ways to Find its Location

When I looked into the ERRORLOG, I found the below messages.

2015-06-10 20:54:44.00 spid7s      Starting up database 'master'.
2015-06-10 20:54:44.00 spid7s      Error: 5118, Severity: 16, State: 1.
2015-06-10 20:54:44.00 spid7s      The file "C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA\master.mdf" is compressed but does not reside in a read-only database or filegroup. The file must be decompressed.

I asked the question – did you make any changes to the file master.mdf? He said NO, so I asked him to share a screenshot of the file in Windows Explorer.


The blue color of the file name indicates that the file is compressed, and that is precisely the error we saw in the ERRORLOG. I asked him to uncompress the file (right-click the file, choose Properties, click Advanced, and clear the compression check box).


After that, he was able to start the SQL Server service. I asked him what changes had been made, and he told me that due to disk space issues the complete C:\ drive had been compressed.
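
If many files are affected, the folder Properties dialog becomes tedious. As a sketch (verify the data path for your instance), the built-in compact.exe utility can decompress a whole data folder from an elevated command prompt:

compact /u /s:"C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA"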

Have you ever enabled compression on folders that hold SQL Server database files? Never do that! Have you ever encountered these errors? Let me know the reasons you did so and how you faced this issue.

Reference: Pinal Dave (http://blog.sqlauthority.com)