SQL SERVER – System.IO.FileNotFoundException: Could not load file or assembly

During a recent R&D exercise, I was trying to use a SQLCLR assembly as part of an installation process. It was not easy, and it got me into a mess that I thought was worth debugging. Deploying the .NET assembly kept failing with an error, and my first suspicion was that the third-party .msi and the .dlls shipped with it had gone wrong. I wanted to understand what was actually going on here. The error reported in the stack was:

Error: System.IO.FileNotFoundException: Could not load file or assembly ‘Microsoft.SqlServer.BatchParserClient, Version=12.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91’ or one of its dependencies. The system cannot find the file specified.
File name: ‘Microsoft.SqlServer.BatchParserClient, Version=12.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91’
at System.Reflection.RuntimeAssembly._nLoad(AssemblyName fileName, String codeBase, Evidence assemblySecurity, RuntimeAssembly locationHint, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean throwOnFileNotFound, Boolean forIntrospection, Boolean suppressSecurityChecks)
at System.Reflection.RuntimeAssembly.InternalLoadAssemblyName(AssemblyName assemblyRef, Evidence assemblySecurity, RuntimeAssembly reqAssembly, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean throwOnFileNotFound, Boolean forIntrospection, Boolean suppressSecurityChecks)
at System.Reflection.RuntimeAssembly.InternalLoad(String assemblyString, Evidence assemblySecurity, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean forIntrospection)
at System.Reflection.RuntimeAssembly.InternalLoad(String assemblyString, Evidence assemblySecurity, StackCrawlMark& stackMark, Boolean forIntrospection)
at System.Reflection.Assembly.Load(String assemblyString)
at Microsoft.SqlServer.Management.Common.ServerConnection.GetStatements(String query, ExecutionTypes executionType, Int32& statementsToReverse)
at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(String sqlCommand, ExecutionTypes executionType)

Whenever there are such errors, we need to look at a few things.

  1. What is the error?
  2. What is the assembly name?
  3. What is the version of the assembly?

Here is what we have in the error message, matching the three questions above:

  1. IO.FileNotFoundException: Could not load file or assembly
  2. SqlServer.BatchParserClient
  3. Version=12.0.0.0

The assemblies are stored in the C:\Windows\Assembly folder, and they are shown below.

(Screenshot: Microsoft.SqlServer.BatchParserClient in the C:\Windows\Assembly list, showing Version and Processor Architecture)

Two things should be noted in the screenshot: Version and Processor Architecture. The screenshot is NOT from the machine where we received the error; on the problem machine, the needed assembly was missing.

I downloaded and installed “Microsoft® SQL Server® 2014 Shared Management Objects” (SMO) from https://www.microsoft.com/en-us/download/details.aspx?id=42295, and after that the assembly was present and the code worked. Note that SQL Server Management Objects require Microsoft SQL Server System CLR Types, which is available on the same download page.

I am sure most of us rarely go down this path of exploration, but this was a great learning experience for me. Do let me know via the comments if you have debugged anything along these lines in your environment.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – FIX: Msg 7395, Level 16, State 2 – Unable to start a nested transaction for OLE DB provider

While playing with a linked server from one SQL Server to another, I faced a few errors and fixed them as well. It has almost become a custom to run into errors every time I work with linked servers. I thought it was worth blogging about because I did not find a single blog post explaining both the cause and the solution in detail.

Here is the query I was running on SMALLSERVER, which has a linked server to BIGSERVER. It uses four-part naming in the format ServerName.DatabaseName.SchemaName.ObjectName. Basically, the query updates a value in the table dbo.Employee in the database SQLAuthority on server BIGSERVER.

BEGIN DISTRIBUTED TRANSACTION
UPDATE
BIGSERVER.SQLAuthority.dbo.Employee
SET SALARY = 'Y'
WHERE ID = 5
COMMIT TRANSACTION

It failed with the error message below.

OLE DB provider “SQLNCLI11” for linked server “BIGSERVER” returned message “The transaction manager has disabled its support for remote/network transactions.”.
Msg 7391, Level 16, State 2, Line 2
The operation could not be performed because OLE DB provider “SQLNCLI11” for linked server “BIGSERVER” was unable to begin a distributed transaction.

Here is the important piece of information in the error message: “The transaction manager has disabled its support for remote/network transactions”, which basically indicates an issue with the MSDTC settings. To fix that, we need to go to Start > Run > dcomcnfg and open the interface shown below (this is a Windows Server 2012 R2 machine).

(Screenshot: the dcomcnfg / Component Services console)

You can also launch the same interface from “Control Panel” > “Administrative Tools” > “Component Services”. Follow the screenshot: open “Properties”, go to the Security tab, and change the settings as shown below.

(Screenshot: DTC Properties, Security tab, with the changed settings)

Once the settings were changed, I restarted the DTC service. Then I got the error below.

OLE DB provider “SQLNCLI11” for linked server “BIGSERVER” returned message “Cannot start more transactions on this session.”.
Msg 7395, Level 16, State 2, Line 3
Unable to start a nested transaction for OLE DB provider “SQLNCLI11” for linked server “BIGSERVER”. A nested transaction was required because the XACT_ABORT option was set to OFF.

The above error also tells us the action needed: we need to enable XACT_ABORT for the transaction. Here is the modified version of the query:

SET XACT_ABORT ON
GO
BEGIN DISTRIBUTED TRANSACTION
UPDATE
BIGSERVER.SQLAUTHORITY.DBO.EMPLOYEE
SET SALARY = 'Y'
WHERE ID = 5
COMMIT TRANSACTION

And that worked for me; it updated the value on the remote server.

Have you also encountered errors while using linked servers? Share the problem and solution via the comments section.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Working with Event Viewer and PowerShell

Sometimes I go on simple explorations that teach me something new and get me off the routine. SQL Server is always on the agenda, and that passion will never die. This exploration started when I was talking to someone about the super cool stuff people work on: a DBA complained to me that he was working on a “Windows Server Core” machine and had no UI. I was pleasantly taken aback and went back to search the internet about it. I also learned that SQL Server can be installed on these machines. It was fascinating to hear, and I wondered how someone could work on such machines.

I thought, let me see if we can use some scripting for things I take for granted when troubleshooting error messages. I always open the Windows Event Viewer to look through the errors. It would be great to be able to query those logs without opening Event Viewer at all. That was surely a challenge, and I wanted to learn something new and interesting to share.

This blog is, for the moment, a rudimentary shot at going the scripting route using PowerShell. Here are some of the commands I used, progressing from a few simple commands to the slightly more interesting, complex queries I figured out along the way:

1) List the event logs available on a given system. This is the starting point because I need to know what can be queried at any point in time.

Get-EventLog -List

(Screenshot: output of Get-EventLog -List)

As you can see, there are some 27k+ messages here, and this is a great place to start our query.

2) Next, let us select the newest 50 messages from the Application log:

Get-EventLog -LogName Application -newest 50

Next, I wondered whether we could filter the data based on a date value. That led me to the next query.

3) Gather the logs after a particular date; we can also use -Before to select the messages prior to a given date, as shown below:

Get-EventLog -LogName Application -after 1/10/2016

I wanted to narrow the query by searching on a specific entry type and event source. Since I work with SQL Server, the source had to be MSSQLSERVER (the default instance name).

4) Select only the messages logged as “Information” for a source like “MSSQLSERVER”, and pipe the output to clip to copy it to the clipboard:

Get-EventLog -logname application -EntryType information -newest 50 -source *MSSQLSERVER* | clip

When writing the above query, I was a little clueless about which source names are valid, so I wrote a query to identify them.

5) Find the relevant source to be used in a query:

Get-EventLog -LogName "Application" | Select-Object Source -Unique

(Screenshot: unique event sources in the Application log)

As you can see, if we know how to play around with PowerShell and know the query we need, we can always find a nice and easy way to get the data. How many SQL DBAs reading this blog have already tried playing around with PowerShell? What have you been doing with it? Please let me know via the comments below.

Reference: Pinal Dave (http://blog.sqlauthority.com)

Interview Question of the Week #053 – What is the Difference Between Deterministic Functions and Nondeterministic Functions?

Some questions are so theoretical that I believe knowing the answer does not, by itself, add much value. Here is one such question, and I am very confident you will agree with my point of view.

Question: What is the Difference Between Deterministic Functions and Nondeterministic Functions?

Answer: 

Deterministic functions always return the same result every time they are executed with the same input values, e.g. ABS, DATEDIFF, ISNULL.

Nondeterministic functions may return different results each time they are executed, even with the same input, e.g. NEWID, RAND, @@CPU_BUSY. Functions that call extended stored procedures are nondeterministic, and user-defined functions that create side effects on the database are not recommended.
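To see the difference in action, here is a minimal sketch; run the batch a couple of times and notice that the first SELECT returns the same values on every execution, while the second returns different values each time:

-- Deterministic: the same input always produces the same output
SELECT ABS(-10) AS AbsValue,
       DATEDIFF(DAY, '2016-01-01', '2016-01-10') AS DaysBetween;
-- Nondeterministic: run the batch again and these values change
SELECT NEWID() AS NewGuid,
       RAND() AS RandomNumber,
       @@CPU_BUSY AS CpuBusyTicks;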

Now that you have read the answer, I have a question back for you.

Did you know the difference between deterministic and nondeterministic functions before this blog? If not, has it ever impacted your work in your daily job?

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – FIX: Msg 3169, Level 16, State 1 – The database was backed up on a server running version. That version is incompatible with this server.

At this rate, I think I am going to exhaust the whole set of error messages available inside SQL Server. I am glad I keep getting into unique situations and resolving them too; this has become more of a diary of the error messages I run into. I was playing with my demo database on SQL Server 2016. Once done, I started a restore from a previous copy, but it failed with the error below:

Msg 3169, Level 16, State 1, Line 1
The database was backed up on a server running version 13.00.0801. That version is incompatible with this server, which is running version 12.00.4213. Either restore the database on a server that supports the backup, or use a backup that is compatible with this server.
Msg 3013, Level 16, State 1, Line 1
RESTORE DATABASE is terminating abnormally.

Then I realized that I had taken the backup on version 13.00.0801, which is SQL Server 2016, and I was trying to restore it on 12.00.4213, which is SQL Server 2014. The error message shows both versions, source and destination.
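If you are not sure which version a backup came from, you can check both sides before attempting the restore. Here is a minimal sketch; the backup path is a placeholder, and reading the header of a newer backup on an older instance should normally work even though the restore itself will not:

-- Version of the instance you are restoring on
SELECT SERVERPROPERTY('ProductVersion') AS TargetVersion;
-- Version of the instance that produced the backup: check the
-- SoftwareVersionMajor / SoftwareVersionMinor / SoftwareVersionBuild columns
RESTORE HEADERONLY FROM DISK = N'D:\Backup\SQLAuthority.bak';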

So it is clear that there is no direct way to downgrade from a higher version to a lower version; it will always fail with this error message. Why? When an instance of SQL Server is upgraded, not only do the binaries for the database engine change, the internal schema level of the databases changes as well. So it is not possible to attach or restore a database with a higher schema level to an instance running a lower version of SQL Server.

What can be done? One possible approach is to manually export and import the data. You can follow the steps below:

Create empty database

  • In Object Explorer, expand Databases, right-click the database, point to Tasks, and then click Generate Scripts. Follow the steps in the wizard to script the database objects.
  • On the Choose Objects page, select “Script entire database and database objects”.
  • On the Set Scripting Options page, select “Save scripts to a specific location”.
  • Select the Advanced button; under “Types of data to script” select “Schema Only”, and under “Script for server version” select “SQL Server 2008” (or the appropriate version).
  • Click Next and Finish to complete the process.
  • Now you can use the generated script file and execute it against the SQL Server 2008 instance to create the complete database schema without data.

Move the data

To move the data, you have a couple of options but all of these are a bit tricky.

  1. Script out the data exactly the way described in the steps above, just selecting “Data Only” under “Types of data to script”; or
  2. Script out the data for every object individually, one by one, instead of the complete database, if you have a large database; or
  3. Use BCP to export the data into files and then use BCP again to import the data back into the tables on the new server.

In short, restoring a backup from a higher version onto a lower version is not possible. The only option we have is to move the data using the approaches described above. Do you know any more ways?

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – SSMA Error: System.Data.OracleClient requires Oracle client software version 8.1.7 or greater

I work mostly on SQL Server and rarely get a chance to work with Oracle. When error messages reach my inbox, I get intrigued about why they occur, and this post is the outcome of one such exploration. When I first saw this error message, I said I had no clue why it was happening and asked whether it was related to SQL Server at all.

The gentleman wrote back stating that it was coming from a SQL Server tool, since they were doing a migration using the SQL Server Migration Assistant (SSMA). That statement got me interested because the error was surfacing during a SQL Server migration.

I went ahead and asked about the environment to get a better understanding. I asked which Oracle client version they were running. They responded that although they were running Oracle Client 9.2.0.8, they were strangely getting an “Oracle client software version 8.1.7 or greater” error.

The following message was displayed on the console:

Migrating data…
Analyzing metadata…
Preparing table SYSTBL.CMN_TBL_CUST_…
Preparing data migration package…
Starting data migration agent…
Starting data migration…
Exception during processing: System.Data.OracleClient requires Oracle client software version 8.1.7 or greater.
Data migration complete.

0 table(s) successfully migrated.
0 table(s) partially migrated.
1 table(s) failed to migrate.

After going through the complete problem statement, I saw that it was working on the local server but failing when it was called from a remote server.

Reason After Investigation

The Oracle client provider was not present on the remote SQL Server. When working with remote machines, the Oracle client provider must be installed on the remote SQL Server as well.

I replied with a mail: install the Oracle client provider on the remote SQL Server and verify the data migration, as SSMA has no limitation when working with a remote SQL Server. As part of the installation, I also asked them to check whether the extension packs were installed on the remote servers.

After this exercise, I felt there was some good learning for me too, worth sharing here. I am sure many of you are much more experienced with SSMA, so feel free to let me know if I missed anything.

Reference: Pinal Dave (http://blog.sqlauthority.com)

It’s a New Year, Take Advantage of It – Notes from the Field #110

[Note from Pinal]: This is the first episode of Notes from the Field series in the year 2016 and 110th overall.

There is one place where I see the biggest crowd in the new year: the fitness center (gym). It is indeed a good thing that we all want to stay fit and active. However, reality unveils itself from the second week, when the gyms are empty once again. We all start strong, but only a few finish strong. Earlier last year, when I was taking advice from Mike about staying healthy, I narrated this scenario to him. I was expecting Mike to say that he understood my feelings and that we would soon start talking about something related to SQL; in reality, that did not happen. When I mentioned this to Mike, he said it is indeed a good thing and suggested we take advantage of that initial enthusiasm to set a strong base for a healthy lifestyle in the future. Mike made a great point. We kept talking about various ways to stay healthy, and when we ended our conversation, just as expected, we talked about SQL. Just like health, Mike had a wonderful insight about databases.
(Photo: Mike Walsh)

In this episode of the Notes from the Field series, database expert Mike Walsh presents his thoughts about why we should take advantage of the opportunity a new year presents and build a stronger base for healthy databases. Trust me, you want to read it!


Us technologists, we are good at what we are good at. We excel at troubleshooting, solving complex challenges and rising to the occasion. We keep our systems up, we rush in to save the day (especially the DBAs). We ace tests, we love digging in and tuning queries. We’re good at our jobs for the most part.

Sometimes, though, some of us aren’t so great at getting things done. Maybe you suffer from that problem? I know I do. I am quick to pick up the next thing before the current thing is done. I am good at putting things on my to-do lists (in fact I actually have a lot of to-do lists in different forms in different places) but I’m not so good at checking things off of the to-do lists.

I get the urgent things done, but the important and not-so-urgent things? They can take a back seat. Are you like that? It seems that many of us technologists are. Not all, of course, but we live distraction-filled, device-driven, “NOW!”-controlled lives lately.

This post is a departure from the great guest feature Pinal allows us at Linchpin People to offer. Normally we are answering a technical question, helping point you in the direction of an answer or a better way of doing something. Today, I want to slide back to Professional Development.

It’s a New Year, Take Advantage of It

Worried? Stop and do something about it :-)

On the Gregorian Calendar, this week signifies the start of a new year. You often will see New Year’s resolutions come out this time of year. People say this is finally the year they’ll start eating better, quit a habit, get in shape or do something they’ve been meaning to do.

I’m not asking you to make a resolution. But I am wondering if we can agree that there are things we all meant to do at work last year but never did. Are there services we never took care of? Improvements we left unchecked? If there are, and if you are like me there probably are, make a plan to do something about them. Not a plan that never gets touched, like my to-do lists all over the place. Start today, be persistent, and let’s get it done.

This is a new year, and we have a reason and a push to do something different for a change. Let’s take advantage of it.


Some Ideas

There are a lot of categories we could look at here.

Learning –

Yes, your own growth is important, and it is something we end up neglecting if it isn’t part of your official job description and your employer/manager doesn’t force you along on that path. SQL Server 2016 comes out this year. Here’s an action plan for you: go to this site (https://www.microsoft.com/en-us/server-cloud/products/sql-server-2016/), learn about what’s coming, download a Community Technology Preview, play with some of the features and get familiar.

Maintenance –

Especially if you are a DBA – how is your environment? When was the last time you gave it a health check? You don’t have to reach out to Linchpin People to have us do a WellDBA Exam – though we’d be happy to help you with that. Look at the free tools out there and spend time with your key SQL Servers – look at how they are configured and running. Are things good here? What can be done differently? What maintenance is missing? Make a list and start knocking things off as you get to them.

Plan Ahead –

How old are your servers? When’s the last time you had that conversation about data archival? What are the business’ plans this year, and can your systems handle those plans? Have a conversation with the technical teams and make sure you are ready for 2016 and beyond.

What Else?

I don’t know about you, but when I read a post like this or hear people talk about procrastination or missed to-do items, I get this feeling inside: a sort of angry, panicky voice that says “Oh yeah!! I have to do ______”. Are you getting that voice about anything? I am getting it just typing this. So what I’m going to do is stop writing this post and go get those things done. You should stop reading and start getting those things done, or delegate them, or realize they weren’t important and stop worrying about them. Happy 2016!

If you want to get started with performance analytics and triage of virtualized SQL Servers with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Monitoring and Troubleshooting SQL Server Got Easy with Diagnostics Tool

Blogging has never been an easy task. Close to 9 years of non-stop learning and sharing have got me into a rhythm where I rarely miss a single day without writing a post. Over these years, one topic area gets the most hits by far, and it is invariably performance and troubleshooting. Though these topics are well proven and have matured over the years, people are still searching for them.

This access pattern always baffles me, and I wondered what could be done to answer such questions from budding new-age DBAs. At the recent SQLPass conference I met a number of them who came to me asking how to become an effective DBA. I generally try to understand their patterns and look for solutions. One of the DBAs told me he was under constant pressure because his SQL Server would run fine at the start of the week, yet end users complained that the servers became unresponsive or the performance unacceptable as the week went on. He had no clue why this behavior was happening. He wanted some help, and I was not sure what to suggest.

Getting started with SQL Diagnostic Manager 10.0

I was recently playing with a number of tools in the market and bumped into IDERA’s SQL Diagnostic Manager. They were releasing a new version and I wanted to see how it was different. The first thing that struck me was the ability for DBAs and developers to create automated baselines. This took me by surprise because that exact requirement was fresh on my mind from the recent trip.

After installation, open the properties page of the server and go to the “Baseline Configuration” tab. This gives you the ability to set automated baselines through the week. In the case below, I have created them through the week from 8 AM to 6 PM, which is exactly the timeframe when production is at its maximum stress level.

(Screenshot: the Baseline Configuration tab)

What to monitor?

The next phase is to decide what we are going to monitor. The “Analysis Configuration” section comes with pre-set configurations. In the example below, I have used “Online Transaction Processing” and selected some of the readily available “Categories”. I personally felt such jump-start configurations can help and guide new DBAs who are clueless about what is happening in the system.

(Screenshot: Analysis Configuration with the Online Transaction Processing categories)

What I would have loved to see is a few tooltips while selecting each of the categories, as that would make the selection easier. Having said that, it was pretty easy for me to make the selections since they were almost self-explanatory.

Under “Advanced settings”, I found further interesting options. I could see some of the rules that run by default or can be excluded from analysis. Some of these rules are pretty advanced and interesting because they can easily be missed in a normal run. Here, some of the tempdb rules caught my eye, and I have shown them below for reference.

(Screenshot: Advanced settings showing some of the tempdb rules)

If that was not enough, in “Filter Settings” we can remove unnecessary databases or add multiple databases of our choice. This is powerful because on consolidated servers we don’t need to analyze every small database; we can focus on the databases of interest for a given application.

(Screenshot: Filter Settings)

I just wish there were a way to customize some of the rules rather than having them fixed. Still, the hundreds of rules already available are a pretty exhaustive starting point, and I am sure advanced users will piggyback on some of them. Cases like the number of VLFs created differ between versions and are sometimes obsolete in newer versions.

Web is the new SSMS

Though the desktop version is powerful and has tons of reports by default, I think the web interface has improved by leaps and bounds in this release of SQL Diagnostic Manager. The web interface to the monitoring server can reveal insights remotely, with no installation required. The initial 220+ MB install and configuration can sometimes be daunting; once configured, we can use the friendly web interface to visualize what is happening on the server.

Below is a typical workload view where you can see that my server suddenly peaked in the late evening when a batch service kicked in. This is typically how people would check their server’s vital stats remotely to see if something is going wrong.

(Screenshot: workload view in the web interface showing the late-evening peak)

When such anomalies are detected, we can drill into the “Top N” queries to see what is going wrong: which databases are causing the heavy workload, and which queries are consuming lots of reads/writes in the system. I felt this snapshot was very useful and made a lot of sense.

(Screenshot: Top N queries view)

Attention to detail, like “when the last autogrowth happened”, is something I felt was quite useful. From the above, I can also see that there is a lot of tempdb activity in the system, which helps me understand how the developers have written their code and where I need to optimize.

Self-Service the new silver-bullet

Though I was planning to wrap up this blog, I found a unique customization feature that caught my eye and needed a mention here. You can build your very own dashboards using the metrics that have already been collected. The usual performance counters are available from the web version, and you can build your own dashboard with configurable widgets. As you can see, I am keeping my dashboard simple, but these can show the health of multiple instances across the network, depending on the configuration.

(Screenshot: custom dashboard built from the collected metrics)

This ability to have a central web view of all the servers a DBA needs to monitor is critical in large installations, where DBAs need to watch at least 8-10 servers at any moment.

(Screenshot: central web view of multiple monitored servers)

Above is a classic example of how I created my own SQLAuthority dashboard based on the servers that are of interest to me. I am sure your views are going to be more complex and complete.

As I conclude, I will let you try the analysis tool, which gives ready solutions to some of the most common problems; many more options are being added. Since I started with baselines, let me end by seeing how the baselines have been performing over the week using the “Baseline Visualizer”.

(Screenshot: Baseline Visualizer)

I am sure you will find some of these features interesting and will share your experience as you play around with the tool. Do let me know via the comments what you found interesting.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – FIX: sp_rename error Msg 15225 – No item by the name of ‘%s’ could be found in the current database

A few days back I wrote blog posts containing a script for removing spaces in column names using sp_rename. In case you missed them, I am listing them below for quick reference.

SQL SERVER – Script: Remove Spaces in Column Name in All Tables

SQL SERVER – How to Rename a Column Name or Table Name

While playing with sp_rename, I came across an interesting error message and I looked into the code of sp_rename to understand the issue. Here is the error message which I received.

Msg 15225, Level 11, State 1, Procedure sp_rename, Line 387
No item by the name of ‘TableTwo’ could be found in the current database ‘SQLAuthority’, given that @itemtype was input as ‘(null)’.

For simplicity, I have put together the script below to reproduce the error.

CREATE DATABASE SQLAuthority
GO
USE SQLAuthority
GO
CREATE SCHEMA HR
GO
CREATE TABLE dbo.TableOne (ID INT PRIMARY KEY)
GO
CREATE TABLE HR.TableTwo (ID INT, FName VARCHAR(100))
GO
sp_rename 'TableOne', 'TableOne_renamed' -- This works
GO
sp_rename 'TableTwo','TableTwo_renamed' -- This fails

If we look at the output of sp_helptext 'sp_rename', we see the following at line 387 (the line shown in the error):

-- was the original name valid given this type?
IF (@objtype IN ('object','userdatatype') AND @CountNumNodes > 3)
BEGIN
COMMIT TRANSACTION
RAISERROR
(15225,-1,-1,@objname, @CurrentDb, @objtypeIN)
RETURN 1
END

I looked at the beginning of the sp_rename stored procedure and found some documentation there; it lists some good rules:

  • To rename a table, the @objname (meaning OldName) parm can be passed in totally unqualified or fully qualified.
  • The SA or DBO can rename objects owned by lesser users, without the need for SetUser.
  • The Owner portion of a qualified name can usually be passed in in the omitted form (as in MyDb..MyTab or MyTab). The typical exception is when the SA/DBO is trying to rename a table where the @objname is present twice in sysobjects as a table owned only by two different lesser users; requiring an explicit owner qualifier in @objname.
  • An unspecified Owner qualifier will default to the current user if doing so will either resolve what would otherwise be an ambiguity within @objtype, or will result in exactly one match.
  • If Database is part of the qualified @objname, then it must match the current database. The @newname parm can never be qualified.
  • Here are the valid @objtype values. They correspond to system tables which track each type: ‘column’  ‘database’  ‘index’  ‘object’  ‘userdatatype’  ‘statistics’
  • The @objtype parm is sometimes required. It is always required for databases.  It is required whenever ambiguities would otherwise exist.  Explicit use of @objtype is always encouraged.

Parms can use quoted_identifiers.  For example: Execute sp_rename ‘amy.”his table”‘,'”her table”‘,’object’

So, we are getting the error because our table TableTwo is not in the default schema, which is dbo. The solution is very simple: we need to qualify the table name with the schema name, as shown below.

sp_rename 'HR.TableTwo','TableTwo_renamed' -- Now This would work
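If you are not sure which schema a table belongs to, a quick lookup against sys.tables before calling sp_rename saves the guesswork. Here is a minimal sketch:

SELECT SCHEMA_NAME(schema_id) AS SchemaName, name AS TableName
FROM sys.tables
WHERE name = 'TableTwo';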

While reading the rules listed in the stored procedure, I realized that the same error can also occur if there is a dot in the table name; we need to use [] around such table names. Here is an example:

USE SQLAuthority
GO
CREATE TABLE [Name.With.dot] ( i INT)
GO
sp_rename 'Name.With.dot', 'New_Name.With.dot' -- would fail
GO
sp_rename '[Name.With.dot]', 'New_Name.With.dot' -- would work
GO

Hope this helps. Do let me know if you have used something like this before in your environment. Feel free to share via the comments below.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Someone was trying to hack my SQL Server Logins in Azure!

When I started working with SQL Server on Azure VMs, I was really excited because the need to build local VMs went away. If I need something for testing, I can get a SQL Server instance in a jiffy and bring it down as soon as my work is done. As we work more with software in the cloud, it is important to also understand the nuances of security we need to be aware of. I recently looked at the ERRORLOG on my virtual machine in Azure and was surprised to see the errors below repeated multiple times every minute.

Error: 18456, Severity: 14, State: 5.
Login failed for user ‘KISAdmin’. Reason: Could not find a login matching the name provided. [CLIENT: xx.174.233.xx]
Error: 18456, Severity: 14, State: 7.
Login failed for user ‘sa’. Reason: An error occurred while evaluating the password. [CLIENT: xx.174.233.xx]
Error: 18456, Severity: 14, State: 5.
Login failed for user ‘KHB’. Reason: Could not find a login matching the name provided. [CLIENT: xx.174.233.xx]
Error: 18456, Severity: 14, State: 5.
Login failed for user ‘Chred1433’. Reason: Could not find a login matching the name provided. [CLIENT: xx.174.233.xx]

(I have masked the IPs in error messages above)
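If you want to pull such entries out of the ERRORLOG without scrolling through it, the undocumented but widely used xp_readerrorlog procedure can filter on a search string. Here is a minimal sketch:

-- Read the current SQL Server error log (file 0, log type 1 = SQL error log)
-- and return only the failed login entries
EXEC sys.xp_readerrorlog 0, 1, N'Login failed';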

I realized that

  1. The IP address listed was not an IP I knew.
  2. The accounts shown in the login failed messages don’t exist on my SQL Server instance.

Since this is a virtual machine in Microsoft Azure, I checked the endpoints for this VM, as they control any entry to the server from the outside world. Here is what I saw.

(Screenshot: Azure VM endpoints, with SQL Server exposed on port 1433)

As we can see above, I had created an endpoint for SQL Server on port 1433, the very well-known default port used by SQL Server. My theory about a hacking attempt became firmer when I tried to remote desktop to the IP given in the error message.

(Screenshot: remote desktop attempt to the IP from the error message)

Here are my suggestions:

  1. If the Login failed message says <local machine>, it is less likely to be a hacker and more likely some process somewhere on the machine.
  2. The above is also true if the IP address is within your organization.
  3. If the IP address is outside your organization, then the answer is simple: don’t expose your SQL Server to the Internet. Else you may want to take the steps below (a small T-SQL sketch follows this list).
    1. Change the sa password and make sure it is complex.
    2. If you don’t need SQL authentication, don’t use it. Change the authentication mode to Windows only.
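The first two steps can also be done from T-SQL. Here is a minimal sketch; the password is obviously a placeholder, and switching the authentication mode itself is done from the Security page of Server Properties in SSMS:

-- Give sa a long, complex password (placeholder shown here)
ALTER LOGIN sa WITH PASSWORD = N'<use-a-long-random-password>';
-- If SQL authentication is not needed at all, sa can also be disabled
ALTER LOGIN sa DISABLE;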

In my case, since it is a VM in Azure and I have to stay with SQL authentication, I changed the public port to something different, and after that there were no more login failed attempts from that machine. When I searched, I also found the article below from Microsoft:

https://azure.microsoft.com/en-us/documentation/articles/virtual-machines-sql-server-security-considerations/

Interestingly, my simple idea of using a non-default endpoint is not listed there, but they do recommend using ACLs on endpoints for better security of SQL Server running on an Azure virtual machine.

Do you check SQL ERRORLOG regularly to monitor the server’s security?

Reference: Pinal Dave (http://blog.sqlauthority.com)