Developer – 3 Tips Every SQL Expert Needs to Know to Land the Perfect Job (Part 2 of 3)

I am very fortunate to know John Sonmez in person. He is one of those individuals who understands the young minds of developers and the depth of the software development industry. I am a regular follower of his books, videos, and ideas. I have personally implemented many of his ideas in my life and have seen dramatic improvement. Earlier this month, I requested him to share his views and suggestions with all of us on this blog. He kindly accepted the request and returned with three amazing blog posts and a plethora of giveaways.

Let us read John Sonmez in his own words in this second part of a three-part series. You can read the first part of this series over here.


Welcome back, this is the second post in my three-part series where I am bringing you three of my best tips, from my new book, “Soft Skills: The Software Developer’s Life Manual”, to help you land your dream job.

You can find part one of the series here.

Oh, and don’t forget the giveaway for How To Market Yourself as a Software Developer, which I’ll be doing at the end of the next post. You won’t want to miss that, so bookmark this page and check back tomorrow.

Tip #2: Get a personal referral

Sure, getting a personal referral is easy enough if you already know someone at the company you are applying for, but what if you don’t?

It can still be done. You just might have to work a little harder at it.

Start with anyone you know who might know someone at the company you are applying to. Ask them to introduce you, and offer to buy them a cup of coffee or even a nice lunch.

But do yourself a favor: don’t try to “sell” them or “network.” Instead, just have a friendly chat and try to get to know them a little better.

A little-known secret of the interview process is that interviewers almost always hire people they like. Yes, technical skills are important, but the deciding factor in who gets the offer and who gets the rejection letter often comes down to whom the interviewer liked more.

Now, the person who refers you for a job might not be the same person who interviews you, but if you want to get a good referral, the same principle applies.

What if you don’t know someone who knows someone at the company?

If you don’t have someone that can make the introduction for you, make it yourself.

Try to find a few employees at the company you are applying to and see if they have blogs or Twitter accounts. Comment on their blogs or start interacting with them on social media.

There is almost always a way to reach someone at a company that you want to work at.

Today, the internet makes it very easy for you to build relationships with people at companies before you even apply there.

If you are serious about landing your dream job and increasing your salary, you should never apply for a job without a personal referral.

It might take some time to build up a relationship, but getting that personal referral makes you not only much more likely to get the job, but also more likely to get a higher offer.

I talk about how I used this technique to land a six-figure job working from home in “Soft Skills,” so if you are interested in more detail on this technique, go check it out.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – How to Know Backup History of Current Database?

Some blog posts and scripts serve as evergreen references. One such blog has been Get Database Backup History for a Single Database. A number of you have pinged me asking for amendments to that script with various ideas. Since that post was written more than four years ago, I thought it would be worthwhile to update it for the SQL Server 2014 additions. What changed from the previous script? The script will answer a few questions:

  • What were the different backups taken on the current DB? Remove the condition mentioned below and it can be generic enough for all DBs on your server.
  • Which user was involved in taking the backup?
  • What is the type of backup we are talking about?
  • Recovery model and database compatibility level of the DB at the time the backup was taken. I remember using this for a user who changed the DB compatibility level in a script before every backup. It was basically a copy-paste problem from an internet script, and this data helped there.
  • Size of backup – Both compressed and uncompressed.
  • If the Backup was password protected.
  • Finally, when the backups were taken.

So let us look at the script next. Feel free to modify the same as you wish.

-- Recent backup history for the current DB
SELECT s.database_name 'Database',
       s.recovery_model 'Recovery Model',
       s.compatibility_level,
       s.user_name 'Backup by Username',
       CASE s.type
            WHEN 'D' THEN 'Full'
            WHEN 'I' THEN 'Diff'
            WHEN 'L' THEN 'Log'
       END 'Backup Type',
       CONVERT(VARCHAR(20), s.backup_finish_date, 13) 'Backup Completed',
       CAST(mf.physical_device_name AS VARCHAR(100)) 'Physical device name',
       DATEDIFF(MINUTE, s.backup_start_date, s.backup_finish_date) 'Duration Min',
       CAST(ROUND(s.backup_size * 1.0 / (1024 * 1024), 2) AS NUMERIC(10, 2)) 'Size in MB',
       CAST(ROUND(s.compressed_backup_size * 1.0 / (1024 * 1024), 2) AS NUMERIC(10, 2)) 'Compressed Size in MB',
       CASE WHEN LEFT(mf.physical_device_name, 1) = '{' THEN 'SQL VSS Writer'
            WHEN LEFT(mf.physical_device_name, 3) LIKE '[A-Za-z]:\%' THEN 'SQL Backup'
            WHEN LEFT(mf.physical_device_name, 2) LIKE '\\' THEN 'SQL Backup'
            ELSE mf.physical_device_name
       END 'Backup tool',
       s.is_copy_only,
       s.is_password_protected,
       s.is_force_offline /* set for backups taken WITH NORECOVERY */
FROM msdb.dbo.backupset s
INNER JOIN msdb.dbo.backupmediafamily mf ON s.media_set_id = mf.media_set_id
WHERE s.database_name = DB_NAME() -- remove this condition if you want all DBs
  AND s.backup_finish_date > DATEADD(MONTH, -3, GETDATE()) -- get data for past 3 months
ORDER BY s.backup_finish_date DESC;

A sample output would look like:

Additional fields can be added to this script, such as: whether encryption is enabled, collation information, LSN information, whether the backup has a checksum, etc. Do let me know which additional information you use in your environment to know your backups better.
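For example, a few of those fields already live in msdb.dbo.backupset and could simply be appended to the SELECT list of the main script. This is only a sketch; note that, to my knowledge, is_encrypted is available from SQL Server 2014 onwards, so verify the columns against your version before use:

```sql
-- Possible additional backupset columns to merge into the main script
SELECT s.database_name,
       s.has_backup_checksums,  -- was the backup taken WITH CHECKSUM?
       s.is_encrypted,          -- backup encryption flag (SQL Server 2014+)
       s.first_lsn,             -- LSN range covered by this backup
       s.last_lsn,
       s.checkpoint_lsn
FROM msdb.dbo.backupset s
WHERE s.database_name = DB_NAME()
ORDER BY s.backup_finish_date DESC;
```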

Reference: Pinal Dave (http://blog.sqlauthority.com)

Developer – 3 Tips Every SQL Expert Needs to Know to Land the Perfect Job (Part 1 of 3)

I am very fortunate to know John Sonmez in person. He is one of those individuals who understands the young minds of developers and the depth of the software development industry. I am a regular follower of his books, videos, and ideas. I have personally implemented many of his ideas in my life and have seen dramatic improvement. Earlier this month, I requested him to share his views and suggestions with all of us on this blog. He kindly accepted the request and returned with three amazing blog posts and a plethora of giveaways.

Let us read John Sonmez in his own words in this first part of a three-part series.


You’ve probably invested a lot of time learning about SQL and databases.

That’s great. If you are reading this blog, you are probably the kind of person who really cares about your work and has a passion for what you are doing.

You are already probably ahead of 90% of your peers.

But…

What if I told you that there was another way to get even further ahead—a way that most technical professionals ignore?

Well, there is. In fact, what I am about to teach you in this 3-part series of posts, are some of the soft skills that I used to increase my income by over 500% (yes, that is 5 times—no typo), and live a happier, much more fulfilling life.

Oh, and if that isn’t enough for you: if you hang around for the last post in this 3-part series, I’ve got something extra special for you. I’m going to be giving away How To Market Yourself as a Software Developer.

Most technical professionals think getting a good job is about showing how technically competent you are.

That is exactly what I used to think, so I focused all my efforts on raising my technical skills and learning how to solve hard problems on a whiteboard.

That was before I discovered that the majority of high-paying jobs are given to people who figure out how to get in through the back door—not those who are the most technically competent.

I know it’s difficult to believe.

I had a hard time believing it myself, but many sources, including this NY Times article, show that a majority of hires come from sources other than the traditional submit-a-resume-and-pass-an-interview process.

So, if you want to land that perfect job, you’ve got to be more than just technically competent.

In my new book, “Soft Skills: The Software Developer’s Life Manual,” I’ve dedicated a whole section of the book to improving your career, landing your dream job and increasing your income. In the next few posts, I’ll share with you some of my best secrets from the book. (You can check out the first chapter for free here.)

Tip #1: Start and maintain a highly focused blog

Pinal Dave has done an excellent job of this and he is reaping the benefits. Do you think Pinal Dave would ever have to do an interview for a position at a company hiring a SQL expert? I don’t think so. I think most employers would want to hire him on the spot, just because of his excellent reputation—which he built mostly from his blog.

I, myself, have been given job offers—completely bypassing the interview process altogether—because the interviewer had been reading my blog.

The key is to get started and be consistent.

I once interviewed Pinal Dave for a course I was creating on how to market yourself as a software developer and he told me that he woke up every morning and wrote a blog post before he even brushed his teeth.

His dedication to writing every day has allowed him to earn a great income off of this blog and to reach around 2 million technical professionals who read this blog every month.

Whenever I speak at conferences about marketing yourself and soft skills for technical people, I always ask how many people have a blog. Usually about half of the room will raise their hand. But, guess what happens when I ask how many people have posted on their blog weekly, for at least the last year?

Often out of a room of 200-300 developers, there will be just 1 or 2 hands raised.

Want to set yourself apart?

Want to stand out and get noticed and have employers knocking down your door to hire you?

Start a highly focused blog—that means pick a specific topic, like Pinal did with SQLAuthority—and blog at least once a week.

Anyone can do it and it’s much easier than you think. To find out more about how to create a successful blog, either check out my blogging chapter in Soft Skills or you can enroll in this free 3-week email course I created that takes you through the process step-by-step.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Error: Fix: Msg 5133, Level 16, State 1, Line 2 Directory lookup for the file failed with the operating system error 2(The system cannot find the file specified.) – Part 2

Errors are a great starting point for learning. My inbox is flooded with at least 2-3 error messages almost every other day. People keep asking me how to solve them, what to do when they encounter an error, and so on. A long time ago I wrote a simple blog post about error message 5133 while creating a database. You can read it here:

SQL SERVER – Error : Fix : Msg 5133, Level 16, State 1, Line 2 Directory lookup for the file failed with the operating system error 2(The system cannot find the file specified.)

One of the readers sent me an email saying she was simply running the CREATE DATABASE &lt;DatabaseName&gt; command and still got the above error. In my blog I mentioned “There must be some typo or error in filepath,” but in this case no path was given at all. This time I wanted to reproduce the error and was intrigued as to why she was getting it from the simple T-SQL CREATE DATABASE command.

I spent some time researching and spoke to a few friends about this. They informed me that there is a setting in SQL Server for the default path of database files, which is picked up if nothing is specified. And it made complete sense. Based on this information, I was finally able to reproduce the error by pointing that setting at an invalid folder. Here is the error:

Msg 5133, Level 16, State 1, Line 14
Directory lookup for the file “E:\InvalidPath\SQLAuth.mdf” failed with the operating system error 2(The system cannot find the file specified.).
Msg 1802, Level 16, State 1, Line 14
CREATE DATABASE failed. Some file names listed could not be created. Check related errors.

If you look at the path, it is “E:\InvalidPath\SQLAuth.mdf”. The path is picked from the below registry value:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL12.SQL2014\MSSQLServer

In my case, I have a SQL Server 2014 named instance called SQL2014, which is why we are seeing MSSQL12.SQL2014.

Here is the T-SQL way to get the values.

SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS [Default_Data_path],
       SERVERPROPERTY('InstanceDefaultLogPath') AS [Default_log_path]

We can change it via SQL Server Management Studio too: right-click on the server node, go to Properties, and then choose the “Database Settings” tab.

Following is the T-SQL way to achieve the change.

USE [master]
GO
EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer',
N'DefaultData', REG_SZ, N'E:\InvalidPath'
GO
EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer',
N'DefaultLog', REG_SZ, N'E:\InvalidPath'
GO

NOTE: Once you have changed the value, a SQL Server service restart is needed so that SQL Server can pick up the changed values.
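Alternatively, you can sidestep the default-path setting entirely by giving explicit, valid file locations in the CREATE DATABASE statement itself. This is just a sketch; the folder D:\SQLData and the database name SQLAuth are illustrative assumptions, and the folder must already exist on the server:

```sql
-- With explicit paths, the DefaultData/DefaultLog values are never consulted
CREATE DATABASE SQLAuth
ON PRIMARY (NAME = SQLAuth_Data, FILENAME = 'D:\SQLData\SQLAuth.mdf')
LOG ON (NAME = SQLAuth_Log, FILENAME = 'D:\SQLData\SQLAuth_log.ldf');
GO
```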

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Error: Msg 701, Level 17, State 103. There is insufficient system memory in resource pool to run this query

Talking about and exploring In-Memory topics inside SQL Server 2014 has been interesting to me. When I wrote the blog about the table variable not being just an in-memory structure, one of my course listeners (SQL Server 2014 Administration New Features) pinged me on Twitter to ask if In-Memory OLTP was really in-memory. Wouldn’t SQL Server want to swap the data out to the paging file when there is memory pressure? I told them that the concept of In-Memory OLTP is that the data always resides in memory, hence the feature name.

The next question that came out of this interaction: what happens when we hit the memory boundary? Will SQL Server error out even if there is memory available in the system, or do something else? This was an easy setup to try.

Script Usage

We will create an In-Memory database, restrict it to limited memory, and add some rows to see if there is an error. The basic 4 steps to follow are:

-- Step 1: Create the Resource Pool. Limit to 5% memory.
CREATE RESOURCE POOL InMemory_Pool WITH (MAX_MEMORY_PERCENT = 5);
ALTER RESOURCE GOVERNOR RECONFIGURE;
GO

-- Step 2: Create the InMemory DB
CREATE DATABASE InMemory
ON PRIMARY(NAME = InMemoryData,
FILENAME = 'd:\data\InMemoryData.mdf', size=100MB),
-- Memory Optimized Data
FILEGROUP [InMem_FG] CONTAINS MEMORY_OPTIMIZED_DATA(
NAME = [InMemory_InMem_dir],
FILENAME = 'd:\data\InMemory_InMem_dir')
LOG ON (name = [InMem_demo_log], Filename='d:\data\InMemory.ldf', size=50MB)
GO

-- Step 3: Bind the resource pool to DB
EXEC sp_xtp_bind_db_resource_pool 'InMemory', 'InMemory_Pool'
GO

-- Step 4: For RG to take effect, make DB Offline and Online
USE MASTER
GO
ALTER DATABASE InMemory SET OFFLINE
GO
ALTER DATABASE InMemory SET ONLINE
GO

After this we will create a wide table and add rows to exhaust the memory for this resource pool.

USE InMemory
GO
-- Step 5: Create a Memory Optimized Table
CREATE TABLE DummyTable_Mem (ID INT NOT NULL,
Name CHAR(8000) NOT NULL
CONSTRAINT ID_Clust_DummyTable_Mem PRIMARY KEY NONCLUSTERED HASH (ID) WITH (BUCKET_COUNT=1000000))
WITH (MEMORY_OPTIMIZED=ON, DURABILITY=SCHEMA_ONLY)
GO

-- Step 6: Add a lot of rows to get the error
SET NOCOUNT ON
DECLARE
@counter AS INT = 1
WHILE (@counter <= 1000000)
BEGIN
INSERT INTO
dbo.DummyTable_Mem VALUES(@counter, 'SQLAuthority')
SET @counter = @counter + 1
END
GO

Step 6 will not complete because of insufficient memory in the resource pool. We will get an error as shown below:

The statement has been terminated.
Msg 701, Level 17, State 103, Line 49
There is insufficient system memory in resource pool 'InMemory_Pool' to run this query.

To add more rows to the table:

  • Make sure more memory is allocated to the resource pool and reconfigure the same.
  • Delete some rows from In-Memory tables on this database to make space for new allocations.
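The first option above can be sketched as follows; the new 10% limit is an illustrative number, so pick whatever fits your workload:

```sql
-- Raise the pool's memory ceiling and apply the new configuration
ALTER RESOURCE POOL InMemory_Pool WITH (MAX_MEMORY_PERCENT = 10);
ALTER RESOURCE GOVERNOR RECONFIGURE;
GO
```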

Let me help you clean up the script after this experiment.

-- Clean up
USE MASTER
GO
DROP DATABASE InMemory
GO
DROP RESOURCE POOL InMemory_Pool
ALTER RESOURCE GOVERNOR RECONFIGURE;
GO

To learn such interesting SQL Server 2014 enhancements, feel free to listen to my Pluralsight course for more such topics.

Reference: Pinal Dave (http://blog.sqlauthority.com)

Interview Question of the Week #004 – List All Columns and Their Data Type for a View

Earlier this week I wrote a blog post about finding stored procedure parameters and their data types. After that post I received a few emails from my blog readers asking for a similar script for a view. I asked them what they liked about the script, and they said that it gives the base data type as well as the user-defined data type. So I have put in some more effort to write a similar script for views. This is a fine example of how each one of you out there inspires me with great content ideas. Please keep these emails coming my way.

Here is the question: How do you list all columns and their data types for a view in SQL Server?

Usage of script

To use the below script, you need to replace the view name (vEmployee in the sample code) and the schema name (HumanResources in the sample code). Also make sure you are in the same database which has the view (AdventureWorks2014 in the sample code):

USE AdventureWorks2014
GO
DECLARE @ViewName NVARCHAR(4000)
       ,@SchemaName NVARCHAR(4000)
SELECT @ViewName = N'vEmployee'
      ,@SchemaName = N'HumanResources'

SELECT c.name AS [Name]
      ,CAST(ISNULL(ic.index_column_id, 0) AS BIT) AS [InPrimaryKey]
      ,CAST(ISNULL((SELECT TOP 1 1
                    FROM sys.foreign_key_columns AS colfk
                    WHERE colfk.parent_column_id = c.column_id
                      AND colfk.parent_object_id = c.object_id), 0) AS BIT) AS [IsForeignKey]
      ,u_t.name AS [DataType]
      ,ISNULL(b_t.name, N'') AS [SystemType]
      ,CAST(CASE WHEN b_t.name IN (N'nchar', N'nvarchar') AND c.max_length <> -1
                 THEN c.max_length / 2
                 ELSE c.max_length
            END AS INT) AS [Length]
      ,CAST(c.precision AS INT) AS [NumericPrecision]
      ,CAST(c.scale AS INT) AS [NumericScale]
      ,c.is_nullable AS [Nullable]
      ,c.is_computed AS [Computed]
      ,ISNULL(s.name, N'') AS [XmlSchemaNamespaceSchema]
      ,ISNULL(xsc.name, N'') AS [XmlSchemaNamespace]
      ,ISNULL((CASE c.is_xml_document WHEN 1 THEN 2 ELSE 1 END), 0) AS [XmlDocumentConstraint]
      ,CAST(c.is_sparse AS BIT) AS [IsSparse]
      ,CAST(c.is_column_set AS BIT) AS [IsColumnSet]
      ,c.column_id AS [ID]
FROM sys.all_views AS v
INNER JOIN sys.all_columns AS c ON c.object_id = v.object_id
LEFT JOIN sys.indexes AS i ON i.object_id = c.object_id
    AND 1 = i.is_primary_key
LEFT JOIN sys.index_columns AS ic ON ic.index_id = i.index_id
    AND ic.column_id = c.column_id
    AND ic.object_id = c.object_id
    AND 0 = ic.is_included_column
LEFT JOIN sys.types AS u_t ON u_t.user_type_id = c.user_type_id
LEFT JOIN sys.types AS b_t ON (b_t.user_type_id = c.system_type_id
                               AND b_t.user_type_id = b_t.system_type_id)
    OR ((b_t.system_type_id = c.system_type_id)
        AND (b_t.user_type_id = c.user_type_id)
        AND (b_t.is_user_defined = 0)
        AND (b_t.is_assembly_type = 1))
LEFT JOIN sys.xml_schema_collections AS xsc ON xsc.xml_collection_id = c.xml_collection_id
LEFT JOIN sys.schemas AS s ON s.schema_id = xsc.schema_id
WHERE (v.type = 'V')
  AND (v.name = @ViewName AND SCHEMA_NAME(v.schema_id) = @SchemaName)
ORDER BY [ID] ASC

Here is the sample execution. I have highlighted the modification needed to use the script.

I hope these scripts will help you in your environments. I would love to hear back from you on how they can be enhanced.
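If you ever need the same listing for every view in the database rather than a single one, a much simpler variant (without the user-defined type and key lookups of the full script) might look like this sketch:

```sql
-- Columns and data types for all views in the current database
SELECT SCHEMA_NAME(v.schema_id) AS [Schema],
       v.name AS [ViewName],
       c.name AS [ColumnName],
       t.name AS [DataType],
       c.max_length,
       c.is_nullable
FROM sys.views AS v
INNER JOIN sys.columns AS c ON c.object_id = v.object_id
LEFT JOIN sys.types AS t ON t.user_type_id = c.user_type_id
ORDER BY [Schema], [ViewName], c.column_id;
```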

Click to Download Scripts

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – How to Bind Resource Governor for InMemory Enabled Databases?

I have done a number of courses for Pluralsight. Last year when SQL Server 2014 was released, I planned to do a course on the same. The course SQL Server 2014 Administration New Features was a lot of hard work, but it gave me a chance to learn something new and share it. Do let me know if you have ever got a chance to listen to this course. I would love to hear your feedback.

As part of the course, one of the modules covered the basics of the In-Memory capabilities of SQL Server 2014. One of the listeners of the course pinged me on Twitter with some interesting conversation, which inspired me to write this blog post.

Follower: Thanks for the InMemory basics, I had a simple doubt.

Pinal: Yes, please tell me. Let me see if I can answer the same.

Follower: I plan to upgrade my server to SQL 2014.

Pinal: Glad the course is helping you. Is there any doubt?

Follower: Yes, I might be planning to use the InMemory capabilities for couple of databases as part of application upgrade.

Pinal: Great.

Follower: I want to know whether I can restrict the amount of memory a particular In-Memory OLTP database can take.

Pinal: Can you please elaborate a bit?

Follower: Simple, Pinal. I want one DB not to take more than 20% of the memory on my server, and another In-Memory DB not to take more than 40% of the memory available on the server.

Pinal: Interesting.

Follower: As part of the upgrade I am consolidating, and hence these restrictions can be of great help.

Pinal: Now I get it. It is possible with Resource Governor. Haven’t you tried it?

Follower: I think these are great pointers, I will dig it up. Thanks again.

Pinal: You are welcome. I will write it up as a blog post in the future for sure.

This conversation had been on my mind for a while, and it took some time to finally turn it into a blog post. The script one needs to write is simple. Let me walk you through the same.

  1. Create the Resource Pool
  2. Create the InMemory OLTP Filegroup enabled DB
  3. Bind the Resource Pool to our database
  4. Check the DB metadata
  5. Make the DB Offline and Online to make the changes take effect
  6. Clean up

As the list above says, let us first create our Resource Pool.

-- Step 1: Create the Resource Pool. Limit to 40% memory.
CREATE RESOURCE POOL InMemory_Pool WITH (MAX_MEMORY_PERCENT = 40);
ALTER RESOURCE GOVERNOR RECONFIGURE;
GO

Next is to create the In-Memory DB. This is the same as used in the previous blog – Beginning In-Memory OLTP with Sample Example.

-- Step 2: Create the InMemory DB
CREATE DATABASE InMemory
ON PRIMARY(NAME = InMemoryData,
FILENAME = 'd:\data\InMemoryData.mdf', size=100MB),
-- Memory Optimized Data
FILEGROUP [InMem_FG] CONTAINS MEMORY_OPTIMIZED_DATA(
NAME = [InMemory_InMem_dir],
FILENAME = 'd:\data\InMemory_InMem_dir')
LOG ON (name = [InMem_demo_log], Filename='d:\data\InMemory.ldf', size=50MB)
GO

The next step is where the magic begins. We need to bind the DB and the resource pool. This can be achieved using the next command:

-- Step 3: Bind the resource pool to DB
EXEC sp_xtp_bind_db_resource_pool 'InMemory', 'InMemory_Pool'
GO

The success for this step can be viewed with this message:

A binding has been created. Take database ‘InMemory’ offline and then bring it back online to begin using resource pool ‘InMemory_Pool’

The next logical step is to check the metadata to confirm the mapping. Use the sys.databases catalog view for this.

-- Step 4: Check the Database metadata
SELECT dbs.database_id, dbs.name, dbs.resource_pool_id
FROM sys.databases dbs
WHERE name LIKE 'InMemory'
GO

Just as Resource Governor needs to be reconfigured for its configuration to take effect, we need to do something similar for the database to make this change take effect: take the DB offline and bring it back online.

-- Step 5: For RG to take effect, make DB Offline and Online
USE MASTER
GO
ALTER DATABASE InMemory SET OFFLINE
GO
ALTER DATABASE InMemory SET ONLINE
GO

That is it. We are good now. Our In-Memory DB will not take more than 40% of the memory allocated to SQL Server. Though this was a simple concept, I thought it was worth a share. If you would like to clean up this experiment, please use the below script.
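As an aside, if you later want to remove the binding without dropping the database, SQL Server 2014 also ships an unbind procedure; after running it, the same offline/online cycle is needed for the change to take effect:

```sql
-- Detach the database from the resource pool (reverses Step 3)
EXEC sp_xtp_unbind_db_resource_pool 'InMemory'
GO
```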

-- Clean up
USE MASTER
GO
DROP DATABASE InMemory
GO
DROP RESOURCE POOL InMemory_Pool
ALTER RESOURCE GOVERNOR RECONFIGURE;
GO

To learn such interesting SQL Server 2014 enhancements, feel free to listen to my Pluralsight course for more such topics.

Reference: Pinal Dave (http://blog.sqlauthority.com)