SQL SERVER – JSON Support for Transmitting Data for SQL Server 2016 – Notes from the Field #114

[Note from Pinal]: In this episode of the Notes from the Field series, database expert Kathi Kellenberger explains JSON support for transmitting data in SQL Server 2016. Kathi is an amazing instructor; she was one of the first SQL authors I read early in my career, and her writing instilled my love for this technology. Today she brings a relatively interesting topic for database experts. Read about Kathi's experience in her own words.


My favorite aspect of working with SQL Server is the T-SQL language. I love working with data, writing queries, and tuning queries when they don’t perform well. Whenever a new version of SQL Server is on the way, I can’t wait to hear what has been added to T-SQL.

I was really hoping to see additional T-SQL windowing functionality in 2016, but, so far, nothing has been announced. There is, however, a very interesting programmability enhancement on the way: JSON support.

JSON, which stands for JavaScript Object Notation, is used for transmitting data and is supported in many programming languages. JSON looks a bit like XML, with square [] and curly {} brackets instead of angle <> brackets.

Here is an example comparing XML to JSON with data from the Production.Product table.

XML:
<Product>
  <Name>HL Road Frame - Black, 58</Name>
  <ListPrice>1431.5000</ListPrice>
</Product>
<Product>
  <Name>HL Road Frame - Red, 58</Name>
  <ListPrice>1431.5000</ListPrice>
</Product>
<Product>
  <Name>Sport-100 Helmet, Red</Name>
  <ListPrice>34.9900</ListPrice>
</Product>
<Product>
  <Name>Sport-100 Helmet, Black</Name>
  <ListPrice>34.9900</ListPrice>
</Product>
<Product>
  <Name>Mountain Bike Socks, M</Name>
  <ListPrice>9.5000</ListPrice>
</Product>

JSON:

{
"Product": [
{
"Name": "HL Road Frame - Black, 58",
"ListPrice": 1431.5
},
{
"Name": "HL Road Frame - Red, 58",
"ListPrice": 1431.5
},
{
"Name": "Sport-100 Helmet, Red",
"ListPrice": 34.99
},
{
"Name": "Sport-100 Helmet, Black",
"ListPrice": 34.99
},
{
"Name": "Mountain Bike Socks, M",
"ListPrice": 9.5
}
]
}

The additions to T-SQL to support JSON are listed below. There is one new clause and four functions:

  • FOR JSON (clause): Returns JSON from a SELECT statement.
  • OPENJSON (function): Returns JSON data in tabular format.
  • ISJSON (function): Returns 1 if a string is valid JSON.
  • JSON_VALUE (function): Returns a single scalar value from a JSON string.
  • JSON_QUERY (function): Returns an object or an array from a JSON string.

I used the FOR JSON clause to generate the example JSON data above. Here is the code I used:

SELECT TOP(5) Name, ListPrice
FROM Production.Product
WHERE ListPrice > 0 AND Name NOT LIKE '%Seat%'
FOR JSON PATH, ROOT('Product');

There are several options, so be sure to take a look at the documentation for more information.
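For instance, here is a sketch (not from the original article) of FOR JSON AUTO, which lets SQL Server infer the JSON structure from the SELECT statement instead of an explicit path:

-- Same AdventureWorks Production.Product table as above;
-- FOR JSON AUTO infers the output structure from the SELECT list
SELECT TOP(5) Name, ListPrice
FROM Production.Product
WHERE ListPrice > 0
FOR JSON AUTO;

Here is an example script demonstrating the functions: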

--Save a string in JSON format to a variable
DECLARE @Books NVARCHAR(MAX) = N'
{"Books":
   {"BookList":
      [{"Book":
          {"Title":"Microsoft SQL Server T-SQL Fundamentals",
           "Authors":"Itzik Ben Gan"}
       },
       {"Book":
          {"Title":"The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling",
           "Authors":"Ralph Kimball; Margy Ross"}
       },
       {"Book":
          {"Title":"SQL Server Concurrency: Locking, Blocking, and Row Versioning",
           "Authors":"Kalen Delaney"}
       }
      ]
   }
}';
IF ISJSON(@Books) = 0 BEGIN
   PRINT 'Invalid JSON string';
END
ELSE BEGIN
   --Turn the JSON string into tabular format
   SELECT *
   FROM OPENJSON(@Books,'$.Books.BookList')
   WITH (
      Title NVARCHAR(100) '$.Book.Title',
      Authors NVARCHAR(100) '$.Book.Authors');

   --Return the BookList array in JSON format
   SELECT JSON_QUERY(@Books,'$.Books.BookList') AS [Book Array];

   --Return the second row as JSON
   SELECT JSON_QUERY(@Books,'$.Books.BookList[1]') AS [Second Book];

   --Return the title from the second row
   SELECT JSON_VALUE(@Books,'$.Books.BookList[1].Book.Title') AS [Book Title];
END;
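If you omit the WITH clause, OPENJSON returns a default key/value/type schema, which is handy for exploring JSON whose shape you do not know yet. A minimal sketch against the same @Books variable (run in the same batch as the declaration):

--Default schema: one row per array element, with key, value, and type columns
SELECT [key], [value], [type]
FROM OPENJSON(@Books, '$.Books.BookList');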

Here are the results: the script returns the book list in tabular format, then the BookList array as JSON, the second book as JSON, and finally the title of the second book.
The most challenging aspect for me when writing the previous example was creating a valid JSON string. The path argument, the string starting with $ that is used with the functions, looks a bit intimidating, but it is actually simple. The $ is the starting point; follow the path in the data to get what you want.

Let’s take a look at the last query:

SELECT JSON_VALUE(@Books,'$.Books.BookList[1].Book.Title') AS [Book Title];

The JSON_VALUE function takes two parameters, the JSON string and a path. In this case, I am specifying the item from the array to extract. It is found in Books and the array is called BookList. I want the item at index 1. The item is called Book and I am retrieving the Title property.

The new functionality will make it easier for developers to work with SQL Server data regardless of the programming language they use.

Kathi Kellenberger is the author of Beginning T-SQL, Expert T-SQL Window Functions in SQL Server, and the Pluralsight course T-SQL Window Functions.

If you would like to learn more about T-SQL window functions, be sure to check out my latest book Expert T-SQL Window Functions in SQL Server or my Pluralsight course T-SQL Window Functions.

If you want to get started with SQL Server with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – 2016: Creating Simple Temporal Table

Temporal tables are a new feature introduced in SQL Server 2016. They give application developers the ability to view the state of data at a point in time, and they can also be used for logging purposes if required. I wanted to explore how this feature can be used; the documentation was interesting, and I went ahead and tried out how it can be enabled.

One of the most common use cases for this feature is finding the data values as of a past date. This can now be easily configured in SQL Server 2016. Let us go ahead and create a database, and then create a table with the temporal history table created by default/automatically. The syntax for this looks like:

CREATE DATABASE TemporalDB
GO
USE TemporalDB
GO
--Simple temporal table with history table automatically created (name of history table is MSSQL_TemporalHistoryFor_<object_id of table>):
CREATE TABLE dbo.TemporalExample
(
ID INT NOT NULL PRIMARY KEY,
ValidFrom DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
ValidTo DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL,
PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON)
GO

After running this in my SQL Server 2016 database, the table has been created and the system-generated history table is linked with it.


I wanted to understand if we can get the same information from the system catalog. I used the sys.tables catalog view to select it:

SELECT name [Original_Table],
OBJECT_NAME(history_table_id) [History_Table_Name]
FROM sys.tables
WHERE history_table_id IS NOT NULL

The output shows the original table along with the name of its system-generated history table.

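Since the headline use case is reading data as it existed in the past, here is a minimal sketch of such a query (not part of the original demo) once the table has accumulated some history:

-- Return the rows as they existed at a given UTC instant
SELECT ID, ValidFrom, ValidTo
FROM dbo.TemporalExample
FOR SYSTEM_TIME AS OF '2016-02-01 10:00:00';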

Since these tables are linked, it is not possible to drop the main table without first stopping system versioning. Now that we have created the table, let us look at how to drop it: we stop versioning and then drop both tables.

USE [TemporalDB]
GO
ALTER TABLE [dbo].[TemporalExample] SET ( SYSTEM_VERSIONING = OFF)
GO
DROP TABLE [dbo].[TemporalExample]
GO
-- Use the name of the history table that was created in your database
DROP TABLE [dbo].[MSSQL_TemporalHistoryFor_565577053]
GO

In the above code, your history table name will be different, so please make the changes appropriately. Let us finish by cleaning up the database we created for this temporal experiment.

USE MASTER
GO
DROP DATABASE TemporalDB
GO

What are some of the use cases where you think temporal tables will be used in your environment? We will surely blog about more temporal functionality in the future. I am sure you will find this interesting.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL Server Auditing and Recovery With ApexSQL Log 2016

ApexSQL Log is a powerful SQL Server transaction log reader for forensic auditing and rollback of malicious and/or unintended changes in Microsoft SQL Server. It is an ideal, cost-effective solution for recovery, row-change auditing, and/or before-after auditing. ApexSQL Log uses technology which enables it to read online transaction log files or transaction log backup files in order to create auditing results, including before and after views of the data as well as the full row history of all changes.

From the recovery standpoint, ApexSQL Log provides sophisticated undo/redo capabilities which enable it to create T-SQL scripts that completely reverse or replay all or only selected transactions read from the online transaction log file and added transaction log backups.

To ensure that transaction log files contain the information ApexSQL Log needs for auditing, it is important that SQL Server does not overwrite information in the transaction log file, which it does when the database recovery model is set to SIMPLE. With this in mind, keep your databases in the FULL recovery model to provide valid information for auditing and to give yourself the highest possible chance of recovering data when disaster strikes and important data is lost.
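Checking and switching the recovery model is a one-liner each. A minimal sketch, assuming a hypothetical database named SalesDB:

-- Switch the database to the FULL recovery model so the log retains history
ALTER DATABASE SalesDB SET RECOVERY FULL;
-- Verify the current recovery model
SELECT name, recovery_model_desc FROM sys.databases WHERE name = 'SalesDB';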

To start things up, let’s quickly decide on how we want to perform the auditing: locally or remotely.

Depending on the environment, both local and remote auditing may be preferred, and the following overview of both will help you decide which to choose.

Local auditing

ApexSQL Log can be installed directly on the machine which hosts the SQL Server, and auditing is then performed locally. Since ApexSQL Log has no performance overhead during audit trail capture, it is recommended to use it locally when possible, to avoid any performance issues caused by network limitations or congestion in high-traffic environments.

Remote auditing

If you choose to install ApexSQL Log on a specific workstation and connect to the SQL Server remotely, you will need to install server-side components which enable remote auditing. These components are not installed into the SQL Server instance itself, but on the machine which hosts it; they are actually a simple service which allows remote auditing of the online transaction log file. More on the components can be read in this article.

Installation of these components is as straightforward as it gets. Simply start the installation executable on the server and choose the option to ‘install server-side components’. An alternative is to install them directly from the GUI when connecting to the remote SQL Server: the application will inform you when the server-side components are not installed on the remote server and offer to install them from the GUI itself.

How it works

ApexSQL Log has a straightforward wizard which will guide the user through the auditing or recovery process.

In order to audit specific transaction log files, it is necessary to connect to the database to which those transaction log files/backups belong. Hence, once the application is started, the user is required to provide database connection details. After choosing a SQL Server and authentication method (and providing appropriate credentials), the database can be selected from the drop-down list.


Advancing through the wizard, the next step requires the user to specify which data sources (transaction log files, backups or .ldf files) will be used for the auditing job. ApexSQL Log automatically loads the online transaction log and all related transaction log files/backups from the default SQL Server folder. Additional files can be loaded manually using the Add file dialog, or by defining a naming-pattern policy to add multiple files at once. Simply check files in the list to include them in the auditing process.


Note: when selecting transaction log files for auditing, it is important to provide a full chain of transaction log backups to ensure a continuous data stream and complete transactional information. More on how to create and maintain a full chain of transaction log backups can be read here.

The step of the wizard which follows data source selection is output selection. The choices are: export results, create a before-after report, create an undo/redo script, or open the results in a grid.


Export results

This option is pretty self-explanatory: the auditing results are exported to one of the supported formats. The following article provides detailed information on exporting in ApexSQL Log.


Creating a before-after report

This report, with full before-after (change) details, can be created as another ApexSQL Log output. It is perfect for investigating what change occurred and what the original form of the changed row(s) was: the ‘what, who and when’ investigation. Here is an article on how to create a data change before-after report.


Undo/Redo script creation

This option creates a SQL script to revert any unwanted changes, or to replay them against any other table/database. More on how to revert unwanted changes or replay changes can be read here.

Open results in grid

This option allows in-depth analysis of audited data directly from the GUI. Auditing results are displayed in grid/table format, where they can be further analyzed or filtered down to more specific results. All three previously described outputs can be created from the grid itself, so if you are not perfectly sure about the output and need to inspect or modify the results before creating an export or an undo/redo script, opening the results in the grid is the best option to choose here.

Filtering

The final step in the wizard, regardless of the output type, is the filter setup. ApexSQL Log offers various filters: the user can specify date/time filters, operation filters, table filters, or use advanced options to further refine the output.


The filter setup also provides one really useful feature: continuous auditing. Continuous auditing is used in place of the date/time filter and enables an auditing job on the database to resume exactly where the previous auditing job finished, which ensures that nothing is missed between regular/continuous auditing jobs. More on continuous auditing can be read in this article.

The ‘Grid’ overview

Once the auditing has completed, all results are shown in a grid, from where they can be analyzed. Additional filtering is available in the grid, so it is quite easy to refine the output or fine-tune the results to see only specific transactions.


The grid overview gives the user true insight into the transaction log files, and various pieces of information can be found in several different panes.

Operation details

This pane shows full operation details for each and every audited transaction. It contains the full before-after auditing information, so you can see the before (old) and after (new) values for the selected operation.


Transaction information

As the name suggests, this pane contains information on the transaction in which the selected operation was included.


The Undo & Redo script panes provide a quick glance at the rollback/replay scripts, while the complete history of the selected row, as far back as the transaction log files reach, can be seen in the Row history pane.


As previously mentioned, all output options are available from the grid overview, so the user can create exports, before-after reports, or undo/redo scripts from the GUI. In addition to working from the GUI, ApexSQL Log fully supports a CLI, and all commands and operations supported in the GUI are available in the CLI; this enables automation of auditing with ApexSQL Log via Windows Task Scheduler or similar tools.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – How to Connect Using NT AUTHORITY\SYSTEM Account?

Sometimes it is necessary to connect to SQL Server using the System account. Don’t ask me whether this is a valid scenario in the first place; someone pinged me to say they wanted to do it. I was personally not sure why, but they had their own internal requirement to do so.

It is not possible to type Windows credentials in SSMS; those fields are always grayed out. My inclination is always to keep trying to find a solution to such unusual use cases, and I found that PsExec can be used to achieve this. I am a big fan of the Sysinternals tools: they are handy and have quite a small footprint on the servers. One tool I use day in, day out during presentations is ZoomIt.

Step 1: Download PSTools from https://technet.microsoft.com/en-us/sysinternals/psexec.aspx

Step 2: Extract and open command prompt to the extracted location.

Step 3: Run the command below:

psexec -i -s ssms.exe

The -i parameter allows the program to run interactively, so that it interacts with the desktop of the specified session on the remote system.

The -s parameter launches the process using the SYSTEM account.

Running this command launches the SSMS program with the User name populated as “NT AUTHORITY\SYSTEM”.
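Once SSMS opens, a quick generic check (not from the original post) confirms which login the connection is actually using:

-- Returns the login name for the current connection; expect NT AUTHORITY\SYSTEM here
SELECT SUSER_SNAME();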

When I disabled that account, I got the below error in the ERRORLOG:

2016-02-03 15:31:07.560 Logon        Error: 18470, Severity: 14, State: 1.
2016-02-03 15:31:07.560 Logon        Login failed for user ‘WORKGROUP\SQLSERVER2016$’. Reason: The account is disabled. [CLIENT: <local machine>]

Notice that we do not see NT AUTHORITY\SYSTEM in the ERRORLOG; we see the machine name followed by a dollar sign. Since my machine is in a workgroup called “WORKGROUP” and the machine name is SQLSERVER2016, we see WORKGROUP\SQLSERVER2016$.

Have you ever used any such tool to troubleshoot in your environment? What use cases have you used these tools for in the past? Do let me know via comments.

Reference: Pinal Dave (http://blog.sqlauthority.com)

Interview Question of the Week #057 – What is GO in SQL SERVER?

GO is not a Transact-SQL statement, yet it is often used in T-SQL code. GO causes all statements from the beginning of the script or from the last GO statement (whichever is closer) to be compiled into one execution plan and sent to the server independently of any other batches. SQL Server utilities interpret GO as a signal that they should send the current batch of Transact-SQL statements to an instance of SQL Server. The current batch is composed of all statements entered since the last GO, or since the start of the ad hoc session or script if this is the first GO.

The GO command must be written on a new line, as it is not a T-SQL statement; a T-SQL statement cannot occupy the same line as GO. The line containing GO can, however, contain comments after the GO keyword.

Following is an example for SQL Server using the AdventureWorks database:

USE AdventureWorks2014;
GO
DECLARE @MyMsg VARCHAR(50)
SELECT @MyMsg = 'Hello, World.'
GO ---- @MyMsg is not valid after this GO ends the batch.
---- Yields an error because @MyMsg is not declared in this batch.
PRINT @MyMsg
GO
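One lesser-known detail: SQL Server utilities also accept a positive integer after GO, which executes the preceding batch that many times. A small sketch (dbo.TestTable here is a hypothetical table whose columns all have defaults):

-- Runs the INSERT batch five times
INSERT INTO dbo.TestTable DEFAULT VALUES
GO 5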

Reference: Pinal Dave (http://blog.sqlauthority.com)

MySQL – How to Generate Random Number

In MySQL, the UUID() function returns a Universally Unique Identifier: a 36-character value consisting of five groups of hexadecimal digits separated by hyphens. If you want to create a random password, you can make use of this function.

SELECT UUID() ;

This returns a string such as 85aeb064-8f73-11e5-85ef-02fcc4101968 (note that the value varies; when you execute it you will get a different one). As the total length is 36 characters, you can use the result to get a random password of any length up to 36.

Suppose you want to generate an 8-character password; you can do it using:

SELECT LEFT(UUID(),8) random_password ;

If you want a 10-character password, you can generate it using:

SELECT RIGHT(UUID(),10) random_password ;

This way you can generate a random password. As UUID() returns a hexadecimal number in five parts separated by hyphens, your password will contain digits, letters, and possibly hyphens, depending on the length you use.
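If hyphens are not wanted in the password, one possible refinement (a sketch, not from the original tip) is to strip them with REPLACE() before trimming:

-- Remove the hyphens from the UUID, then keep the first 8 characters
SELECT LEFT(REPLACE(UUID(), '-', ''), 8) random_password ;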

There can be many ways of doing this, but this is the simplest one.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Install Error – The /UIMode setting cannot be used in conjunction with /Q or /QS

Doing repetitive tasks is something everyone hates, and I am sure many of us have resorted to ingenious techniques to do such tasks efficiently. There is no standard tool or technique that everyone uses. Recently I was asked to install a SQL Server instance with the same configuration on 10 servers. Who would want to log in to all those servers, run setup.exe, and go through the same setup wizard each time? Luckily, the SQL Server setup allows you to create a configuration file. The very thought of extracting the settings as a configuration file made me jump out of my seat.


I copied the configuration file from the location shown in the setup wizard and moved it to a second server’s C drive. This was the easy part, because I knew an exact replica of the configuration would flow to the new servers. Then I ran the command below:

setup.exe /CONFIGURATIONFILE="C:\ConfigurationFile.ini" /QS /IACCEPTSQLSERVERLICENSETERMS

This is meant to be a silent install. The command failed with the below error, which was strange to me:

The following error occurred:
The /UIMode setting cannot be used in conjunction with /Q or /QS.

Error result: -2054422500
Result facility code: 1420
Result error code: 28
Please review the summary.txt log for further details

I looked into the configuration file and found the below line:

; Parameter that controls the user interface behavior. Valid values are Normal for the full UI,AutoAdvance for a simplied UI, and EnableUIOnServerCore for bypassing Server Core setup GUI block.
UIMODE="Normal"

I added a ; before that line so that it is commented out (you can also remove the line entirely), and after that the setup went fine; I was able to install on the remaining nine servers in a very short time. I also found that I can run the setup remotely using the PowerShell command below:

Invoke-Command -ComputerName xxx -ScriptBlock
{D:\SQL2012\setup.exe /ConfigurationFile="D:\SQL2012\ConfigurationFile.ini"}

Have you ever used a configuration file in your environment? What are some of the automation techniques you have used for installing software? Do let me know via the comments below.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – SSIS – Get Started with the For Loop Container – Notes from the Field #113

[Note from Pinal]: SSIS is a very well explored subject; however, there are so many interesting elements that whenever we read about it, we learn something new. One such concept is getting started with the For Loop container.

Linchpin People are database coaches and wellness experts for a data driven world. In this 113th episode of the Notes from the Field series, database expert Tim Mitchell (partner at Linchpin People) shares a very interesting conversation on how to get started with the For Loop container.


SQL Server Integration Services is equipped with tasks and containers to make it easy to design and maintain the flow of ETL: which logic should be executed, when it should be executed, and how many times it should occur. Most SSIS developers are familiar with the Sequence container and the For Each Loop container, which can be used to group tasks together and execute the same logic a discrete number of times. In addition to these, there is a lesser-known but still very useful container for controlling logic flow: the For Loop container.

Simply put, the For Loop container executes its ETL logic zero to n times. It has three expressions that control how many times the loop will execute:

  • InitExpression: Used for the initial setting of a starting variable (such as setting a counter variable to zero).
  • EvalExpression: This is the expression that evaluates whether the loop should continue. Of the three listed here, this is the only one that requires a value; the others are optional.
  • AssignExpression: This allows the use of an assignment expression, such as incrementing a loop counter.

For those with a programming background, this looks very much like a for() loop statement in the C-derived languages. Functionally, it works in exactly the same way as the for() loop, continuing to execute the contained logic as long as the control condition remains true. This helps draw the contrast between the For Each Loop and the For Loop in SSIS: the former is list-based, and will execute once for every item in the list supplied to it; the latter is value-based, and will execute as long as the EvalExpression is true.

In fairness, most ETL loads lean toward the list-based approach, but there are valid cases where a value-based approach is necessary. Some of those include:

  • Processing a fixed subset of data
  • Sampling for test or validation purposes
  • Forcing a “wait state” until some milestone is reached
  • Allowing the loop logic to be executed for some specified amount of time

Configuring the For Loop Container

As noted above, the only value that is required for the For Loop container is the EvalExpression. The very simplest configuration supplies only that: a value of true for the EvalExpression.


However, this is a very poorly configured For Loop, because the loop will continue executing indefinitely! True will always be true, so there is no logical end to this loop.

A more practical design pattern would use either an initialization expression, an assignment expression, or possibly both, to constrain the number of iterations. A simple example of this is shown below. I set up an SSIS package variable, typed as an Integer and named @vLoopCounter, with a default value of 0. In the For Loop settings, I’ve used the EvalExpression to check whether this value is less than 10, and I use the AssignExpression to increment the @vLoopCounter value by 1 for every iteration of the loop.


This example works, executing any logic contained in the For Loop exactly ten times. However, this pattern is very static: what if I want the loop to run more than 10 times? I’d need to open the package and modify it. Fortunately, the configuration expressions allow the use of both variables and parameters. Below, the package has a couple of updates: an initial value of 1 is set for the @vLoopCounter variable, and the static comparison in the EvalExpression is replaced by the package parameter @pMaxLoops as the maximum number of loops.


In the example above, the maximum number of loops can be specified at runtime, making for a more dynamic pattern.

The examples above iterate based only on the number of times the loop has run. Keep in mind that when using the For Loop container, the logic can be based on any condition we choose: whether a particular set of files exists, how long the For Loop has been running, how many records have been processed, or some other custom metric specified in the expression. Even with a small number of inputs controlling how many times the For Loop container will execute, the possible uses are many, and can be as complex as needed.

Conclusion

The For Loop container provides another way to execute repeating logic in SSIS. By using an approach similar to the for() loop in structured programming languages, the For Loop container adds more ETL flexibility through a value-based iterative pattern.

If you want me to take a look at your server and its settings, or if your server is facing any issues, we can Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER InMemory OLTP: Understanding Optimistic Multi-Version Concurrency Concepts

In-Memory OLTP is an awesome concept, and I have been fortunate to learn about it from friends in the industry through their blogs. It involves deep concepts, and it is sometimes tough to understand how the building blocks come together. I am not the only one exploring it; many of you do too, and you pass those learnings on to me via your interesting questions. Thanks to each one of you who takes a moment to ask me questions that shake the fundamentals and make my understanding stronger. Along the same lines, after reading the MSDN documentation on In-Memory OLTP, one reader wrote back to me asking what the concept of “optimistic multi-version concurrency” really means. At first glance it looks simple, but the question is very loaded.

This blog post is very much inspired by that question, and I thought to pen it down with examples that show how SQL Server handles this. We will look at various simple examples to illustrate the behavior for easy understanding.

Creating our basic DB and Table

I will reuse the basic structure from previous posts as the template. This is a simple script to start:

USE [master]
GO
-- Below statement works only from SQL Server 2016
-- Change this appropriately if you are running in prior versions
DROP DATABASE IF EXISTS  [InMem_OLTP]
GO
CREATE DATABASE [InMem_OLTP]
ON  PRIMARY
( NAME = N'InMem_OLTP_data', FILENAME = N'C:\DATA\InMem_OLTP_data.mdf' , SIZE = 20480KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB ),
FILEGROUP [InMem_OLTP_InMemory] CONTAINS MEMORY_OPTIMIZED_DATA  DEFAULT
( NAME = N'InMem_OLTP_InMemory', FILENAME = N'C:\Data\InMem_OLTP_mopt' , MAXSIZE = UNLIMITED)
LOG ON
( NAME = N'InMem_OLTP_log', FILENAME = N'C:\DATA\InMem_OLTP_log.ldf' , SIZE = 20480KB, FILEGROWTH = 10%)
GO

-- Creating the table for experimentations
USE [InMem_OLTP]
GO
CREATE TABLE dbo.SalesOrder_inmem
(
order_id INT IDENTITY NOT NULL,
order_date DATETIME NOT NULL,
order_status tinyint NOT NULL,
amount FLOAT NOT NULL,
CONSTRAINT PK_SalesOrderID PRIMARY KEY NONCLUSTERED HASH (order_id) WITH (BUCKET_COUNT = 10000)
)
WITH ( MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY)
GO
INSERT INTO dbo.SalesOrder_inmem VALUES
('1/1/2010',1,1000),
('1/1/2011',1,2000),
('1/1/2012',1,3000),
('1/1/2013',1,4000),
('1/1/2014',1,5000)

Understanding Optimistic part

Let me break the question into multiple parts, and we will create a scenario that showcases the optimistic part. The steps I am going to take are explained in the table below:

Step   Session 1                     Session 2
1      BEGIN TRANSACTION
2      INSERT INTO InMemory table
3                                    Read from InMemory table
4      COMMIT TRANSACTION
5                                    Read from InMemory table

As you can see above, this will be done in two session windows, and we will switch between the sessions during the demo. Note that the INSERT below adds the SNAPSHOT table hint: accessing a memory-optimized table inside an explicit transaction under the default READ COMMITTED isolation level raises an error, so a supported isolation level must be given via a hint.

-- Session 1
BEGIN TRANSACTION
INSERT INTO dbo.SalesOrder_inmem WITH (SNAPSHOT) VALUES
('1/1/2015',1,6000)

-- Session 2
SELECT * FROM dbo.SalesOrder_inmem


As you can see, we still see only 5 records; the 6th record, for which the transaction is still open, is not visible. This will remain the state in Session 2 until the transaction in Session 1 finishes. SQL Server has created a row version and made sure our query in Session 2 is not blocked: since we are using optimistic concurrency, Session 2 is still able to query the table.

Let us commit in Session 1 and run the query again in Session 2.

-- Session 1
COMMIT

-- Session 2
SELECT * FROM dbo.SalesOrder_inmem


Now you can see that, without doing anything else, the additional record pops up automagically.

Understanding Multi-Versioning better

The multi-versioning part may not be clear yet. To make that point, let me go ahead and update a record and observe the different versions across our sessions.

Step   Session 1                     Session 2
1      BEGIN TRANSACTION
2      UPDATE InMemory table
3                                    Read from InMemory table
4      COMMIT TRANSACTION
5                                    Read from InMemory table

Let us create the transaction to update the InMemory table. This is achieved as below:

-- Scenario 2
-- Session 1
BEGIN TRANSACTION
UPDATE dbo.SalesOrder_inmem WITH (SNAPSHOT)
SET amount = 10000
WHERE order_id = 6

-- Session 2
SELECT * FROM dbo.SalesOrder_inmem


The value of 10000 is NOT visible to Session 2 until the transaction is committed in Session 1; Session 2 keeps reading the previously committed version of the row. This shows quite clearly SQL Server’s capability to maintain multiple versions of the records in In-Memory OLTP objects.
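A corollary of the optimistic model, easy to verify although not shown in the original demo: if Session 2 tries to update the same row under the SNAPSHOT hint while Session 1’s transaction is still open, it does not block; it fails immediately with an update conflict (error 41302).

-- Session 2, while Session 1's update transaction is still open
BEGIN TRANSACTION
UPDATE dbo.SalesOrder_inmem WITH (SNAPSHOT)
SET amount = 20000
WHERE order_id = 6
-- Fails at once with error 41302: the transaction attempted to update a
-- record that has been updated since the current transaction started.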

I hope you were able to get a clearer understanding of how InMemory tables work in an optimistic concurrency model. Do let me know if you learned something new, or if I have missed anything that you would like more clarity on.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – dbForge Object Search for SQL Server

SSMS is a nice tool for database developers. However, it is far from ideal, because some routine operations in SSMS are extremely inconvenient to perform.

Very often, developers face the challenge of searching a database for all occurrences of an object, column, or variable, or searching all objects in which a given value occurs, at once. If you have ever had to solve a similar problem, you know it is not trivial, and Ctrl+F will not help here.

Of course, you can write a quick metadata query:

SELECT SCHEMA_NAME(o.[schema_id]), o.name, o.type_desc, m.[definition]
FROM sys.sql_modules m
JOIN sys.objects o ON m.[object_id] = o.[object_id]
WHERE [definition] LIKE '%PersonID%'
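The module-definition query above only searches code; a complementary sketch (reusing the same 'PersonID' search term) finds tables and views whose column names match:

-- Find objects that contain a column whose name matches the search term
SELECT SCHEMA_NAME(o.[schema_id]), o.name, c.name
FROM sys.columns c
JOIN sys.objects o ON c.[object_id] = o.[object_id]
WHERE c.name LIKE '%PersonID%'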

However, not all developers know the metadata structure … and why waste time writing a query if you already have an excellent free plug-in that helps to effectively search the wilds of DDL?

Though dbForge Object Search for SQL Server was released not so long ago, it has already taken pride of place in my toolkit.

This plug-in impresses me with its simplicity of use: type text in the search box and press Enter.


All the search results are displayed in a table that supports filtering. When you select an object, its DDL is displayed below, and not as plain text: the plug-in provides convenient syntax highlighting.

If you need to restrict your search, you can configure the filtering by object type. For example, we can search only within stored procedures and triggers.


Additional filters allow you to search much faster.

dbForge Object Search for SQL Server does not cache information between searches and accesses the metadata directly. For me, this is definitely a plus, especially during active development when the database schema is continuously updated: you don’t need to constantly press Refresh cache to get proper search results.

If necessary, you can search multiple databases at the same time.


The plug-in supports navigation. Simply select the context menu command Find in Database Explorer, and you will automatically jump to the found object.


When working with this plug-in, I discovered some pleasant touches. For example, previous search queries are saved in the search history.


When it comes to object search, doing it in SQL Server Management Studio is rather inconvenient: queries for this task are inefficient and require deep knowledge of SQL Server system objects. By contrast, dbForge Object Search for SQL Server does the task brilliantly.

Reference: Pinal Dave (http://blog.sqlauthority.com)