SQLAuthority News – Microsoft Whitepaper – Migrating Content Between Report Servers

A very common challenge developers and administrators face is migrating content between SQL Server Reporting Services report servers. This is because SQL Server Reporting Services (SSRS) currently doesn’t include a feature for migrating content items and settings from one report server to another. There are two different methods to overcome this challenge: 1) create RSS scripts that rs.exe uses to migrate Reporting Services content between report servers of the same mode or different modes, or 2) run the Reporting Services Migration Tool to migrate content from a Native mode report server to a SharePoint mode report server.
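
To give a feel for the first method, here is a minimal sketch of driving rs.exe from a script. The RSS file name, server URLs, and variable name are placeholders; only the -i, -s, -e, and -v switches belong to rs.exe itself.

```python
# Minimal sketch of method 1: driving rs.exe from a script. The RSS file
# name, server URLs, and variable name are placeholders; the -i, -s, -e,
# and -v switches belong to rs.exe itself.
import subprocess

result = subprocess.run(
    [
        "rs.exe",
        "-i", "MigrateContent.rss",                 # hypothetical RSS (VB.NET) script
        "-s", "http://sourceserver/reportserver",   # report server the script runs against
        "-e", "Mgmt2010",                           # management SOAP endpoint
        "-v", "targetUrl=http://targetserver/reportserver",  # variable passed to the script
    ],
    capture_output=True,
    text=True,
)
print(result.stdout)
if result.returncode != 0:
    raise RuntimeError(result.stderr)
```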

The white paper discusses the Reporting Services Migration Tool and various aspects of the migration, such as the location of the tool, its supportability, a list of the content it migrates, the migration steps, and known issues with the tool. You can download the tool from here.

Click here to download and read the white paper

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQLAuthority News – Microsoft Whitepaper – Idle Connection Resiliency

Developers can write data access applications that enable idle connection resiliency with the .NET Framework. An idle connection is one that is active but is not executing a command or waiting for data. It is very important to understand how an idle connection reconnects in the .NET Framework with SQL Server, and this white paper discusses exactly that in very simple words. To enable idle connection resiliency, the user has to connect either to SQL Server 2014 or to Microsoft Azure SQL Database.

Here is a very interesting example of idle connection resiliency, provided in the Overview section of the whitepaper.

Let’s imagine that you are a roaming worker who needs to use an Access application to connect to SQL Server. When you move from meeting to meeting, you normally close your notebook’s lid. Working online, every time this happens you may end up disconnected, either because your notebook sleeps or due to blind wireless spots in your building. To avoid the hassle of being disconnected, you may choose to avoid certain places (like elevators, for example) and walk with your notebook’s lid open. Now, imagine if you could close your lid, walk anywhere in your building (even take the elevator), arrive at your next meeting, open your lid, and find your work there, waiting for you to continue. To address this and other scenarios where an idle connection drops, SQL Server introduced a new feature called Idle Connection Resiliency.

Well, that’s it. This white paper describes the internal workings of Idle Connection Resiliency. It further discusses the client’s idle connection reconnect logic, client session state handling and replay logic, non-recoverable session states, and general considerations.
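
To make this concrete, here is a minimal sketch of turning the feature on from a client, assuming an ODBC driver for SQL Server that implements connection resiliency through the ConnectRetryCount and ConnectRetryInterval keywords; the server, database, and credentials are placeholders.

```python
# Minimal sketch, assuming an ODBC driver for SQL Server that implements
# connection resiliency via the ConnectRetryCount/ConnectRetryInterval
# keywords. Server, database, and credentials are placeholders.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"  # placeholder server
    "Database=mydb;Uid=myuser;Pwd=mypassword;"        # placeholder database/credentials
    "Encrypt=yes;"
    "ConnectRetryCount=3;"      # how many times to try to revive a dropped idle connection
    "ConnectRetryInterval=10;"  # seconds to wait between those attempts
)

conn = pyodbc.connect(conn_str)
# If the idle connection drops (notebook sleep, wireless dead spot, failover),
# the driver reconnects and replays the recoverable session state before the
# next command executes, so the code below never notices the interruption.
row = conn.cursor().execute("SELECT @@SERVERNAME").fetchone()
print(row[0])
```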

Click here to read the white paper on Idle Connection Resiliency.

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQLAuthority News – 4 Whitepapers on Windows Azure

Windows Azure is getting quite a lot of traction in the industry, and I often get questions about whether there is authentic learning material on this subject. Microsoft has published the following four white papers, which cover many interesting topics related to SQL Server and Windows Azure.

Application Patterns and Development Strategies for SQL Server in Windows Azure Virtual Machines
You can develop many types of n-tier applications by separating the components of the different application tiers onto different machines as well as into separate components. This article explains the different application patterns to use for your SQL Server based applications in the Windows Azure environment. In addition, the article discusses Windows Azure specific development strategies so that you can design your applications correctly.

Deploy SQL Server Business Intelligence in Windows Azure Virtual Machines
This article describes and walks you through the creation of a multi-server deployment of SQL Server Business Intelligence features, in a Windows Azure Virtual Machines environment. The document focuses on the use of Windows PowerShell scripts for each step of the configuration and deployment process.

How to backup and restore Windows system disk in a Windows Azure IaaS
A comparison of different methods of backup and restore in an IaaS environment.

Performance Guidance for SQL Server in Windows Azure Virtual Machines
This article discusses the key factors to consider when evaluating performance and planning a migration to SQL Server in Windows Azure Virtual Machines. It also provides certain best practices and techniques for performance tuning and troubleshooting when using SQL Server in Windows Azure Infrastructure Services.

All the text is from MSDN.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQLAuthority News – Whitepaper – Plan Caching and Recompilation in SQL Server 2012

The Plan Caching and Recompilation in SQL Server 2012 whitepaper has been a favorite paper of mine for a long time. Plan caching and recompilation are two of the best concepts, and they are explained in depth by my favorite author, Greg Low. I have met Greg several times, and I have been a big fan of his writing and his ability to make complex topics very easy. The same white paper was earlier available for SQL Server 2005 and 2008.

This paper explains how SQL Server allocates memory for plan caching, how query batches are cached, and suggests best practices for maximizing reuse of cached plans. It also explains scenarios in which batches are recompiled, and gives best practices for reducing or eliminating unnecessary recompilations and for minimizing plan cache pollution. The white paper explains SQL Server’s statement-level recompilation feature and many tools and utilities that are useful for observing query compilation, query recompilation, plan caching, and plan reuse. (Abstract from the whitepaper.)
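
If you want to see plan caching and reuse with your own eyes while reading, here is a minimal sketch that queries the plan cache DMVs the paper discusses; the connection details are placeholders.

```python
# Minimal sketch: observe plan caching and reuse by querying the plan-cache
# DMVs the whitepaper describes. Connection details are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=localhost;Database=master;Trusted_Connection=yes;"
)

# usecounts > 1 indicates a cached plan that was reused rather than recompiled.
query = """
SELECT TOP (10)
       cp.usecounts,     -- how many times the plan was reused
       cp.cacheobjtype,  -- e.g. 'Compiled Plan'
       cp.objtype,       -- 'Adhoc', 'Prepared', 'Proc', ...
       st.text AS sql_text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
ORDER BY cp.usecounts DESC;
"""

for row in conn.cursor().execute(query):
    print(row.usecounts, row.objtype, row.sql_text[:60])
```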

I strongly encourage everybody to read this whitepaper if you are interested in performance tuning or in learning the essentials of query optimization.

Download the Plan Caching and Recompilation in SQL Server 2012 Whitepaper.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQLAuthority News – Developing Multi-tenant Applications for the Cloud, 3rd Edition – Book Download

The cloud is changing the way the IT world is evolving. New startups think in terms of the cloud from their inception. There are new opportunities, as well as a few limitations, when it comes to the cloud. Some organizations believe using the cloud will give them an edge, and some use the word merely as a marketing term. Whatever the purpose, the cloud is here to stay. Having software on the cloud is now synonymous with performance and reliability. Honestly, just like every product and service, there are always best practices to follow for an optimal outcome.

Windows Azure provides Infrastructure as a Service (IaaS) support for both Windows Server and Linux operating systems. This means users now have a “pay as you go” model with built-in scalability and elasticity. Any product is successful only when it is properly implemented, and the challenge of implementation goes to the next level with a multi-tenant application. It is crucial to create and manage balance, as each tenant has different requirements, and each of those has to be balanced against the available resources. Security, partitioning, and management of a multi-tenant application on the cloud require expert guidance and proper direction.
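
As a tiny illustration of the partitioning challenge, here is a hypothetical sketch of the database-per-tenant pattern, where a catalog maps each tenant to its own database; every name and connection detail below is made up.

```python
# A hypothetical sketch of the database-per-tenant partitioning pattern;
# every name and connection detail here is made up for illustration.
import pyodbc

# Tenant catalog: each tenant gets its own database, which isolates tenant
# data (security) and lets each tenant be scaled or moved independently.
TENANT_CATALOG = {
    "fabrikam": "Server=tcp:pool1.database.windows.net;Database=fabrikam_db;",
    "contoso": "Server=tcp:pool2.database.windows.net;Database=contoso_db;",
}

def connect_for_tenant(tenant_id: str) -> pyodbc.Connection:
    """Resolve a tenant to its own database before any query runs."""
    base = TENANT_CATALOG[tenant_id]  # raises KeyError for unknown tenants
    return pyodbc.connect(
        "Driver={ODBC Driver 17 for SQL Server};" + base
        + "Uid=appuser;Pwd=apppassword;Encrypt=yes;"  # placeholder credentials
    )

# Usage: every request is served strictly from its own tenant's database.
conn = connect_for_tenant("fabrikam")
```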

Microsoft has recently released the 3rd edition of a book that addresses precisely this subject. I have been reading this book for a while and I find it quite interesting. There are a few interesting thoughts as well as some really good advice drafted in the book. Though the title of the book sounds very heavy, the book is very easy to read, and the caricatures in it make it very engaging.

Download Developing Multi-tenant Applications for the Cloud, 3rd Edition

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQLAuthority News – Download Whitepaper – SSIS Operational and Tuning Guide – SSIS for Azure and Hybrid Data Movement – Leveraging a Hadoop cluster from SSIS

Microsoft recently released three interesting whitepapers regarding SSIS. If you are an SSIS enthusiast and work with hybrid data, these three whitepapers are essential reference material.

I am listing them here together for quick reference. The abstracts are built from the content of the white papers.

SSIS Operational and Tuning Guide

When transferring data between a database and the cloud, the data is obviously in transit.  This involves multiple phases, including pre-production testing, data loading, and data synchronization.  Sounds complex?  SQL Server Integration Services (SSIS) is a tool created for moving data in and out of Windows Azure SQL Database, as part of an extract/transform/load (ETL) solution or as part of a data movement solution even when no transformations are needed.  It is effective whether your data is all in the cloud, all in your on-site database, or a mix of the two.

There are already many processes involved in storing and moving data through the cloud.  Not only are there many “moving pieces” involved in the transfer, but it is also necessary to adjust your performance tuning knowledge to apply to a system that is no longer completely self-contained, but is a shared resource for a larger pool of users.  It is important to understand the best practices for cloud sources and hybrid data moves.
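
For readers who automate SSIS from scripts, here is a minimal sketch of running a package through dtexec, the SSIS command-line runner; the package path and variable name are hypothetical, while /F (package file) and /SET (property override) are dtexec switches.

```python
# Minimal sketch of kicking off an SSIS package from a script via dtexec,
# the SSIS command-line runner. The package path and variable name are
# hypothetical; /F and /SET are dtexec switches.
import subprocess

result = subprocess.run(
    [
        "dtexec",
        "/F", r"C:\etl\LoadToAzure.dtsx",  # hypothetical package file
        "/SET", r"\Package.Variables[User::BatchDate].Value;2014-01-01",  # hypothetical variable
    ],
    capture_output=True,
    text=True,
)
print(result.stdout)
if result.returncode != 0:  # dtexec returns non-zero when the package fails
    raise RuntimeError(f"package failed with exit code {result.returncode}")
```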

SSIS for Azure and Hybrid Data Movement

SQL Server Integration Services (SSIS) can be used effectively as a tool for moving data to and from Windows Azure SQL Database, as part of the total extract, transform, and load (ETL) solution and as part of the data movement solution. The Windows Azure (WA) platform poses several challenges for SSIS, but it offers several solutions as well.  Projects that move data between cloud and on-site storage involve many processes within all available solutions.  SSIS can be used to move data between sources and destinations in the cloud, as well as in hybrid situations combining the two.  Because operating “in the cloud” can be extremely different from on-site database performance tuning, it can require all-new training to fully understand how best to use SSIS.

Leveraging a Hadoop cluster from SQL Server Integration Services (SSIS)

Remember when one gigabyte of data was an unheard-of amount?  Now systems deal with terabytes and petabytes – but not quickly.  Querying this much data can take hours, especially when it is stored unstructured in Hadoop.  However, the same data can be stored, structured, in SQL Server and queried in seconds.  Thus, there is a need for data transfer between Hadoop and SQL Server.  SSIS, an ETL tool, can be used to automate Hadoop and non-Hadoop jobs and manage data transfers.

Microsoft has coordinated with the Hadoop community to allow Hadoop to run on Windows Server and Azure, integrating with the rest of the Microsoft platform.  This allows users to move data into or out of any Windows program, like Excel or Word.  SSIS is another Windows tool that allows easy communication between Hadoop and SQL Server, as in this example.  This integration is sure to become a useful tool in any database administrator’s tool belt, and ought to be learned as early as possible.
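
As a rough illustration of such a transfer outside SSIS itself, here is a hypothetical sketch that reads rows from Hive over ODBC and bulk-inserts them into SQL Server; both DSNs, the table names, and the query are made up, and a Hive ODBC driver would have to be installed for the first connection to work.

```python
# A hypothetical sketch of a Hadoop-to-SQL-Server transfer: read rows from
# Hive through an ODBC DSN and bulk-insert them into SQL Server. Both DSN
# names, the tables, and the query are made up for illustration.
import pyodbc

hive = pyodbc.connect("DSN=HiveCluster", autocommit=True)         # hypothetical Hive DSN
sqlsrv = pyodbc.connect("DSN=SqlServerTarget", autocommit=False)  # hypothetical SQL Server DSN

rows = hive.cursor().execute(
    "SELECT page_url, hit_count FROM web_logs"  # hypothetical Hive table
).fetchall()

cursor = sqlsrv.cursor()
cursor.fast_executemany = True  # speeds up the parameterized bulk insert
cursor.executemany(
    "INSERT INTO dbo.WebLogs (page_url, hit_count) VALUES (?, ?)",
    [tuple(r) for r in rows],
)
sqlsrv.commit()
```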

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQLAuthority News – Download Whitepaper – Cleanse and Match Master Data by Using EIM

Master Data Services (MDS) and Data Quality Services (DQS) go hand in hand when it comes to maintaining the integrity of a database. If you are new to either concept, I suggest you read the following two articles to get an idea about them.

Why Do We Need Master Data Management:
MDM was hailed as a major improvement for business intelligence. MDM comes into play because it combs through mountains of data and makes sure that all the information is consistent, accurate, and placed in one database, so that employees don’t have to search high and low and waste their time.

Why Do We Need Data Quality Services:
There are two parts of Data Quality Services that help you accomplish all these neat things.  The first part is DQS Server, which you can think of as the back-end engine of the system. DQS Client is the user interface that you interact with to set the rules and check over your data.

MDM working along with DQS

To help you understand how Master Data Services and Data Quality Services work together to ensure high-quality master data, this paper looks at four common implementations of these tools: one for entering new master data in a new Master Data Services entity, and three for updating existing master data as new data comes in.

  • Create and build a new master data model and an entity within the model that will contain the data for a field, using Master Data Services and Data Quality Services
  • Update an existing master data entity in an EIM automated lookup scenario using Integration Services, Master Data Services, and Data Quality Services (see the sketch after this list)
  • Include checks for close matches when updating an existing master data entity, by adding fuzzy matching to the EIM automated update scenario
  • Perform matching on an existing entity from within the Master Data Services Add-in for Excel.
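
To make the automated-update scenario a little more concrete, here is a minimal sketch that follows the SQL Server 2012 MDS entity-based staging conventions (a stg.<entity>_Leaf table plus a stg.udp_<entity>_Leaf procedure); the Customer entity, MDS database name, and batch tag are hypothetical.

```python
# Minimal sketch of the automated-update pattern, following the SQL Server 2012
# MDS entity-based staging conventions. The Customer entity, MDS database name,
# and batch tag are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=localhost;Database=MDS;Trusted_Connection=yes;"  # placeholder
)
cursor = conn.cursor()

# ImportType 0 = create new members and update changed attribute values;
# ImportStatus_ID 0 = record is ready for the staging process to pick up.
cursor.execute(
    """
    INSERT INTO stg.Customer_Leaf (ImportType, ImportStatus_ID, BatchTag, Code, Name)
    VALUES (0, 0, ?, ?, ?)
    """,
    ("nightly-load", "CUST-0001", "Contoso Ltd"),
)

# Kick off the staging process for this batch against the VERSION_1 model version.
cursor.execute(
    "EXEC stg.udp_Customer_Leaf @VersionName = ?, @LogFlag = 1, @BatchTag = ?",
    ("VERSION_1", "nightly-load"),
)
conn.commit()
```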

Download Whitepaper – Cleanse and Match Master Data by Using EIM

Reference: Pinal Dave (http://blog.sqlauthority.com)