SQLAuthority News – Developing Multi-tenant Applications for the Cloud, 3rd Edition – Book Download

The cloud is changing how the IT world evolves. New startups think in terms of the cloud from their inception. The cloud brings new opportunities as well as a few limitations. Some organizations believe using the cloud will give them an edge, while others use the word merely as a marketing term. Whatever the purpose, the cloud is here to stay. Having software in the cloud is now synonymous with performance and reliability. Honestly, just like every product and service, there are best practices to follow for an optimal outcome.

Windows Azure provides Infrastructure as a Service (IaaS) support for both Windows Server and Linux operating systems. This means users now have a "pay as you go" model with built-in scalability and elasticity. Any product is only as successful as its implementation, and the challenge of implementation goes to the next level with a multi-tenant application. It becomes crucial to create and manage balance, as each tenant has different requirements and each of those has to be balanced against the available resources. Securing, partitioning, and managing a multi-tenant application in the cloud requires expert guidance and proper direction.

Microsoft has recently released the 3rd edition of a book that addresses precisely this subject. I have been reading it for a while and find it quite interesting. There are a few interesting thoughts as well as some really good advice in the book. Though the title is heavy, the book is easy to read, and the caricatures used throughout make it very engaging.

Download Developing Multi-tenant Applications for the Cloud, 3rd Edition

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQLAuthority News – Download Whitepaper – SSIS Operational and Tuning Guide – SSIS for Azure and Hybrid Data Movement – Leveraging a Hadoop cluster from SSIS

Microsoft has recently released three interesting whitepapers regarding SSIS. If you are an SSIS enthusiast and work with hybrid data, these three whitepapers are essential reference material.

I am listing them here together for quick reference. The abstracts are drawn from the content of the whitepapers.

SSIS Operational and Tuning Guide

When transferring data between a database and the cloud, the data is obviously in transit.  This involves multiple phases, including pre-production testing, data loading, and data synchronization.  Sound complex?  SQL Server Integration Services (SSIS) is a tool created for moving data in and out of Windows Azure SQL Database, either as part of an extract/transform/load (ETL) solution or as part of data movement even when no transformations are needed.  It is effective whether your data is all in the cloud, all in your on-site database, or a mix of the two.

There are many processes involved in storing and moving data through the cloud.  Not only are there many "moving pieces" involved in the transfer, but it is also necessary to adjust your performance tuning knowledge to apply to a system that is no longer completely self-contained, but is instead a shared resource for a greater pool of users.  It is important to understand the best practices for cloud sources and hybrid data movement.
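For example, the kind of buffer tuning this guide discusses can be applied at execution time. A minimal sketch, assuming a hypothetical package LoadToAzure.dtsx with a data flow task named Move Data (the property path and values here are illustrative, not taken from the whitepaper):

REM Run the package and override data flow buffer settings at execution time
dtexec /F "LoadToAzure.dtsx" ^
    /SET "\Package\Move Data.Properties[DefaultBufferMaxRows]";50000 ^
    /SET "\Package\Move Data.Properties[DefaultBufferSize]";20971520

Whether these particular values help depends on row width and available memory, which is exactly the kind of guidance the paper provides.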

SSIS for Azure and Hybrid Data Movement

SQL Server Integration Services (SSIS) can be used effectively as a tool for moving data to and from Windows Azure SQL Database, as part of a total extract, transform, and load (ETL) solution and as part of a data movement solution. The Windows Azure (WA) platform poses several challenges for SSIS, but offers several solutions as well.  Projects that move data between cloud and on-site storage involve many processes within all available solutions.  SSIS can be used to move data between sources and destinations in the cloud, as well as in hybrid situations combining the two.  Because operating "in the cloud" can be extremely different from on-site database performance tuning, it can require new training to fully understand how best to use SSIS.
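One recurring recommendation for Azure SQL Database is to make each load step safe to retry after a transient connection drop. A minimal T-SQL sketch of that idea, using hypothetical stg.Orders and dbo.Orders tables that are not from the whitepaper:

-- Idempotent load: re-running the batch updates rows instead of duplicating them
MERGE dbo.Orders AS target
USING stg.Orders AS source
    ON target.OrderID = source.OrderID
WHEN MATCHED THEN
    UPDATE SET target.Amount = source.Amount,
               target.OrderStatus = source.OrderStatus
WHEN NOT MATCHED BY TARGET THEN
    INSERT (OrderID, Amount, OrderStatus)
    VALUES (source.OrderID, source.Amount, source.OrderStatus);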

Leveraging a Hadoop cluster from SQL Server Integration Services (SSIS)

Remember when one gigabyte of data was an unheard-of amount?  Now systems deal with terabytes and petabytes – but not quickly.  Querying this much data can take hours, especially when it is stored unstructured in Hadoop.  However, the same data can be stored, structured, in SQL Server and queried in seconds.  Thus, there is a need for data transfer between Hadoop and SQL Server.  SSIS, an ETL tool, can be used to automate Hadoop and non-Hadoop jobs and manage data transfers.

Microsoft has worked with the Hadoop community to allow Hadoop to run on Windows Server and Windows Azure, integrating it with the rest of the Microsoft platform.  This allows users to move data into or out of familiar Windows programs, like Excel.  SSIS is another Windows tool that allows easy communication between Hadoop and SQL Server.  This integration is sure to become a useful tool in any database administrator's tool belt, and ought to be learned as early as possible.
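To give a flavor of the Hadoop side, here is a minimal HiveQL sketch; the table, columns, and HDFS path are hypothetical, and an SSIS package could run such a script (for instance, through an Execute Process Task that invokes the Hive command line) before pulling the small aggregated result into SQL Server:

-- Expose raw tab-delimited log files in HDFS as a queryable Hive table
CREATE EXTERNAL TABLE web_logs (log_time STRING, url STRING, user_id STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/weblogs';

-- Aggregate inside the Hadoop cluster; only the result set moves to SQL Server
SELECT url, COUNT(*) AS hits
FROM web_logs
GROUP BY url;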

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQLAuthority News – Download Whitepaper – Cleanse and Match Master Data by Using EIM

Master Data Services (MDS) and Data Quality Services (DQS) go hand in hand when maintaining the integrity of a database. If you are new to either concept, I suggest you read the following two articles to get an idea about them.

Why Do We Need Master Data Management:
MDM was hailed as a major improvement for business intelligence. MDM comes into play because it combs through mountains of data and makes sure that all the information is consistent, accurate, and placed in one database, so that employees don't have to search high and low and waste their time.

Why Do We Need Data Quality Services:
There are two parts of Data Quality Services that help you accomplish all these neat things.  The first part is DQS Server, which you can think of as the server-side engine of the system. DQS Client is the user interface that you interact with to set the rules and check over your data.

MDS working together with DQS

To help you understand how Master Data Services and Data Quality Services work together to ensure high-quality master data, this paper looks at four common implementations of these tools: one for entering new master data in a new Master Data Services entity, and three for updating existing master data as new data comes in (a simplified staging sketch appears after the list):

  • Create and build a new master data model and an entity within the model that will contain the data for a field, using Master Data Services and Data Quality Services
  • Update an existing master data entity in an EIM automated lookup scenario using Integration Services, Master Data Services, and Data Quality Services
  • Include checks for close matches when updating an existing master data entity, by adding fuzzy matching to the EIM automated update scenario
  • Perform matching on an existing entity from within the Master Data Services Add-in for Excel.
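For reference, the "new master data" scenario boils down to MDS staging tables and their companion stored procedures. A minimal T-SQL sketch, assuming a hypothetical Customer entity (MDS generates stg.Customer_Leaf and stg.udp_Customer_Leaf when the entity is created; the batch tag and member values are made up):

-- Stage a new leaf member for the hypothetical Customer entity
INSERT INTO stg.Customer_Leaf (ImportType, ImportStatus_ID, BatchTag, Code, Name)
VALUES (0, 0, N'Batch_001', N'C0001', N'Contoso Traders');

-- Ask MDS to process the staged batch into the entity
EXEC stg.udp_Customer_Leaf
    @VersionName = N'VERSION_1',
    @LogFlag = 1,
    @BatchTag = N'Batch_001';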

Download Whitepaper – Cleanse and Match Master Data by Using EIM

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQLAuthority News – Download Whitepaper – SQL Server Analysis Services to Hive

SQL Server Analysis Services is a very interesting subject and I have always enjoyed learning about it. You can read my earlier article over here. Big Data is my new interest and I have been exploring it recently. This weekend, this blog post caught my attention and I enjoyed reading it.

Big Data is the next big thing. Its growth is predicted to be 60% per year until 2016. There is no single solution to the growing needs of big data available in the market right now, nor is there one solution in the business intelligence ecosystem. However, the need for a solution is ever increasing. I am personally a Klout user. You can see my Klout profile over here. I do understand what Klout is trying to achieve – a single place to measure a person's influence. However, it works a bit mysteriously. There are plenty of social media sites in the internet world today. The biggest problem all of them face is that everybody opens an account but hardly anyone logs back in. To overcome this issue and bring visitors back, Klout has come up with a system where visitors can give K+ to other users in a particular area. Looking at all the activities Klout performs, it is indeed a big consumer of Big Data, as well as an early adopter of big data and Hadoop-based systems.

Klout has close to 1 trillion rows of data to analyze, as well as a warehouse approaching a thousand terabytes. Hive, the data warehouse infrastructure for Hadoop, supports ad-hoc queries using HiveQL, but there are better solutions for interactive analytics. The alternative solution is to use SQL Server Analysis Services (SSAS) along with HiveQL. As there is no direct method to achieve this, a few common workarounds are already in place. A new Hive ODBC driver has broken through this limitation, and the SQL Server relational engine can be used as an intermediate stage in front of SSAS.
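That pass-through pattern can be sketched in T-SQL. Assuming a system ODBC DSN named HiveDSN has been configured over the Hive ODBC driver (the DSN, linked server, and table names here are hypothetical):

-- Define a linked server over the Hive ODBC DSN using the MSDASQL provider
EXEC master.dbo.sp_addlinkedserver
    @server = N'HIVE',
    @srvproduct = N'HIVE',
    @provider = N'MSDASQL',
    @datasrc = N'HiveDSN';

-- Pass the query through to Hive; only the result set returns to SQL Server
SELECT *
FROM OPENQUERY(HIVE, 'SELECT user_id, COUNT(*) AS actions FROM events GROUP BY user_id');

SSAS can then be pointed at the SQL Server relational engine, which acts as the intermediate stage described above.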

This white paper discusses these same solutions in depth, covering the following important concepts:

  • The Klout Big Data solution
  • Big Data Analytics based on Analysis Services
  • Hadoop/Hive and Analysis Services integration
  • Limitations of direct connectivity
  • Pass-through queries to linked servers
  • Best practices and lessons learned

This white paper discusses all the important concepts that have enabled Klout to take its offerings to the next level, as well as improve efficiency with a few out-of-the-box solutions. I personally enjoyed reading this white paper and I encourage all of you to do the same.

SQL Server Analysis Services to Hive

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQLAuthority News – Download Whitepaper – Power View Infrastructure Configuration and Installation: Step-by-Step and Scripts

Power View, a feature of SQL Server 2012 Reporting Services Add-in for Microsoft SharePoint Server 2010 Enterprise Edition, is an interactive data exploration, visualization, and presentation experience. It provides intuitive ad-hoc reporting for business users such as data analysts, business decision makers, and information workers. Microsoft has recently released a very interesting whitepaper which covers a sample scenario that validates the connectivity of Power View reports to both PowerPivot workbooks and tabular models. This white paper covers the following important concepts about Power View:

  • Understanding the hardware and software requirements and their download locations
  • Installing and configuring the required infrastructure when Power View and its data models are on the same computer and on different computers
  • Installing and configuring a computer used for client access to Power View reports, models, SharePoint 2010, and Power View in a workgroup
  • Configuring single sign-on access for double-hop scenarios with and without Kerberos

You can download the whitepaper from here.

This whitepaper covers many interesting scenarios. It would be really interesting to know if you are using Power View in your production environment. If yes, please share your experience over here.

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQLAuthority News – Weekend Experiment with NuoDB – Points to Ponder and Whitepaper

This weekend I downloaded the latest beta version of NuoDB. I found it much improved, with a better UI. I was very impressed, as the installation was very smooth and I was up and running with the product in less than 5 minutes. The tools related to the administration of NuoDB seem to have received a makeover in this beta release. As per their claim, they now support the Solaris platform and have improved the native MacOS installation. I have neither a Mac nor Solaris – I wish I could have experimented with them. I would appreciate it if anyone out there can confirm how the installation goes on these platforms.

I have previously blogged about my experiment with NuoDB here:

I am very impressed with the product so far and have decided to understand it in greater depth. Here are a few of the questions I am going to try to find answers to with regard to NuoDB. Just so it is clear – NuoDB is not NoSQL; as a matter of fact, it follows all the ACID properties of a database.

  • If ACID properties are crucial, why do many NoSQL products not adhere to them? (There are a few out there that do follow ACID, but not all.)
  • I understand database scalability, but is elasticity crucial for a database, and if so, how? (Elasticity is needed when the workload on the database fluctuates heavily and more than a single database server is called for.)
  • How has NuoDB built a scalable, elastic, and 100% ACID-compliant database that supports multiple platforms?
  • How does NoSQL compare to NuoDB's new architecture?

In the coming weeks, I am going to explore the above concepts and dive deeper into understanding them. Meanwhile, I have read the following white paper, written by experts at the University of California at Santa Barbara: Database Scalability, Elasticity, and Autonomy in the Cloud. It is a very interesting read and a great starter on the subject.

Additionally, since my questions also touch on NoSQL, this weekend I started to learn about NoSQL from Pluralsight's online learning library. I will share my experience very soon.

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQLAuthority News – 2 Whitepapers Announced – AlwaysOn Architecture Guide: Building a High Availability and Disaster Recovery Solution

Understanding the AlwaysOn architecture is extremely important when building a solution with failover clusters and availability groups. Microsoft has just released two very important white papers on this subject. Both are written by top experts in the industry and have been reviewed by an excellent panel of experts. Every time I talk with organizations adopting SQL Server 2012, they are excited about the new AlwaysOn feature. One request I often hear is for detailed documentation that can help enterprises build a robust high availability and disaster recovery solution. I believe the following two white papers now satisfy that request.

AlwaysOn Architecture Guide: Building a High Availability and Disaster Recovery Solution by Using AlwaysOn Availability Groups

SQL Server 2012 AlwaysOn Availability Groups provide a unified high availability and disaster recovery (HADR) solution. This paper details the key topology requirements of this specific design pattern, including important concepts like quorum configuration considerations, the steps required to build the environment, and a workflow that shows how to handle disaster recovery.
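To give a sense of what the paper walks through, here is a minimal T-SQL sketch of creating an availability group. The group, database, replica, and endpoint names are hypothetical, and the prerequisites the paper covers (a WSFC cluster, the AlwaysOn feature enabled, mirroring endpoints, full recovery model with backups) are assumed to be in place:

-- Create a two-replica availability group for a hypothetical SalesDB
CREATE AVAILABILITY GROUP [AG_Sales]
FOR DATABASE [SalesDB]
REPLICA ON
    N'SQLNODE1' WITH (
        ENDPOINT_URL = N'TCP://sqlnode1.contoso.com:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC),
    N'SQLNODE2' WITH (
        ENDPOINT_URL = N'TCP://sqlnode2.contoso.com:5022',
        AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,
        FAILOVER_MODE = MANUAL);

-- Then, on the secondary instance:
-- ALTER AVAILABILITY GROUP [AG_Sales] JOIN;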

AlwaysOn Architecture Guide: Building a High Availability and Disaster Recovery Solution by Using Failover Cluster Instances and Availability Groups

SQL Server 2012 AlwaysOn Failover Cluster Instances (FCI) and AlwaysOn Availability Groups provide a comprehensive high availability and disaster recovery solution. This paper details the key topology requirements of this specific design pattern, including important concepts like asymmetric storage considerations, quorum model selection, quorum votes, the steps required to build the environment, and a workflow.

Even if you are not going to implement the AlwaysOn feature, these two whitepapers are still great reference material to review, as they will give you a complete idea of what it takes to implement the AlwaysOn architecture and what kind of effort is needed. One should at least bookmark the above two white papers for future reference.

Reference: Pinal Dave (http://blog.sqlauthority.com)