SQLAuthority News – Download Whitepaper – Power View Infrastructure Configuration and Installation: Step-by-Step and Scripts

Power View, a feature of the SQL Server 2012 Reporting Services Add-in for Microsoft SharePoint Server 2010 Enterprise Edition, is an interactive data exploration, visualization, and presentation experience. It provides intuitive ad-hoc reporting for business users such as data analysts, business decision makers, and information workers. Microsoft has recently released a very interesting whitepaper which covers a sample scenario that validates the connectivity of Power View reports to both PowerPivot workbooks and tabular models. This white paper covers the following important concepts about Power View:

  • Understanding the hardware and software requirements and their download locations
  • Installing and configuring the required infrastructure when Power View and its data models are on the same computer and on different computers
  • Installing and configuring a computer used for client access to Power View reports, models, SharePoint 2010, and Power View in a workgroup
  • Configuring single sign-on access for double-hop scenarios with and without Kerberos

You can download the whitepaper from here.

This whitepaper covers many interesting scenarios. It would be really interesting to know if you are using Power View in your production environment. If yes, please share your experience in the comments.

Reference: Pinal Dave (http://blog.SQLAuthority.com)


SQL SERVER – Step by Step Guide to Beginning Data Quality Services in SQL Server 2012 – Introduction to DQS

Data Quality Services is a very important feature of SQL Server. I have recently started to explore it and I am really learning some good concepts. There are two very important earlier blog posts which one should go over before continuing with this one.

This article is an introduction to Data Quality Services for beginners. We will be using an Excel file as the source data.


In the first article we learned how to install DQS. In this article we will see how to build a Knowledge Base and use it to identify the quality of the data as well as to correct bad-quality data.

Here are the two important steps we will go through in this tutorial.

  • Building a New Knowledge Base 
  • Creating a New Data Quality Project

Let us start building the Knowledge Base. Click on New Knowledge Base.

In our project we will be using an Excel file as the source for the knowledge base. There are two columns: one is Colors and the other is Shade. They are independent columns and not related to each other. The point I am trying to show is that Column A contains unique data while Column B contains duplicate records.


Clicking on New Knowledge Base will bring up the following screen. Enter the name of the new knowledge base.

Clicking NEXT will bring up the following screen, which allows you to select the Excel file and the source columns. I have selected both Colors and Shade as source columns.

Creating a domain is very important. Here you can create an individual domain for each column or a composite domain built from Colors and Shade.

As this is the first example, I will create individual domains – for Colors I will create the domain Colors, and for Shade I will create the domain Shade.

Here is how the screen will look after creating the domains.

Clicking NEXT will bring you to the following screen where you can do the data discovery. Clicking START will start processing the source data provided.

The pre-processed data shows various information related to the source data. In our case it shows that the Colors column has unique data, whereas the Shade column has non-unique data with only two unique values.

On the next screen you can add more rows as well as see the frequency of the data, since the unique values are listed along with how often they occur.

Clicking NEXT will publish the knowledge base which we have just created.

Now the knowledge base is created. We will take some random data and attempt to run DQS over it. I am using another Excel sheet here for simplicity. In reality you can easily use a SQL Server table for the same purpose.
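
If you prefer to keep that source data in SQL Server instead of Excel, a minimal sketch like the following would do. The table name, column types, and sample rows (including the deliberate misspellings) are my own assumptions for illustration:

  -- Hypothetical staging table mirroring the two-column Excel sheet used above.
  -- The rows are made-up examples; the misspelled shades are there on purpose
  -- so DQS has something to flag and correct.
  CREATE TABLE dbo.ColorsToCleanse
  (
      Colors NVARCHAR(50),
      Shade  NVARCHAR(50)
  );

  INSERT INTO dbo.ColorsToCleanse (Colors, Shade)
  VALUES (N'Red',    N'Dark'),
         (N'Blue',   N'Light'),
         (N'Green',  N'Darkk'),  -- deliberate misspelling
         (N'Yellow', N'Ligth');  -- deliberate misspelling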

Click on New Data Quality Project to start a new DQS project.

The next screen will ask which knowledge base to use. We will be using the Colors knowledge base which we have just created.

In the Colors knowledge base we had two domains – 1) Colors and 2) Shade. In our case we will use both mappings. The user can select one or multiple column mappings here.

Now comes the most important phase of the complete project. Click on Start and it will run the cleansing process and show various results.

In our case there were two columns to be processed, and it completed the task and reported the necessary information. It shows that in the Colors column it has not corrected any value by itself, but for the Shade column it has made a suggestion. We can train DQS to correct values, but let us keep that subject for future blog posts.

Now click NEXT and keep the Colors domain selected on the left side. It shows that there are two incorrect values which need to be corrected. This is also the place where, once a value is corrected, it will be auto-corrected in the future.

I manually corrected the values here and clicked the Approve radio buttons. As soon as I click Approve, the rows disappear from this tab and move to the Corrected tab. If I had clicked Reject, the rows would have moved to the Invalid tab instead.

On this screen you can see the 2 corrected rows. You can click on the Correct tab and see the 6 previously validated rows which passed the DQS process.

Now let us click on the Shade domain on the left side of the screen. This domain shows very interesting details: the DQS system guessed the correct answer as Dark with a confidence level of 77%. That is quite a high confidence level, and manual observation also confirms that Dark is the correct answer. I clicked on Approve and the row moved to the Corrected tab.

On the next screen DQS shows a summary of all the activities. It also demonstrates how the data quality corrections were performed. The user can export their data to a SQL Server table, CSV file, or Excel.

The user also has an option to export either the data with all the associated cleansing info or the data only. I will select Data only for demonstration purposes.
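
As a side note, if you export the data along with the cleansing info to a SQL Server table, a quick query can confirm what DQS changed. This is a sketch only – the destination table name is made up, and the _Source/_Output/_Status column suffixes are an assumption about how DQS names the exported columns, so verify against your own export:

  -- Sketch only: dbo.ColorsCleansed is an assumed export destination, and the
  -- column suffixes are an assumption about DQS naming conventions.
  SELECT Shade_Source,   -- original value
         Shade_Output,   -- value after cleansing
         Shade_Status    -- e.g. Correct, Corrected, Suggested
  FROM   dbo.ColorsCleansed
  WHERE  Shade_Source <> Shade_Output;  -- only the rows DQS changed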

Clicking Export will generate the file.

Let us open the generated file. It will look as follows – pretty complete and corrected.

Well, we have successfully completed the DQS process. The process is indeed very easy. I suggest you try this out yourself; you will find it very easy to learn. In future posts we will go over advanced concepts.

Are you using this feature on your production server? If yes, please leave a comment describing your environment and business need. It will be indeed interesting to see where it is implemented.

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQLAuthority News – Download Whitepaper – Choosing a Tabular or Multidimensional Modeling Experience in SQL Server 2012 Analysis Services

Data modeling is the most important task for any BI professional. As a matter of fact, the biggest challenge is organizing disparate data into an analytic model that effectively and efficiently supports reporting and analysis. SQL Server 2012 introduces the BI Semantic Model (BISM), a single model that can support a broad range of reporting and analysis while blending two Analysis Services modeling experiences behind the scenes.

  • Multidimensional modeling – enables BI professionals to create sophisticated multidimensional cubes using traditional online analytical processing (OLAP).
  • Tabular modeling – provides self-service data modeling capabilities to business and data analysts.

As data modeling evolves and business needs grow, new technologies and tools are emerging to help end users make the necessary adjustments to their reporting and analysis. This white paper provides practical guidance to help you decide which SQL Server 2012 Analysis Services modeling experience – tabular or multidimensional – is right for you.

Do let me know your opinion in a comment. In simple words – I would like to know when you would use tabular modeling and when you would use multidimensional modeling.

Download Choosing a Tabular or Multidimensional Modeling Experience in SQL Server 2012 Analysis Services

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQLAuthority News – Business Intelligence features of SQL Server 2012 RC0 – Download Virtual Machine

I am in front of a computer 16+ hours a day. However, I admit that I am a bit lazy when it comes to doing installations. I always prefer that the IT department set up my computer so I can use it right away. I get the same feeling when I have to install a Beta, RC, or any other pre-release version on my computer; I prefer virtual machines for this. Then again, I do not like to prepare virtual machines either. So the following news is very exciting: Microsoft has prepared a virtual machine for all of us with all the essential Business Intelligence features of SQL Server 2012 RC0. This virtual machine comes ready with Power View reports and PowerPivot Excel workbooks. Additionally, it has all the content packs which help you explore various business intelligence capabilities.

Power View:

Power View provides intuitive ad-hoc reporting for business users such as data analysts, business decision makers, and information workers. They can easily create and interact with views of data from models based on PowerPivot workbooks or tabular models deployed to SQL Server 2012 Analysis Services (SSAS) Release Candidate 0 (RC0) instances. Power View is a browser-based Silverlight application launched from SharePoint Server 2010.

PowerPivot:

PowerPivot for Microsoft Excel 2010 provides ground-breaking technology, such as fast manipulation of large data sets (often millions of rows), streamlined integration of data, and the ability to effortlessly share your analysis through Microsoft SharePoint 2010.

Download BI Virtual Machine

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER 2012 Editions – Highlights of The Cloud-Ready Information Platform

Microsoft has just announced SQL Server 2012 edition information on the official SQL Server 2012 site.

SQL Server 2012 will be available in three main editions:

  1. Enterprise
  2. Business Intelligence
  3. Standard

The other editions are Web, Developer and Express.

Here are the salient features of each edition:

Enterprise

  • Advanced high availability with AlwaysOn
  • High performance data warehousing with ColumnStore (see the sketch after the edition lists)
  • Maximum virtualization (with Software Assurance)
  • Inclusive of Business Intelligence edition’s capabilities

Business Intelligence

  • Rapid data discovery with Power View
  • Corporate and scalable reporting and analytics
  • Data Quality Services and Master Data Services
  • Inclusive of the Standard edition’s capabilities

Standard

  • Standard continues to offer basic database, reporting and analytics capabilities
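
As a quick illustration of the ColumnStore feature listed under Enterprise above, the SQL Server 2012 syntax for creating such an index looks roughly like the following sketch; the table and column names are made up:

  -- Hypothetical fact table; shows the SQL Server 2012 nonclustered
  -- columnstore index syntax used for data warehouse style queries.
  CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_ColumnStore
  ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount);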

There is a comparison chart of various other aspects of the above editions. Please refer to it here.

Additionally, SQL Server 2012 licensing is explained here.
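
Once you have an instance installed, a quick way to confirm which edition and build you are actually running is the SERVERPROPERTY function:

  -- Returns the edition (for example Enterprise, Business Intelligence, Standard)
  -- and the build number of the current SQL Server instance.
  SELECT SERVERPROPERTY('Edition')        AS Edition,
         SERVERPROPERTY('ProductVersion') AS ProductVersion;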

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – SSAS – Multidimensional Space Terms and Explanation

I was presenting a SQL Server session at one of the Tech Ed On Road events in India. I was asked a very interesting question during the ‘Stump the Speaker’ session. I am sharing it with all of you over here.

Question: Can you tell me in simple words what a dimension, a member, and the other terms of multidimensional space are? There is no simple example for them.

This is an extremely fundamental question if you know Analysis Services. Those who have no exposure to it and have not yet started on this subject may find it a bit difficult. I really liked his question, so I decided to answer him there as well as blog about it over here.

Answer: Here are the most important terms of multidimensional space – dimension, member, value, attribute and size.

Dimension – It describes the points of interest for analysis.

Member – It is one of the points of interest in the dimension.

Value – It uniquely describes the member.

Attribute – It is a collection of multiple members.

Size – It is the total extent of any dimension.

Let us understand this in further detail by taking the example of a space. I am going to take distance as the space in our example.

Dimension – Distance is a dimension for us.
Member – Kilometer – We can measure distance in kilometers.
Value – 4 – We can measure distance in the kilometer unit, and the value of the unit can be 4.
Attribute – Kilometer, Mile, Meter – The complete set of members is called an attribute.
Size – 100 KM – The maximum extent decided for the dimension is called the size.

The same concept can also be illustrated using time as the space. Here is the example using time.

Dimension – Time
Member – Date
Value – 25
Attribute – 1, 2, 3…31
Size – 31
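
To relate this back to something familiar, here is a loose relational analogy of the time example – a sketch only, not SSAS syntax, and the table and column names are made up:

  -- Loose relational analogy only (not SSAS syntax): the table stands in for the
  -- Time dimension, the rows are the members of the Date attribute, 25 is one
  -- value, and the row count corresponds to the size.
  CREATE TABLE dbo.TimeDimension (DateOfMonth INT);

  INSERT INTO dbo.TimeDimension (DateOfMonth)
  VALUES (1),(2),(3),(4),(5),(6),(7),(8),(9),(10),(11),(12),(13),(14),(15),(16),
         (17),(18),(19),(20),(21),(22),(23),(24),(25),(26),(27),(28),(29),(30),(31);

  SELECT COUNT(*) AS [Size] FROM dbo.TimeDimension;   -- 31

  SELECT DateOfMonth AS [Value]
  FROM   dbo.TimeDimension
  WHERE  DateOfMonth = 25;                            -- one specific value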

I hope this makes the various terms of multidimensional space clear enough.

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – BI Quiz – Troubleshooting Cube Performance

My friend Jacob Sebastian runs the SQL BI Quiz competition, where there are 30 different questions, one for each day of the month. Participants get the opportunity to learn something new, and winners receive great awards.

Working with huge amounts of data is very common in data warehousing. It is necessary to create cubes on the data to make it meaningful and consumable. There are cases when retrieving data from a cube takes a lot of time.

Let us assume that your cube has been returning data very quickly. Suddenly, one day it starts returning data very slowly. What are the three things you will do in order to diagnose this? After diagnosing it, what will you do to resolve the performance issue?

Participate in my question over here.

Reference: Pinal Dave (http://blog.SQLAuthority.com)