SQL SERVER – SSMS: Activity – All Blocking Transactions

Working out of India has its own challenges, and I enjoy it here despite the challenges thrown at me. One of the biggest advantages of working with Pluralsight is that I can still get my job done by working from home occasionally. This is one of the perks I wish more companies gave their employees. You might be wondering why I do this; the obvious answer depends on how the previous day went. If it rained heavily, which it does in Bengaluru in July, then chances are the roads will have a build-up of traffic the next morning. Taking traffic out of your life is never easy, but with technology improvements like maps on the phone, I still manage to find an alternate route to reach my destination. This is what makes life interesting, and the exploration of new places is always fun.

I just wish SQL Server had some way of achieving the same. Blocking and locking are fundamental to keeping databases in sync and consistent. This blog is all about the All Blocked Transactions report at the instance level.

To access the report, go to the Server node -> Reports -> Standard Reports -> Activity – All Blocked Transactions.

If there is no apparent blocking happening in the system at the point the report is run, we will be presented with a "blank" output as shown below.

The ideal situation is to stay in this state even for a transactional system, but that will never be the case in reality. For highly transactional systems that try to modify or insert data in the same table, SQL Server respects the order in which the requests arrived and will not allow incompatible locks to exist at the same time. This behaviour automatically creates a queue, and this is what we call blocking.

This brings us to the next output, where multiple transactions are running. To show some data in the report from my non-production-workload system, I have simulated a blocking scenario using two statements. In such a scenario there are two regions to look at: the Session IDs 52, 53 and 54. From the hierarchy, we know that 52 is blocking both 53 and 54. We can also see from the top row for SPID 52 that there are currently 2 "# Directly Blocked Transactions" in the system. If there are additional transactions trying to insert or delete, then this will show the complete chain of transactions currently blocked.

We also get to see the type of statement that is waiting in this blocking scenario. In the diagram below, the two statements involved are INSERT and DELETE.

The various DMVs used to get this information are sys.dm_tran_locks, sys.dm_tran_active_transactions, sys.dm_tran_session_transactions, sys.dm_tran_database_transactions and sys.dm_exec_requests. Along with the above, the report also uses the DMF sys.dm_exec_sql_text to convert the SQL handle into more meaningful text.
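
If you want to peek at the same information yourself, here is a minimal sketch (not the report's exact query) that lists blocked requests along with the session blocking them and the waiting statement:

-- List currently blocked requests, the session at the head of the block,
-- and the statement that is waiting. A simplified sketch, not the report query.
SELECT r.session_id          AS blocked_session_id,
       r.blocking_session_id AS blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text                AS blocked_statement
FROM   sys.dm_exec_requests r
       CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE  r.blocking_session_id <> 0;
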
If that is not enough, we can also head to the Activity Monitor and expand the Processes tab to get similar information. It is evident that the head of blocking is 52, whereas 53 and 54 are waiting on 52. It is completely up to us to decide what to do next. We can kill process 52 and the other transactions will go through.

As a small note, the Task States can give us vital information about what is happening in the system. Some of the states are worth mentioning:

Sleeping – The SPID is waiting for a command; nothing is currently executing.
Running – The SPID is currently running.
Suspended – The SPID is waiting for a lock or a latch.
Rollback – The connection is in the rollback state of a transaction.

You can use the state information to make an informed decision about killing a process if required.

At this point, yet another blog post worth a mention is the Blocked Process Threshold post. This option makes sure a Profiler event is raised when a request is blocked beyond a predefined period of time. Do take a look at that too if you are interested in that behaviour.
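
As a quick illustration, here is a hedged sketch of setting that threshold to 20 seconds; it is an advanced option, and on recent versions it appears as 'blocked process threshold (s)', so confirm the name on your build with sp_configure:

-- Raise a blocked process report when a request waits longer than 20 seconds.
-- A sketch only; verify the option name on your version.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'blocked process threshold (s)', 20;
RECONFIGURE;
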

The reports series is catching up, and the learnings are multi-fold for me personally. In subsequent posts I will get into the other reports and share my learnings.

Reference: Pinal Dave (http://blog.sqlauthority.com)


SQL SERVER – Activity Reports – Dormant Sessions

With school starting for my daughter, I rarely get involved in her daily routine. But there is one thing I try not to miss – the parent-teacher meeting. Sometimes it is not about the report card and how my daughter fared against the rest of the class; it is more than that. I am curious to understand how she behaves in class, how she makes friends, how her extra-curricular activities are going, whether she is disciplined overall, and so on. Those are the key attributes and traits I look to get as feedback from the teachers in that hour of interaction.

Along the same lines, there are tons of other parameters one needs to be aware of when working with SQL Server. A lot of times, DBAs doing maintenance or monitoring of servers need help with questions like: who is currently accessing the server, what are the inactive sessions, which connections are using the most resources, what are all the active sessions on the server, and more. This blog will answer these questions. Here are the reports we will talk about:

  1. Activity – All Cursors
  2. Activity – Top Cursors
  3. Activity – All Sessions
  4. Activity – Top Sessions
  5. Activity – Dormant Sessions
  6. Activity – Top Connections

Activity – All Cursors

This report shows information about the cursors used in SQL Server. Cursors are a looping construct in the T-SQL world. I have heard many times, from different sources, the best practice of avoiding T-SQL cursors. In my opinion, there can be situations where cursors out-perform other looping constructs. For example, a cursor is a good candidate for row-by-row processing that can't be performed by set-based operations. A cursor also gives us flexibility, because it provides a subset of data and allows manipulation of that data in different ways. Having said that, do perform your own performance tests before using one – these recommendations have to be taken with a pinch of salt rather than as written in stone.

The heart of this report is the DMV sys.dm_exec_cursors, which has a lot of information available about the cursors that are open in various databases. The report also uses the DMVs below.

sys.dm_exec_sessions – to get the login name
sys.dm_exec_sql_text – to get the text of the statement via sql_handle

To see some sample data in the report, we can run the query below:

DECLARE cur CURSOR FOR
    SELECT name FROM sys.objects;
DECLARE @temp SYSNAME;

OPEN cur;
FETCH NEXT FROM cur INTO @temp;
WHILE @@FETCH_STATUS >= 0
BEGIN
    FETCH NEXT FROM cur INTO @temp;
    WAITFOR DELAY '00:00:01';
END
CLOSE cur;
DEALLOCATE cur;

All the values shown are explained in the documentation of sys.dm_exec_cursors.

Activity - Top Cursors

This report is the same as the earlier report; the only difference is that the cursors are categorized as below.

  1. Top 10 Oldest Cursors – shows the oldest cursors on the instance. (Order by creation_time)
  2. Top 10 Dormant Cursors – shows cursors sitting idle since the last query (open or fetch). (Order by dormant_duration)
  3. Top 10 IO Intensive Cursors – shows cursors that are consuming the most IO resources. (Order by reads + writes)
  4. Top 10 CPU Intensive Cursors – shows cursors that are consuming the most CPU resources. (Order by worker_time)

All four sections run exactly the same query against the DMV sys.dm_exec_cursors, with a different ORDER BY clause (mentioned in each definition above).
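
Here is a hedged sketch approximating that base query; the column list is mine, not the report's, and only the ORDER BY would change per section:

-- Approximation of the Top Cursors base query; only the ORDER BY differs per section.
SELECT TOP 10
       c.session_id,
       c.name             AS cursor_name,
       c.creation_time,
       c.dormant_duration,
       c.worker_time,
       c.reads,
       c.writes,
       t.text             AS cursor_statement
FROM   sys.dm_exec_cursors(0) c
       CROSS APPLY sys.dm_exec_sql_text(c.sql_handle) AS t
ORDER BY c.creation_time ASC;  -- e.g. the "Top 10 Oldest Cursors" section
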

Activity - All Sessions

As the name says, this report shows the details of all sessions, connections, requests and statements currently active on the server.

This report provides details on all active user sessions on the instance, organized by login. Since I have started two different logins, “SlowIO” and “sa”, the report shows two groups (highlighted). We can drill down into each group up to the statement level. Under the hood it uses the sys.dm_exec_sessions, sys.dm_exec_connections and sys.dm_exec_requests DMVs.

Activity - Top Sessions

Here is the base query used by the report:

SELECT TOP 10 s.session_id,
       s.login_time,
       s.host_name,
       s.program_name,
       s.cpu_time             AS cpu_time,
       s.memory_usage * 8     AS memory_usage,
       s.total_scheduled_time AS total_scheduled_time,
       s.total_elapsed_time   AS total_elapsed_time,
       s.last_request_end_time,
       s.reads,
       s.writes,
       COUNT(c.connection_id) AS conn_count
FROM   sys.dm_exec_sessions s
       LEFT OUTER JOIN sys.dm_exec_connections c
                    ON ( s.session_id = c.session_id )
       LEFT OUTER JOIN sys.dm_exec_requests r
                    ON ( r.session_id = c.session_id )
WHERE  ( s.is_user_process = 1 )
GROUP  BY s.session_id,
          s.login_time,
          s.host_name,
          s.cpu_time,
          s.memory_usage,
          s.total_scheduled_time,
          s.total_elapsed_time,
          s.last_request_end_time,
          s.reads,
          s.writes,
          s.program_name

Here are the various ORDER BY clauses added in each section. You can try them yourself as well.

  1. Top Oldest Sessions (order by s.login_time asc)
  2. Top CPU Consuming Sessions (order by s.cpu_time desc)
  3. Top Memory Consuming Sessions (order by s.memory_usage desc)
  4. Top Sessions By # Reads (order by s.reads desc)
  5. Top Sessions By # Writes (order by s.writes desc)

Activity – Dormant Sessions

This is an interesting report that shows dormant sessions in SQL Server. A dormant session is a session which connected earlier, ran some query and is now sitting idle. The report provides details on sessions that have been inactive for more than an hour. Behind the scenes, the report uses sys.dm_exec_sessions and applies the filter datediff(mi, last_request_end_time, @d1) >= 60 to get dormant sessions.
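
If you want to approximate that filter yourself, here is a hedged sketch (the column selection is mine; the report's actual query differs):

-- User sessions whose last request ended more than an hour ago.
DECLARE @d1 DATETIME = GETDATE();

SELECT s.session_id,
       s.login_name,
       s.host_name,
       s.program_name,
       s.last_request_end_time,
       DATEDIFF(mi, s.last_request_end_time, @d1) AS idle_minutes
FROM   sys.dm_exec_sessions s
WHERE  s.is_user_process = 1
       AND DATEDIFF(mi, s.last_request_end_time, @d1) >= 60
ORDER BY idle_minutes DESC;
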

As shown above, there are three sections in the report. In the first section (1), we can see the number of all sessions, the number of dormant sessions that have been idle for more than 1 hour, and the number of users with dormant sessions. The last number might differ from the number of sessions, because a single login might have more than one session open at a point in time. The second section (2) shows the Top 10 Dormant Sessions; all of the columns are self-explanatory. The third section (3) shows the top 10 dormant sessions by user name. This is useful on development servers, where we use the user name to find who is connected.

Activity – Top Connections

This is the last Activity report in the list. The earlier reports are based on sessions, whereas this report is based on connections. Since the report is similar, I will not explain it in much detail.

Here is the base query used by the report:

SELECT TOP 10 ( Row_number() OVER( ORDER BY c.connect_time) ) % 2 AS l1,
       CONVERT(CHAR(100), c.connection_id) AS connection_id,
       c.session_id,
       c.connect_time,
       c.num_reads,
       c.num_writes,
       c.last_read,
       c.last_write,
       c.client_net_address,
       c.client_tcp_port,
       ( SELECT COUNT(*)
         FROM   sys.dm_exec_requests r
         WHERE  ( r.connection_id = c.connection_id ) ) AS request_count,
       s.login_time,
       s.host_name,
       s.program_name,
       s.login_name,
       s.is_user_process
FROM   sys.dm_exec_connections c
       LEFT OUTER JOIN sys.dm_exec_sessions s
                    ON ( s.session_id = c.session_id )

There are three sections. They show similar information, but with different ORDER BY clauses.

  • 10 Oldest Connections - order by c.connect_time
  • Top Ten Connections By # Reads - order by c.num_reads desc
  • Top Ten Connections By # Writes - order by c.num_writes desc

Well, that was quite a few reports in one go today. I am sure you will play with them; do let me know if you find anything interesting or have used these reports in any interesting ways.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – SSMS: Scheduler Health Report

Have you ever been to a music concert? It is one of the most humbling experiences a music lover can have. The very thought of live music with hundreds of musicians on one stage gives me goosebumps. I have always been fascinated and wanted to experience this once in my lifetime, and I am sure that day is not far off. What strikes me most is the conductor who, standing alone with a small stick, orchestrates these fabulous musicians to give all of us a delight and an experience of a lifetime. This brings me to today's topic of discussion, the Scheduler Health report. In a way, the conductor inside SQL Server is our scheduler – it makes sure all the activities and all the parts get their share of time to execute. It looks like a dream job, but trust me, there is a lot of effort in understanding how each component works, just like a conductor really knows when to introduce a new instrument into the concert. Before I start explaining this report, it is important to go through the basics of the SQL scheduler, which will help in understanding the report.

SQL Server has a mini operating system which manages resources on its own; that is why you will hear the term SQLOS. By resources we mean the CPU, memory and IO available on the machine.

Whenever any request is received by SQL Server, it is assigned to a thread, and that thread is scheduled on a scheduler. (It might go to multiple schedulers in the case of parallelism, though.) Threads which are ready to run are scheduled and sent to the operating system for execution. Imagine a blocking situation where a blocked thread can't do anything until the resource is available. In such a situation, does it make sense to send this request out to the operating system for scheduling? Of course not! That is why this mini operating system does better scheduling, and SQL Server can scale up very well as the workload increases. Another advantage of the SQLOS layer is reduced context switching of threads in the operating system, because SQLOS only sends out threads which can do some meaningful work.

To summarize, SQLOS is a mini operating system within the sqlservr.exe process which takes care of managing CPU, memory, locks, IO and a lot more. In general terms, a scheduler is a logical CPU on the SQL Server side, and each scheduler is mapped to one logical processor exposed by the operating system. There are hidden and visible schedulers in SQL Server. They can be looked at via the DMV sys.dm_os_schedulers.
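
Here is a quick, hedged look at that DMV; the column selection is mine, not the report's:

-- Visible and hidden schedulers, with the counters discussed later in this post.
SELECT scheduler_id,
       cpu_id,
       status,                 -- e.g. VISIBLE ONLINE, HIDDEN ONLINE
       is_idle,
       current_tasks_count,
       runnable_tasks_count,
       current_workers_count,
       active_workers_count,
       work_queue_count,
       yield_count
FROM   sys.dm_os_schedulers;
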

To know more about our Scheduler, here is the place to get this report. Right Click on Server node, choose Reports > Standard Reports > Scheduler Health.

The complete report revolves around SQLOS. SQLOS does non-preemptive scheduling (also known as cooperative scheduling), which is different from the scheduling done by the operating system. The Windows operating system does preemptive scheduling, where a thread gets a fixed amount of time to run on the CPU. Once the time slice is completed, the thread is taken off the CPU and put into the queue for its next chance to run. This is a fair game because all threads get a chance to run. In SQLOS, on the other hand, a thread does its work and comes back to the scheduler on its own; no one takes it off the scheduler. This is called yielding. If a thread went out from the SQLOS layer and didn't come back, it is called a non-yielding situation. If all schedulers have the same non-yielding problem, you can imagine that SQL Server would go into a "hung" state. A scheduler can be in three states – Idle (when work_queue_count <> 0), Hung (when yield_count is not changing) or Active (when it is not in the other two states). Hence the first part of our report shows which state our scheduler is currently in. In our case, the scheduler is in the Idle state.

The second part of the report shows the details about workers (can also be accessed via the DMV sys.dm_os_workers), tasks (accessed via the DMV sys.dm_os_tasks) and the processes running under each scheduler. Let us understand these terms in a little detail, as it will help in understanding this section of the report better.

Task – represents the work that needs to be performed. It can also be called a unit of work that is scheduled by SQL Server. Examples of tasks are pre-login, login, query execution, logout and many more. A task can be in various states (PENDING, RUNNABLE, RUNNING, SUSPENDED, DONE or SPINLOOP). Please refer to the documentation for more details.

Worker – the threads that do the task given by the scheduler.
Request – the logical representation of a request made from the client application (or work done by system threads) to SQL Server. This request is assigned to a task that the scheduler hands off to a worker to process.
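
To see how these three pieces relate, here is a hedged sketch joining the DMVs mentioned above; it is not the report's query, and the column choices are mine:

-- Tasks on each visible scheduler, the worker executing each task,
-- and the request (if any) the task belongs to.
SELECT sc.scheduler_id,
       w.worker_address,
       t.task_state,
       t.session_id,
       r.command,
       r.status AS request_status
FROM   sys.dm_os_schedulers sc
       JOIN sys.dm_os_workers w
            ON w.scheduler_address = sc.scheduler_address
       LEFT JOIN sys.dm_os_tasks t
            ON t.worker_address = w.worker_address
       LEFT JOIN sys.dm_exec_requests r
            ON r.task_address = t.task_address
WHERE  sc.status = 'VISIBLE ONLINE';
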

Now that our fundamentals have been sorted, let us have a look at the second report section:

My machine currently has 8 logical processors, and hence we see Scheduler IDs 0 to 7. The other schedulers have a status of "HIDDEN ONLINE" in sys.dm_os_schedulers. Each scheduler has various workers associated with it; we can see that in the #Workers column of each scheduler row. Once we click on (+) for a scheduler, we can see details about each worker. Clicking further on (+) for each worker, we can see the work done by that worker.

I hope this blog has helped you understand the basic functionality of SQLOS and how the Scheduler Health report drills into the fine print.

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – SSMS: Schema Change History Report

The heat is picking up, and I am glad you are liking this series so far. This particular report is close to my heart and the one I recommend most. On my recent trip to Delhi and the user group there, I had the opportunity to meet a number of young DBAs who were starting their professional careers at various organizations. I always try to prod such groups with interesting questions to make them inquisitive about learning new concepts in SQL Server.

At this user-group session I wanted people to answer two simple questions:

  1. How can I know, who created/dropped/altered the database?
  2. How can I know, who created/dropped/altered the objects?

This caught the attention of the group, and I got various answers – DDL Triggers, Auditing, Error Logs, Extended Events and many more innovative responses which I will refrain from disclosing here because they were really funny. All these answers were correct in a way, and I had to counter them with yet another question to make them think.

Though your answers are correct in a way, what is the easiest / simplest way to find this without writing a single line of code? That twist turned the responses into something simpler, and one attendee responded – “why not use Profiler?”

This response stumped me totally, and I said, let me achieve the same with even fewer clicks. My idea was to show them the SQL Server Management Studio Schema Changes History report. It has interesting dimensions to examine, so let me take a moment to walk you through the same.

Where to start?

The report location can be found from Server node -> Right Click -> Reports -> Standard Reports -> “Schema Changes History”.

One important piece of information worth noting here is that the report fetches information from the default trace. We have talked about the default trace and how to enable it in our previous post on the “Configuration Changes History” report.
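
As a quick, hedged check of whether the default trace is running on your instance (and where its file lives), something like this works:

-- Is the default trace enabled? run_value = 1 means yes.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'default trace enabled';

-- Where is the default trace writing?
SELECT id, path, max_size, start_time
FROM   sys.traces
WHERE  is_default = 1;
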

If the default trace is not enabled, this report is smart enough to look at each database and find objects which were created or altered in the last 7 days. I was not aware of this until I disabled the default trace to see the error in the report. To my surprise, the report still came up, in a somewhat different format. Let us look at the output with both options.

With default trace enabled

The report when the default trace is enabled is shown below:

To catch the query that populates this report, I ran Profiler, and here is the basic query:

SELECT *
FROM   ::fn_trace_gettable(@base_tracefilename, DEFAULT)
WHERE  EventClass IN (46, 47, 164)
       AND EventSubclass = 0
       AND DatabaseID <> 2

To elaborate a bit, the EventClass values 46, 47 and 164 correspond to Object:Created, Object:Deleted and Object:Altered respectively (refer to sys.trace_events on MSDN for more info).
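
You can confirm the mapping yourself with a small check against the catalog view:

-- Map the trace event IDs used by the report to their names.
SELECT trace_event_id, name
FROM   sys.trace_events
WHERE  trace_event_id IN (46, 47, 164);
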

With default trace disabled

Now here is the twist: when the default trace is disabled, the query executed in each database is shown below for reference:

SELECT o.name AS OBJECT_NAME,
       o.type_desc,
       o.create_date,
       s.name AS schema_name
FROM   sys.all_objects o
       LEFT OUTER JOIN sys.schemas s
                    ON ( o.schema_id = s.schema_id )
WHERE  create_date > ( GETDATE() - 7 );

Below is the report generated by that query. It is evident from the “Note” section at the bottom of the report that our default trace is not enabled.

There are two sections in the report. They are based on a similar query to the one shown above, with the minor difference of using the “modify_date” column instead of “create_date”, as below.

SELECT o.name AS OBJECT_NAME,
       o.type_desc,
       o.create_date,
       s.name AS schema_name
FROM   sys.all_objects o
       LEFT OUTER JOIN sys.schemas s
                    ON ( o.schema_id = s.schema_id )
WHERE  modify_date > ( GETDATE() - 7 );

The disadvantage of disabling the default trace is that we would not be able to see any information if a database was dropped. I have generally seen this trace to be non-intrusive on most systems, but I would love to hear from you and learn if you have faced any problems with it.

Caveats with Schema Change History Report

One problem with the report is that if even one database is inaccessible, it gives an error and fails to report anything for the remaining databases. For illustration purposes, I put a database into the NORECOVERY state and refreshed the report to get the error below:

If you ever face such a situation, you can run the T-SQL query mentioned above manually against the database in question to view the changes.

Has anyone reading this post ever disabled the default trace? Have you used these reports in your environment? Let me know your learnings.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – SSMS: Server Dashboard Report

The term dashboard comes from the automobile world, where it refers to the instrumentation and controls for operating a vehicle. In IT parlance, a dashboard conveys the state or health of a system or organization. The analogy cannot be taken too far, because running a business is not the same as running a vehicle, but the idea of knowing the health of the system is bang-on.

In our previous post, I gave an introduction on where to find all the SQL Server Management Studio reports. Today we will look at the first and most basic report all of us should be working with. It is called the “Server Dashboard” report, found under the Server node, Standard Reports node.

This report gives an overview of various settings of a SQL Server instance. As the name suggests, it shows the health of the instance and has three sections to examine. We will investigate each of these sections next.

Configuration Details

This section is interesting because it can answer some interesting questions:

  • When was the SQL Server instance started / restarted?
  • What edition and version of the product are we running?
  • Whether the instance is a clustered instance or not.
  • The number of logical processors used by the SQL Server instance.

When someone asks me to performance tune a system, I generally tend to ask customers some of these basic questions. Lesser known to DBAs is that instead of querying or relying on their Excel sheets, this report can give them this information in an instant.
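
For reference, the same answers can be pulled with a quick query; this is a hedged sketch (it assumes SQL Server 2008 or later for sqlserver_start_time), not what the report itself runs:

-- Start time, edition/version, clustering and logical processor count.
SELECT SERVERPROPERTY('Edition')        AS edition,
       SERVERPROPERTY('ProductVersion') AS product_version,
       SERVERPROPERTY('IsClustered')    AS is_clustered,
       si.sqlserver_start_time,
       si.cpu_count                     AS logical_processors
FROM   sys.dm_os_sys_info si;
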

Most of the other values are self-explanatory. It should be noted that “Scheduler Agent Jobs” shows the number of jobs created, not what is running or enabled.

Non Default Configuration Options

Let us move to the next section. As an administrator, if you want to know the server-level settings that were changed from their defaults on a given instance, this is the section to note.

By default the values are not visible; we need to click on the (+) symbol to toggle them.

This part of the report shows the various non-default configuration settings made by executing the sp_configure stored procedure. You can read about the basics of enabling / disabling options with sp_configure in my previous blog posts.

I always look into this section to see if an administrator or someone else has limited the memory allocated to SQL Server on a given box. This shows up under the “Max server memory (MB)” entry as shown above.

This section also shows whether any trace flags are enabled on the server. This information is normally not available when you just play around with sp_configure.
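
Here is a hedged sketch of the two manual checks this section saves you, the configured memory cap and globally enabled trace flags:

-- Compare run_value with the default (2147483647 MB) to spot a memory cap.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';

-- List trace flags enabled globally on the instance.
DBCC TRACESTATUS(-1);
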

Activity Details

Next comes the busier section, Activity Details. It has a lot of interesting information, including a visual graph showing the CPU and IO activity on the instance. Most importantly, there is a table at the top which is worth a note. It shows:

  1. The number of active sessions and transactions in the system at the moment.
  2. Active databases available in the instance.
  3. Total server memory in the system.
  4. Logins connected to the server.
  5. The number of blocked transactions in the system.
  6. Finally, worth noting, the server-side traces running in the system.

Though the information is available in a nice report, the same data can be obtained via various DMVs in SQL Server. I am outlining some of these for reference.

Title – DMV and filter
Active Sessions – sys.dm_exec_sessions where is_user_process = 1 and status = 'running'
Active Transactions – sys.dm_tran_active_transactions where transaction_state = 2 or (transaction_type = 4 and dtc_state = 1)
Total Server Memory – sys.dm_os_performance_counters where counter_name like '%Total Server Memory (KB)%'
Idle Sessions – sys.dm_exec_sessions where is_user_process = 1 and status = 'sleeping'
Blocked Transactions – sys.dm_tran_locks where request_status = 'WAIT'
Distinct Connected Logins on Sessions – sys.dm_exec_sessions where is_user_process = 1
Traces Running – sys.traces where status = 1
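
As a hedged example, here are two of those rows turned into runnable queries:

-- Active sessions on the instance.
SELECT COUNT(*) AS active_sessions
FROM   sys.dm_exec_sessions
WHERE  is_user_process = 1
       AND status = 'running';

-- Blocked transactions (lock requests currently waiting).
SELECT COUNT(*) AS blocked_transactions
FROM   sys.dm_tran_locks
WHERE  request_status = 'WAIT';
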

The queries used to generate the graph are shown below. They use historical data from sys.dm_exec_query_stats to build the graph for IO and CPU.

Query 1:
SELECT total_worker_time,
       CASE WHEN DB_NAME(dbid) IS NULL THEN 'Adhoc Queries'
            ELSE DB_NAME(dbid)
       END AS DB_NAME,
       dbid
FROM   sys.dm_exec_query_stats s1
       CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS s2

Query 2:
SELECT total_logical_reads + total_logical_writes AS total_io,
       CASE WHEN DB_NAME(dbid) IS NULL THEN 'Adhoc Queries'
            ELSE DB_NAME(dbid)
       END AS DB_NAME
FROM   sys.dm_exec_query_stats s1
       CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS s2

Though the reports are a great way to see this data visually, it is nice to see the power of DMVs and how they can be used with SQL Server. All of this information is available out of the box, and SQL Server Management Studio's dashboards simply present it nicely.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Introduction to SQL Server Management Studio Reports

I get a ton of questions over email, and one of the most common themes is performance tuning and troubleshooting. To add to this, even my chat conversations mostly revolve around some performance problem with SQL Server. Let me list some of these common themes or questions:

  1. How do I improve performance?
  2. How do I identify problematic queries?
  3. What is consuming the space on my hard disk?
  4. How do I find CPU-hogging queries?
  5. Who dropped my tables / database?
  6. Who is consuming the memory on this SQL Server?
  7. When did backups / restores happen on this database?
  8. I can see a lot of IO activity on my production server with 50 databases; I don't know what is causing the IO right now.
  9. What are the index usage statistics?

And the list keeps going on and on. For most of these questions there are my Pluralsight courses on performance that you can watch, and there are tons of scripts on this blog itself that can help you answer a number of them. I will refrain from linking all of these in this one blog; instead, we will look at a different set of handy utilities available with SQL Server Management Studio out of the box.

CxOs these days always want to look at the health of their systems in new and innovative ways via dashboards and reports. It is a bit like driving a vehicle: you know the health of the vehicle by watching its dashboard for speed, fuel, rpm or other warning indicators. To run a business, executives rely on these reports from their teams. In a similar way, a lot of DBAs just wish they were empowered with such capabilities for their day-to-day lives too. Lesser known to them is that since SQL Server 2005, such reports have been available out of the box, right at their fingertips.

Reports: Where to find them?

There are a number of reports available with SQL Server Management Studio and they can be found at two locations:

  1. On Server Node -> Right Click -> Select Reports -> Standard Reports
  2. On a specific Database -> Right Click -> Select Reports -> Standard Reports

In this example, we will be using a SQL Server 2014 instance. Under the Server node we can see close to 23 different reports.

Under the Database node, we can find close to 18 different reports to work with.

Administrators Reports Center

Now that we know these reports exist in SSMS, it is worthwhile to explore some of them. This series will explore these hidden gems and how one can use them effectively in their day-to-day lives.

All the reports have some use and help administrators answer basic questions like the ones asked before. The simplest I can think of today revolve around the database:

  1. What is my database disk usage?
  2. What is the disk usage by tables and partitions?
  3. Which transactions are blocking, and much more.

Recently a developer caught me at the local SQL Server Bangalore UG and asked if these reports can be modified. Interestingly, they are not stored as plain vanilla .rdlc files but in an internal format. Having said that, it is always possible to create our own custom .rdlc files and use them with SSMS. In this blog series, we will look at the functionality of the majority of the reports that a DBA must be aware of, and how we can maximize our time by being productive with these reports. Hope you will enjoy this journey with me.

PS: If you have been using these reports in your environment, can you let me know the reports that you use most often and why?

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL Reporting – How to Add Excel and Word Reporting to Your Own Application – An Innovative Approach

To experience the awesomeness of OfficeWriter, I suggest you download the tool here and work along with this blog post.

The OfficeWriter API is a .NET library by SoftArtisans that makes it easy for developers to add Excel and Word reporting and document processing to their own applications. OfficeWriter allows users to take data from any data source and turn reports into dynamic, visual presentations. Without requiring Microsoft Office on the server, OfficeWriter is optimized for high-performance, scalable server use. The .NET API integrates with business applications, including those in SSRS and SharePoint.

Why use OfficeWriter's API with SQL Server Reporting Services (SSRS)? The default rendering extensions for Reporting Services deliver flat, static output. Because the same report design must render to multiple extensions such as HTML and PDF, the basic report design in SSRS cannot accommodate application-specific features, such as multiple worksheets in Excel. It also means that certain features like charts cannot be dynamic in the output, because not all of the rendering extensions in SSRS support Excel charts; charts are therefore exported as images. Additionally, until SSRS 2008 R2 there wasn't a default rendering extension for Word, and before SSRS 2012 there was no means of exporting to XLSX or DOCX. Using OfficeWriter for Reporting Services addresses these limitations, allowing end-users to design reports in Excel and Word and make use of many application-specific features.

How Creating a Report in SSRS with OfficeWriter Works:

OfficeWriter fits into SSRS with two parts. First, the OfficeWriter Renderer is a server-side rendering extension for Excel and Word, providing Excel and Word features beyond the built-in SSRS export options. Second, the OfficeWriter Designer is a client-side add-in for Excel and Word, which allows users to design reports that utilize the OfficeWriter rendering extensions.

Designing a Report

The RDL file contains information about connecting to data sources, queries, as well as report design information. The OfficeWriter Designer adds its own data and design information to the report, which will be used by the OfficeWriter Renderer to export the report. The Designer sections off a part of the RDL for storing this information separately from the rest of the RDL file. In particular, the Designer saves a binary copy of the Excel or Word template file to the RDL and information about binding the report data to that Excel or Word template file.

Exporting a Report

When the report is exported using the Renderer, Reporting Services retrieves the data from the queries and data source locations stored in the RDL. It passes this data along to whichever renderer was selected for exporting the report. The OfficeWriter Renderer loads the copy of the Excel or Word template file and binds the data from Reporting Services to the template. The Renderer is actually using ExcelTemplate or WordTemplate to bind the data to produce the output.

In the new release of OfficeWriter, the new .NET Designer was introduced. The .NET Designer has better platform support (i.e. 64-bit Office) than the previous designer, moving from dependence on COM and VBA to being written entirely in .NET/C# using VSTO.

At the start of 2014 SoftArtisans also plans to release a major version upgrade; OfficeWriter version 9 is set to be released early next year with PowerPoint capabilities and a calculation engine for ExcelWriter, among other new features to provide more options for building reports through your own business applications.

The Big Difference

With the OfficeWriter renderer, on the other hand, the charts will be native Excel charts with live data. Let's see the difference in the following screenshot: “Product Sales Report.xlsx” generated using the OfficeWriter renderer in SSRS. Click here to download a free trial.

To learn more about how OfficeWriter's Excel export functionality compares to that of SSRS, visit here.

Reference: Pinal Dave (http://blog.sqlauthority.com)