SQL SERVER – Differences in Vulnerability between Oracle and SQL Server

In the IT world (though not among experienced DBAs), there has been a long-standing myth that the Oracle database platform is more stable and more secure than Microsoft's SQL Server. There are a variety of reasons for this; in my opinion, the main ones are listed below:

A. Microsoft development platforms are generally more error-prone and full of bugs.

This (unfairly) projects the weaknesses of earlier versions of Windows onto its other products such as SQL Server, which is a very stable and secure platform in its own right.

B. Oracle has been around for longer than SQL Server and must therefore be more stable and secure.

Well, this does not count for anything. Being around longer does not mean that you are necessarily wiser. Need more proof? – look at General Motors.

Let us look at the comparisons between Oracle’s DB platform and SQL Server:

Number of reported vulnerabilities per product

In my opinion, this is the most basic test for stability and security – the number of errors and bugs reported for a product is a rough inverse measure of its security and stability. Note that these numbers are usually compiled by independent information-security companies, so there is no question of "hiding the numbers."

In this regard, Oracle fares poorly as compared with SQL Server. Oracle Corporation releases an amazingly large number of patches and Critical Patch Updates (CPUs) for its DB platform. To be fair, following are some of the arguments that support Oracle DB (together with answers for those same arguments):

Oracle runs on several platforms, while SQL Server only runs on Windows

Answer: No, the patches and bugs reported are almost all cross-platform, which implies that they are OS-independent.

Oracle DB also includes several other components, so we are not comparing like with like

Answer: Here, I considered only the database server components. This implies that any problem arising from components such as the Intelligent Agent or the Oracle Application Server has not been included.

Let us compare the November 2009 vulnerability reports of both Oracle 11g [1] and SQL Server 2008 [2].

Product            Advisories   Vulnerabilities
SQL Server 2008    0            0
Oracle 11g         7            239

This holds not only for the latest DB platforms, Oracle 11g and SQL Server 2008. Taking a historical perspective, Microsoft patched 59 vulnerabilities in its SQL Server 7, 2000 and 2005 databases over the past 6 years, while over the same period Oracle issued 233 patches for software flaws in its Oracle 8, 9 and 10g databases. Moreover, in 2006, Microsoft SQL Server 2000 with Service Pack 4 was ranked as the most secure database on the market together with the PostgreSQL open-source project. Oracle 10g was placed at the very bottom of the same list.

DBAs are wary and tired of patching the Oracle DB

A survey conducted in January 2008 [3] showed that two-thirds of Oracle DBAs do not apply security patches. The underlying cause is that Oracle Corporation releases a huge number of patches and fixes for various bugs, and tracking, testing and installing several patch releases every year involves a great deal of effort and fatigue. In 2009 alone, Oracle released 33 patches for its DB.

However, I am not at all suggesting that Oracle DBAs are lazy or do not take database security seriously. The main reason why many DBAs are wary of patching Oracle databases is the complexity involved. First, note that patch testing, including CPU testing, is a long and intensive process. Because of the large number of bug fixes and CPUs released by Oracle, many application vendors whose products run on an Oracle DB simply do not have the time to test a patch – or, as soon as they do, another one is released. This, in turn, implies that if their clients risk installing unapproved patches, the vendor can rightfully refuse to support them should that patch cause an error in the application.

Slavik Markovich, the Chief Technology Officer of database security vendor Sentrigo Inc., said at a conference: "To apply the CPU, you need to change the binaries of the database. You change the database behavior in some ways that may affect application performance. So applying security patches to a database typically involves testing them against the applications that feed off the database. This is a very long and very hard process to do, especially if you are in enterprises with a large number of databases and applications. Applying these patches means months of labor and sometimes significant downtime, both of which most companies can't afford."

Microsoft has a working system of patch testing and rollout, whereas Oracle does not have such a system

Trustworthy Computing is a Microsoft initiative whose tooling (such as Microsoft Update) proactively identifies missing patches and allows you to install them. When Microsoft launched this initiative, many people did not take it seriously, but it has since proven to be a lifesaver for many busy DBAs and system administrators who simply do not have the time to worry about installing patches. Oracle does NOT have an equivalent.

Oracle also does not make life easier for companies that want to keep their databases secure: downloading and installing patches is complex. With SQL Server, you can schedule automatic installation of updates and patches. Moreover, if an update causes an undesired effect on your application, you can simply uninstall it, leaving the database as it was prior to the update – somewhat similar to the System Restore feature of Windows. With Oracle DB, both the installation and removal of patches are complex operations that are not easy to do and undo, except for a seasoned DBA.
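
As a practical aside, you can verify the current patch level of a SQL Server instance before and after applying an update. Here is a minimal sketch using the built-in SERVERPROPERTY function (the build number shown in the comment is only an illustrative example):

  -- Check the build, service-pack level and edition of the current instance.
  -- Run this before and after patching to confirm that the update took effect.
  SELECT  SERVERPROPERTY('ProductVersion') AS ProductVersion,  -- e.g. 10.0.2531.0 (SQL Server 2008 SP1)
          SERVERPROPERTY('ProductLevel')   AS ProductLevel,    -- e.g. RTM, SP1
          SERVERPROPERTY('Edition')        AS Edition;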

However, the single most crucial factor in Microsoft's DB-security-management success is its Security Development Lifecycle (SDL). The use of the SDL [4] means that knowledge gained from resolving problems is never lost; instead, it is ploughed back into the cycle. Therefore, instead of repeating the same mistakes every time, you can at least ensure that the new code is more secure than the old code, even if it is not completely secure. For instance, the mistakes that were made and resolved while developing SQL Server 2005 were not repeated during the development of SQL Server 2008. By contrast, one issue that bothers developers and DBAs who use Oracle DB is that they come across the same mistakes in every version they use. Often, when one problem is resolved, the fix is not problem-free and in turn a new error or problem is created; overall, there is no consistent and reliable process for correcting bugs. In fact, database consultant Karel Miko estimates that Oracle Corp. is about 5 years behind Microsoft in patch management.

Summary

I hope this article helps to debunk the myth that SQL Server is a less stable and less reliable platform than Oracle DB. As many researchers and security consultancy firms worldwide have pointed out, SQL Server is consistently more secure and much less prone to errors and bugs than Oracle DB.

Sources:

[1] http://secunia.com/advisories/product/18050/?task=statistics_2009

[2] http://secunia.com/advisories/product/21744/?task=statistics_2009

[3] http://www.computerworld.com/s/article/9057226/Update_Two_thirds_of_Oracle_DBAs_don_t_apply_security_patches

[4] http://www.infoworld.com/d/security-central/database-expert-oracle-trails-microsoft-patch-management-166

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – Future of Business Intelligence

Business Intelligence (BI) is slated to play a bigger role in all kinds of businesses in the coming years. This is not surprising, as the need for data analysis and smarter decision making has made the use of BI inevitable in businesses of all sizes across all sectors – real estate, IT, mobile devices, governmental agencies, scientific and engineering communities and R&D labs, banking and insurance, to name a few. BI can effectively deal with industry-specific constraints, operations and objectives, thereby helping organizations to better understand their customers, optimize their operations, minimize risk, manage revenue, and ultimately improve their results. Moreover, the changing economic environment, marked by shrinking budgets, is making way for the advancement of successful BI initiatives.

Business Intelligence Forecasting – Predicting Your Way To Success

Today, technology has created one solid platform for the world market, eradicating geographical boundaries. The escalating number of consumers with wide-ranging demands and expectations is making it extremely difficult to conduct business, so meeting customers' demands has become a top priority for all organizations seeking competitive advantage. By implementing Business Intelligence solutions, companies can stay ahead in the race, keep pace with market trends and, more importantly, meet customers' expectations. BI forecasting helps them prepare business strategies with future events in mind by analyzing the available past data.

Predictive analytics is the branch of data mining that analyzes current and historical data to make predictions about future events. It can help companies optimize their existing processes, better understand customer behavior, spot unexpected opportunities, and anticipate problems before they occur. More and more organizations are realizing the benefits of using data to align their current actions with their future objectives. By incorporating predictive analytics into their daily business operations, they can have better control over the decisions they make every day, which in turn helps them maximize their ROI.

Business Intelligence – Paving the Way For Information Democracy

The number of BI users has been growing steadily beyond the traditional IT "power user" – from dozens, to hundreds, and in some cases now thousands of users. Soon this number will be in the millions. Earlier, BI was restricted to statisticians and corporate analysts, and only these select few could access it. But today, BI is spreading to nearly every stratum of the organization, as companies attempt to provide critical data to the business users who need it to perform their jobs.

Increasingly, companies are realizing the importance of giving users access to timely and relevant insight, which helps in managing performance and aligning employees with the mission of the organization. BI brings information democracy by providing everyone the insights they require and delivering information to the right people at the right time across the enterprise. More and more organizations are expected to empower their employees with BI for productivity and operational gains.

Software as a Service Business Intelligence – Broadening Usage of BI

A rising number of BI vendors are providing BI software as a service (SaaS), also called on-demand business intelligence. SaaS has been steadily gaining popularity among small and mid-sized companies. It offers companies a practical option to deploy applications that can provide significant business value. With BI SaaS, companies are not required to make major upfront investments in a hardware server or in licenses for the BI software, operating system, web servers and the like. Instead, the software provider hosts them over the Internet for a fee, billed on a monthly, quarterly or yearly basis.

The high implementation cost of an end-to-end BI solution was an important factor that discouraged small and mid-sized businesses from adopting BI. With on-demand BI, companies of all sizes can enjoy the benefits of BI and enhance their business growth. Apart from being cost-effective, it provides several other benefits, such as shorter implementation cycles and no need to maintain IT support staff.

Real-time Business Intelligence – Instant Information For Success

Over the last few years BI has been gradually growing in importance, and in the future organizations will increasingly depend on real-time information about their business for smarter decision making. Real-time business intelligence can be defined as the process of delivering information on business operations without latency; real-time BI disseminates information about a business from milliseconds to a few seconds after the business event. While traditional business intelligence gives users only historical information, real-time business intelligence compares present business events with historical events, which helps in identifying a range of issues and allows them to be resolved in time. Moreover, the primary aim of real-time BI is to enable corrective actions to be initiated and business rules to be tuned to optimize business processes.

With consumers demanding faster and improved services from businesses, it has become imperative for companies to pay even closer attention to consumer satisfaction. Companies now demand near-real-time analysis to make intelligent decisions for business success. The rising demand for near-real-time data analysis is making way for a new framework for business intelligence. Business Intelligence 2.0 is the recently coined term to describe the acquisition, provision and analysis of real-time data, something earlier BI lacked. BI 2.0 is part of the constantly developing business intelligence industry and represents the next step for BI.

BI has truly empowered every businessperson, who can now easily use BI to make better decisions on their own, without relying on IT or power analysts to prepare and interpret results for them. In a couple of years BI applications will become as commonplace as spreadsheet applications within midsize and larger organizations. Organizations making effective use of BI technologies will pull and stay far ahead of their competitors. BI is expected to soon replace gut-feel management with real data-based decision making. Over the coming years, business intelligence will undergo a transformation that will have a broad and lasting impact. It will revolutionize the way we think about business and the way business decisions are made. Only when thoughtful analysis supersedes gut feeling and conventional perception will we enter the next level of business intelligence – one that empowers businesses with the capacity to reason, prepare, forecast, resolve issues and innovate.

Reference: Pinal Dave (http://blog.sqlauthority.com), DNS

SQL SERVER – Business Intelligence – Aligning Business Metrics

Today, executive management and managers need the latest information to drive intelligent decisions for business success. More informed decisions mean more revenue, less risk, decreased cost, and improved operational control for business agility and competitiveness. Moreover, in today's fast-paced, technology-driven business world, organizations continually struggle with growing data volumes and complexity as they try to use their own data efficiently. Constrained by competitive environments and data complexity, COOs, IT managers and business consultants are asking for more relevant information, delivered more easily, for smarter, faster decision-making. They want information that is highly visual, up-to-date, personalized and secure, and they want it delivered in line with where and how they work.

By leveraging the power of Business Intelligence (BI), organizations can understand and analyze large volumes of rapidly changing data for effective decision-making. BI helps decision makers harness change to create competitive advantages, achieve corporate objectives, and make better decisions, faster.

Business Intelligence, to put it simply, is all about improving decision making within an organization. By presenting the latest information to the right people at the right time, both the quality of decisions and their timeliness can be improved. These days, organizations are embedding BI into business processes to create better workflows, apart from gaining the other benefits of BI. In fact, BI can make all areas of a business strategic and value-rich.


Proper BI deployment – Aligning all business metrics

Business Intelligence adds value to all departments or business units within an organization. In general, revenue-generating areas of the business, such as sales and product manufacturing, are considered critical to a company's success, while other areas like HR and operations are seen as a requisite for doing business but not crucial for revenue generation or improving business performance. So naturally, when it comes to adding BI value to a business, only the revenue-generating departments and areas spring to mind first. With appropriate metrics and data, however, this thought process needs to change.

Traditionally, Business Intelligence focused primarily on siloed information: functional areas like Finance, Sales, Marketing and HR created their own metrics. To provide value to an organization from top to bottom, metrics must align across business units and do away with silos. Undoubtedly, there is still value in maintaining HR-, Finance- and Sales-specific metrics; however, if the metrics of all areas of an organization are aligned with corporate metrics and objectives, they can prove to be a potent tool for improving both the top and bottom lines. From line leaders to middle managers to executives, all must have metrics and dashboards that are congruent and support one another.

Nowadays, businesses are putting in a lot of effort to come up with smart, intelligent decisions that will help them run a strategic operation and gain competitive advantage. One of the key aspects of proper Business Intelligence deployment is the alignment of metrics from top to bottom across functional areas, which can go a long way in ensuring business success.


Scenario: Business Intelligence in real world

Let us take a real world scenario and try to understand how proper alignment of business metrics makes all areas of a business more strategic and value-rich.

In our example, let's take a car manufacturing company, Speed Motor Manufacturing. The company's top-priority corporate objective is to increase its profit margin by 5%. To attain this target, every area of the business must be considered for its potential impact on the bottom line, including the HR department. In general, the HR department is considered important by all organizations; however, it is not considered strategic or a contributor to the bottom line.

Now, let's assume that the operations business unit of Speed Motor Manufacturing has determined that the best way it can contribute to the overall margin objective is to increase by 20% its use of a new type of machine that speeds up the assembly of car components, which will in turn accelerate the manufacturing process so that more cars are produced per day. However, this complex machine cannot be operated by just anyone; it requires specialized training and more experienced operators. So, to meet the objective, usage of the new machine must increase by 20% for the year.

Here, the HR department can play a major role. The HR department ties its goals and metrics to helping operations meet their goals, thus directly contributing to the corporate objective of increasing the profit margin by 5%. To meet the objective, the HR department must retain the people with the skills required to operate these complex machines, train individuals to operate the new machines, and even hire new staff who can operate them. HR's metrics map specifically to the operations team's metrics, which map directly to the overall corporate metrics.

In our example, we saw how metrics are aligned top to bottom and across the organization.

Having said that, in order to successfully measure these metrics and create dashboards to track progress and achievement, the data and the data structure must be available and appropriate. If the data structure is present and the objectives are properly aligned, all aspects of an organization can be strategic and contribute to enhancing business performance for assured success. The HR business unit measures the training it conducted, which increased the number of qualified operators of the new machinery. The operations leader measures the number of hours the new machine was operated compared to the previous year. The finance team measures the profitability of operations in the current year compared to last year and quantifies the operations team's increase in margin. Thus, we see how all the departments work together towards a common objective, set up metrics to monitor and measure their success, and link them accordingly.

In a nutshell, proper deployment of Business Intelligence makes way for perfect objective alignment and improved vision throughout the organization for business success and competitive advantage.

Reference: Pinal Dave (http://blog.SQLAuthority.com), DNS

SQL Server – White Paper – An Introduction to Fast Track Data Warehouse Architectures by Erik Veerman

An Introduction to Fast Track Data Warehouse Architectures
SQL Server Technical Article
Writer: Erik Veerman, Solid Quality Mentors
Technical Reviewers: Mark Theissen, Scotty Moran, Val Fontama
Published: February 2009

The performance and stability of any application solution—whether line of business, transactional, or business intelligence (BI)—hinges on the integration between solution design and hardware platform. Choosing the appropriate solution architecture—especially for BI solutions—requires balancing the application’s intended purpose and expected use with the hardware platform’s components. Poor planning, bad design, and misconfigured or improperly sized hardware often lead to ongoing, unnecessary spending and, even worse, unsuccessful projects.

The ultimate goal of the Fast Track reference configurations is to take the guesswork out of hardware and architectural decisions for the database layer of Microsoft SQL Server-based BI solutions. To help you get started with the Fast Track reference architectures, understand their approach and value, and use them to implement the best possible solution, this paper includes:

  • An overview of the new Fast Track reference architectures
  • A review of BI fundamentals and applicable hardware considerations
  • The tested Fast Track reference architecture components and options
  • Prescriptive guidance for designing and optimizing a solution
  • Resources available to help you choose or create a new hardware configuration


The Fast Track Data Warehouse reference configurations focus on the central database component of a BI solution, commonly called the data warehouse or data mart. These Fast Track reference configurations target solutions on a single server estimated at up to 32TB of data.

Read An Introduction to Fast Track Data Warehouse Architectures

Abstract courtesy: Microsoft

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQL SERVER – 2008 Star Join Query Optimization

Business Intelligence (BI) plays a significant role in businesses nowadays, and the databases that handle BI-related queries are facing increasing workloads. At present, queries sent to very large databases may return millions of rows, and users endure extended query response times when such queries involve joining multiple tables. Star join query optimization is a new feature of SQL Server 2008 Enterprise Edition. This mechanism uses bitmap filtering to improve the performance of some types of queries by retrieving rows from fact tables more efficiently.

Improved Query Response Times
In general, data warehouses employ dimensionally modeled star or snowflake schemas. These schemas have one or more fact tables that contain transactional data, and many dimension tables that hold information such as product data, customer information, and times and dates – all of which describe the fact table data. Usually, foreign keys maintain the relationships between rows in fact tables and rows in the dimension tables. SQL Server 2008 Enterprise recognizes databases that contain star schemas and uses the new star join query logic to process queries against such schemas more efficiently. Typically, data warehouse queries run approximately 20 percent faster on average.
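
To make the shape of such a query concrete, here is a minimal sketch against a hypothetical star schema (the FactSales, DimProduct and DimDate tables and their columns are illustrative, not taken from any particular sample database):

  -- A typical medium-selectivity star-join query: the fact table joins to the
  -- dimension tables on integer surrogate keys, and the filters are placed on
  -- dimension attributes rather than on the fact table itself.
  SELECT  d.CalendarYear,
          p.Category,
          SUM(f.SalesAmount) AS TotalSales
  FROM    dbo.FactSales  AS f
  JOIN    dbo.DimProduct AS p ON f.ProductKey = p.ProductKey
  JOIN    dbo.DimDate    AS d ON f.DateKey    = d.DateKey
  WHERE   p.Category IN (N'Bikes', N'Accessories')
    AND   d.CalendarYear BETWEEN 2007 AND 2008
  GROUP BY d.CalendarYear, p.Category;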

Automatically Implemented
Star join query optimization is implemented automatically by SQL Server; it does not require any special database or application configuration. The query processor usually applies the optimization to queries with medium selectivity (queries that retrieve approximately 10% to 75% of the rows in a fact table). Such queries are typically handled with hash joins, which join the dimension and fact tables using the foreign keys to identify matching rows. For hash joins, a hash table is built for each dimension table referenced in the query; the optimization process derives bitmap filters from these hash tables. Each bitmap filter identifies the key values from one dimension table that qualify for inclusion in the query. The bitmap filters are applied to the fact table as it is scanned, eliminating the fact-table rows that do not qualify for the result set. The most selective bitmap filter is applied first, as it eliminates the largest number of rows; since the eliminated rows need no further processing, subsequent filters are not applied to them, making the process more efficient.
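
If you want to see whether the optimizer has injected bitmap filters for a given query, you can inspect the query plan; in SQL Server 2008 plans, optimized star-join bitmaps typically appear with names prefixed Opt_Bitmap. A minimal sketch, reusing the hypothetical tables from the example above:

  -- Capture the estimated plan as XML (SHOWPLAN_XML must be the only statement
  -- in its batch) and search it for Bitmap operators such as Opt_Bitmap1005.
  SET SHOWPLAN_XML ON;
  GO
  SELECT  d.CalendarYear, SUM(f.SalesAmount) AS TotalSales
  FROM    dbo.FactSales AS f
  JOIN    dbo.DimDate   AS d ON f.DateKey = d.DateKey
  WHERE   d.CalendarYear = 2008
  GROUP BY d.CalendarYear;
  GO
  SET SHOWPLAN_XML OFF;
  GO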

Query Selectivity
With bitmap filtering, performance is enhanced for medium-selectivity queries because rows are filtered before any joins are executed, decreasing the number of rows processed by each join. Bitmap filtering is not applied when queries are highly selective (returning fewer than 10% of the rows in a fact table); in such cases a nested loop join is generally more efficient. Similarly, when queries are not very selective at all (returning more than 75% of the rows in a fact table), bitmap filtering is not applied, because there are very few rows to filter out and hence no performance gain to be had.

Integer Data Types
Star join optimization is most efficient when the columns used in the joins are of integer data types, because this enables the bitmap filter to be applied as part of the initial table or index scan rather than at a later stage in the query plan. Most queries benefit from star join optimization, since foreign key relationships are commonly implemented using integer-based key columns.

Reference: Pinal Dave (http://blog.SQLAuthority.com)

SQLAuthority News – SQL Server 2008 Analysis Services Performance Guide

Because Microsoft SQL Server Analysis Services query and processing performance tuning is a fairly broad subject, this white paper organizes performance tuning techniques into the following three segments.

Enhancing Query Performance – Query performance directly impacts the quality of the end user experience. As such, it is the primary benchmark used to evaluate the success of an online analytical processing (OLAP) implementation. Analysis Services provides a variety of mechanisms to accelerate query performance, including aggregations, caching, and indexed data retrieval. In addition, you can improve query performance by optimizing the design of your dimension attributes, cubes, and Multidimensional Expressions (MDX) queries.

Enhancing Processing Performance – Processing is the operation that refreshes data in an Analysis Services database. The faster the processing performance, the sooner users can access refreshed data. Analysis Services provides a variety of mechanisms that you can use to influence processing performance, including efficient dimension design, effective aggregations, partitions, and an economical processing strategy (for example, incremental vs. full refresh vs. proactive caching).

Tuning Server Resources – There are several engine settings that can be tuned that affect both querying and processing performance.

Download SQL Server 2008 Analysis Services Performance Guide

Abstract courtesy: Microsoft

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQLAuthority News – Best Practices for Integration Services Configurations

Best Practices for Integration Services Configurations
by Jamie Thomson

This article explains what SQL Server Integration Services configurations are used for, why you should use Integration Services configurations, and what options you have for leveraging configurations. It will also make some simple recommendations that are based on my experiences of building Integration Services packages in a real-world environment. An understanding of the terms “package”, “Business Intelligence Development Studio”, and “dtexec.exe” in the context of Integration Services is assumed.

There are five basic types of Integration Services configurations:

  • XML Configuration File
  • Environment Variable Configuration
  • Parent Package Configuration
  • Registry Configuration
  • SQL Server Configuration
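
To illustrate the last option, the SQL Server configuration type keeps configuration entries in a table. When you let the package configuration wizard generate it, the table looks roughly like the sketch below (the table name and columns follow the wizard's defaults as I have seen them, and the sample entry is purely illustrative; verify against your own environment):

  -- Default shape of the table that the package configuration wizard creates
  -- for the "SQL Server Configuration" type.
  CREATE TABLE dbo.[SSIS Configurations]
  (
      ConfigurationFilter NVARCHAR(255) NOT NULL,  -- groups related entries
      ConfiguredValue     NVARCHAR(255) NULL,      -- the value to apply
      PackagePath         NVARCHAR(255) NOT NULL,  -- property path inside the package
      ConfiguredValueType NVARCHAR(20)  NOT NULL   -- data type of the value
  );

  -- Example entry: point a connection manager at a per-environment server.
  INSERT INTO dbo.[SSIS Configurations]
  VALUES (N'ProdConnections',
          N'Data Source=PRODSQL01;Initial Catalog=Sales;Integrated Security=SSPI;',
          N'\Package.Connections[SalesDB].Properties[ConnectionString]',
          N'String');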

Read Best Practices for Integration Services Configurations

Abstract courtesy: Microsoft

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQLAuthority News – PASS Sessions – I will be there!

PASS is now only one week away, and I am all excited about it. I am going to attend the following two sessions for sure, and I encourage all of you to visit them as well. We can all talk about SQL, SQL Integration, as well as Beyond Relational.

The first session I will be attending is by Rushabh Mehta, Managing Director of Solid Quality India.


Overcoming SSIS Deployment and Configuration Challenges

Presenter:
Rushabh Mehta (Solid Quality Learning)

Session Details

It is no secret that a main deficiency of SSIS is deployment. Have you wanted to punch a wall before when trying to move a package from one server to another? You’re not alone. This session will be more than just SSIS deployment anger management. We’ll look at how to overcome the deployment challenges by considering configurations, package and connection security, and deployment approaches. And don’t miss the decision workflows that will help you down the right path on choosing the right storage and deployment approaches depending on your situation!

The second session I am going to attend is by Jacob Sebastian, founder of BeyondRelational.com.

SQL Server 2008 – Create powerful XML Schema collections to validate your XML documents

Presenter:
Jacob Sebastian (beyondrelational.com)

Session Details

XML is being used more and more widely, both inside and outside SQL Server. Stored procedures and functions that accept XML parameters or generate XML documents are part of almost every database project these days. When information is exchanged in XML format, there needs to be an agreement between the sender and receiver about the structure and content of the XML document. Using an XSD (XML Schema Definition) schema, you can validate the structure and content of an XML document. SQL Server introduced XML Schema collections in SQL Server 2005 and added a number of enhancements in SQL Server 2008. An XML data type variable or column that is bound to an XML Schema collection is called TYPED XML. When a TYPED XML column or variable is modified, SQL Server performs the validations defined in the schema collection, and the operation succeeds only if the validation succeeds. In this session we will discuss SQL Server XML Schema collections and see how to create powerful XML Schema collections to validate XML documents with diverse structure and content.
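
To get a taste of what the session covers, here is a minimal sketch of creating an XML Schema collection and binding a TYPED XML variable to it (the schema, collection and element names are illustrative):

  -- Create a simple XML Schema collection describing an <Order> document.
  CREATE XML SCHEMA COLLECTION dbo.OrderSchema AS
  N'<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
      <xsd:element name="Order">
        <xsd:complexType>
          <xsd:sequence>
            <xsd:element name="OrderId"  type="xsd:int"/>
            <xsd:element name="Quantity" type="xsd:int"/>
          </xsd:sequence>
        </xsd:complexType>
      </xsd:element>
    </xsd:schema>';
  GO
  -- A TYPED XML variable bound to the collection: the assignment succeeds
  -- only if the document validates against the schema.
  DECLARE @order xml(dbo.OrderSchema);
  SET @order = N'<Order><OrderId>1</OrderId><Quantity>5</Quantity></Order>';  -- OK
  -- SET @order = N'<Order><Quantity>five</Quantity></Order>';  -- would fail validation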

I am also preparing my schedule, as I want to attend as many sessions as I can. Please read the following related post about SQL PASS.

Related Articles:
Why I am Going to Attend PASS Summit Unite 2009 – Seattle

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Introduction to Business Intelligence – Important Terms & Definitions

What is Business Intelligence

Business intelligence (BI) is a broad category of application programs and technologies for gathering, storing, analyzing, and providing access to data from various data sources, thus providing enterprise users with reliable and timely information and analysis for improved decision making. To put it simply, BI is an umbrella term that refers to an assortment of software applications for analyzing an organization’s raw data for intelligent decision making for business success. BI as a discipline includes a number of related activities, including decision support, data mining, online analytical processing (OLAP), querying and reporting, statistical analysis and forecasting.

Business Intelligence for Success

Business intelligence equips enterprises to gain business advantage from data. Once an organization is powered with BI, it can expect faster turnaround on data collection, come up with fresh ideas for novel business initiatives, form an accurate picture of customer needs and demands, and run more targeted marketing campaigns. In addition, it gains knowledge that can help it advance its brand into the top slot in terms of market share, cut down client churn, reduce overheads and diminish supply-chain delays, among other advantages. Decisions based purely on gut feeling cannot assure success; within BI's fact-based decision-making framework, confident decisions can be made for assured business success. Further, BI makes an organization agile, giving it a competitive edge in today's evolving market conditions.

Business Intelligence Scenario

Every day, a dairy company collects information from 2,000 stores, which is uploaded to servers at its headquarters at the end of the day. The chain's main office uses this information to instantly analyze key operational measures such as the number of dairy products sold, profits, trends, and so forth.

By early the next morning, the company's managers receive performance data. They then verify current revenue, the time required to perform each job, and other performance measures. With BI, franchisees with multiple locations can have consolidated views, as can the company's regional managers.

This scenario clearly explains how implementing Business Intelligence can be very fruitful for an organization. BI can catalyze a business's success in terms of:

Revenues

  • Distinguish the products and services that drive revenues.
  • Rank customers and locations based on profitability.

Customer relationship management

  • Categorize low-value customers and work toward improving their value.
  • Find customer relationship issues early and take timely action to resolve them.

Sales and marketing

  • Aim at high-value customers to minimize marketing risk.
  • Rank the success of product promotions based on product and market segment.
  • Find what is in the sales pipeline.

Business Intelligence Terms

Ad Hoc Query – Ability to create an immediate ‘request for information’ using an analytical reporting tool to answer a specific business-related query.

Aggregation – Predefined summaries based on comprehensive data in the data warehouse.

Atomic data – The most detailed, lowest-level data in the data warehouse.

Attribute – Field represented by a column in reports, tables and charts.

Cube – A powerful analysis tool that is used for viewing and analyzing data in a multi-dimensional format, from any angle, in any combination, through a ‘slice and dice’ and drilldown approach.

Data – Factual information for analysis.

Data Mart – A logical subset of related information, typically around one or a few business processes, or a particular subject area.

Data Warehouse – A repository of well-organized corporate data for Business Analysis and Reporting. It is also a collection of data marts.

Database – A collection of data arranged for convenient and quick search and retrieval.

OLAP – Online Analytical Processing is a category of database software that provides an interface to help users quickly and interactively analyze data across a variety of dimensions.

Balanced Scorecard – A performance management tool that summarizes an organization's performance from several standpoints on a single page.

Dashboard – A reporting tool that presents key indicators on a single screen, which includes measurements, metrics, and scorecards.

Data Cube – A multidimensional structure (typically over three dimensions) that forms the basis for analysis applications, allowing various types of calculations and aggregations.

Data Mapping – The process of identifying a source data element for each data element in the target data warehouse environment.

Data Quality – Relates to facets of data such as accessibility, completeness, accuracy, consistency, relevance and timeliness.

Data Staging – The area or system where data extraction, transformation and loading operations are executed.

Dimension Table – Contains attributes that describe fact records in the fact table. For instance, student enrollment might be a central fact table with several dimensions linked to it, such as Faculty, Term, and Degree.

Drill Down – A feature that allows users to click on an item to navigate through a hierarchy to a level of greater detail.

EIS – Enterprise Information System – A category of applications and technologies for presenting and analyzing corporate and external data for various management purposes.

ETL – Extract, Transform and Load – The three-step process at the center of data warehousing. In the first step, operational data is extracted from multiple sources; in the next, it is transformed into the required form; in the final step, the data is loaded into the data warehouse. (A minimal T-SQL sketch of the transform-and-load steps appears after this glossary.)

Forecasting – It is the formulation of trends, predictive models, and scenarios to prepare for the future for improved decision making.

Gap analysis – Study of whether the available business data supports business requirements, thus answering questions related to data accessibility, amount of data, missing data, and legacy systems. It examines information and decides on the resources and efforts required to satisfy requirements.

Granularity – Level of detail or summarization of data in the data warehouse. The more the detail, the higher the level of granularity.

Institutional performance management – The process of basing an organization’s actions and decisions on actual measured results of performance. It incorporates performance measures, benchmarks, and goals to attain best results.

Measure – A quantifiable, specific piece of information.

Metrics – Measures of performance that observe progress and evaluate trends within an organization.

Normalization – Standard database design technique for relational databases.

Slice and Dice – Multidimensional tools allow users to view data from any angle. The ability to select various angles to view data from is called slice and dice capability.

Snapshot – View of data at a particular moment.
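
To ground the ETL entry above in code, here is a minimal T-SQL sketch of the transform-and-load steps, assuming the extract step has already populated a staging table (all table and column names are hypothetical):

  -- Transform and load: read raw rows from staging, clean them up, and
  -- insert them into the warehouse fact table.
  INSERT INTO dw.FactOrders (OrderDateKey, CustomerKey, OrderAmount)
  SELECT  CONVERT(INT, CONVERT(CHAR(8), s.OrderDate, 112)) AS OrderDateKey,  -- yyyymmdd date key
          c.CustomerKey,                                                     -- surrogate-key lookup
          s.Amount
  FROM    staging.Orders AS s
  JOIN    dw.DimCustomer AS c ON c.CustomerBusinessKey = s.CustomerId
  WHERE   s.Amount IS NOT NULL;                                              -- basic quality filter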

Reference: Pinal Dave (http://blog.SQLAuthority.com), DNS

SQLAuthority News – SQL Server Accelerator for Business Intelligence (BI)

I had a wonderful experience on my recent Business Intelligence tour. I will write in detail about my experiences at the different locations later. Today, however, I would like to talk about one particular question that was asked at every location: how to use the SQL Server Accelerator for Business Intelligence (BI). Many attendees asked me about this tool.

The SQL Server Accelerator for Business Intelligence (BI) is no longer supported by Microsoft. Microsoft does not provide any support for this solution accelerator and has no plans to release future versions. Microsoft SQL Server 2005 and later versions include most of the functionality of the accelerator and are supported according to the Microsoft Product Lifecycle.

For the same reason, this tool is not included in the course, and new developers should not focus on learning it.

Reference: Pinal Dave (http://blog.sqlauthority.com)