SQL SERVER – SQL Report Builder in dbForge Studio for SQL Server

Modern electronic document management systems can significantly simplify decision-making at an enterprise.

Alongside the opportunities they provide, traditional printed documentation still plays an important role, letting you work conveniently with the required information in the form of reports.

Creating such reports can take a long time, depending on their complexity, so visual editors are traditionally used to speed up the process.

Some of these editors have only basic functionality. Others have extended reporting capabilities. The latter group includes the SQL report builder built into dbForge Studio for SQL Server.

The editor allows you to build a report based on your SQL query literally within a few minutes. To start, simply call the Report Designer command from the main menu.

Select a report type in the dialog that opens.

Select a connection and a data source.

Both tables and views can be selected as a source.

If the data needs to come from several tables, a custom query can be written instead.

After selecting a data source, specify the columns the report will be built on.

You should also specify how this data will be presented in the report.

At the final stage, just specify a name for the report and click the Finish button.

After the wizard finishes, the report editor opens, where you can fine-tune the appearance of the report.

To preview the report, use the Preview button.

Once the report is generated, it can be printed or exported to one of the supported formats.

As you can see, creating and editing a report in the SQL report builder does not take a lot of time, which streamlines the process and reduces development time.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – How to Synchronize Data Across Databases and Servers

If you are a developer, you must have faced the situations which I am going to mention next.

A Developer’s Wish

Have you ever built a database application where your navigation is driven by data?

Have you ever noticed that your queries run much faster on your local server, but when you move them to your test/staging server they suddenly run slow?

Have you ever wished that you could sync your local machine with the master database, so you could run your queries against a near real-time scenario and avoid surprises when you go live in production?

The fact of the matter is that every developer faces situations where they wish they could work directly on live data instead of old historical data on their local machine. If an organization cares about the experience of its developers and its product, it will always strive to provide a tool which makes its developers' lives easy.

Wish Granted

If you ask me for a good tool which can help you synchronize data across multiple databases and servers, I will suggest you take a look at DB Change Manager. It is a very powerful tool which can handle quite a number of different tasks. Here are a few of the daily chores you can rely on DB Change Manager for:

  • Schema Archive and Compare
  • Synchronization and Roll-back
  • Schema Auto-Synchronization
  • Data Compare and Synchronization
  • Data Masking

Click, Click and Go!

If you are a developer given the task of building a tool to synchronize data across two different databases, how difficult would it be to create such a tool? Honestly, very difficult, as we need to take care of so many different things besides data, for example – schema, configuration, security and what not. However, when we use Embarcadero DB Change Manager, all of these problems are taken care of by the tool.

If you have followed my earlier blog post, Synchronize Schema across Multiple Environment – How to Overcome Database Migration Challenge, all the steps are very similar to that demonstration. The only difference between Schema Compare and Data Compare is the very first step, when we start the job. Here is the image where I demonstrate how we can start the Data Comparison Job.

Once you start the Data Comparison Job, it will take you to a very simple wizard with a few steps. Each step is extremely easy to follow, and most of the time you just have to accept the defaults and click Go.

Once you follow the few quick steps of the wizard, you will land on the final report page, where you will see a grid with details related to your changed data. You can export the differences or synchronize the data across your servers.

That’s it!

I wish I could write more about this entire process, but as I mentioned earlier, this tool is extremely simple to use and follow. Most of the time, I just do Click, Click and Go and I am done!

You can download Embarcadero DB Change Manager from here. I am eagerly waiting for your feedback about this product.

Reference: Pinal Dave (http://blog.sqlauthority.com)

MySQL – Profiler : A Simple and Convenient Tool for Profiling SQL Queries

When working with heavily loaded databases, any badly designed query can cause significant performance degradation. Depending on the features of the DBMS in use, various methods are applied for query optimization.

However, one general rule applies to all types of DBMS: the fewer server resources a query consumes, the more effective the query is.

To detect bottlenecks, MySQL provides the EXPLAIN command. It tells you what exactly the database server does when you execute the query.

To obtain data on how a query is executed, it is enough to put EXPLAIN before the query in question. However, adding EXPLAIN to a large number of statements can become rather tedious.
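For example, to see how MySQL plans to execute a query, it is enough to prefix it with EXPLAIN; this minimal sketch uses the products and models tables created in the sample further below:

-- Prefixing a query with EXPLAIN returns the execution plan instead of the result set.
EXPLAIN
SELECT p.maker_id
FROM products AS p
JOIN models AS m ON p.model_id = m.model_id
WHERE m.model_type = 'PC';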

To simplify the search for a problem query, let us use the MySQL Profiler in dbForge Studio for MySQL. It provides commands such as Query Profiling Mode and Generate Execution Plan.

To demonstrate working with the execution plan, let's consider a small example. Suppose we need to find makers of PCs, but not laptops, based on the following data:

CREATE TABLE models (
  model_id INT(11) NOT NULL PRIMARY KEY,
  model_type VARCHAR(10) NOT NULL
) ENGINE = INNODB;

CREATE TABLE products (
  maker_id INT(11) NOT NULL,
  model_id INT(11) NOT NULL,
  PRIMARY KEY (maker_id, model_id),
  CONSTRAINT FK_products_models_model_id
    FOREIGN KEY (model_id)
    REFERENCES models (model_id)
    ON DELETE CASCADE ON UPDATE CASCADE
) ENGINE = INNODB;

INSERT INTO models (model_id, model_type)
VALUES
  (1, 'PC'), (2, 'Laptop'), (3, 'Tablet'),
  (4, 'Phone'), (5, 'PC'), (6, 'Laptop');

INSERT INTO products (maker_id, model_id)
VALUES
  (1, 1), (1, 2), (1, 3), (2, 4),
  (4, 4), (2, 5), (3, 5), (3, 6);

This task can be solved quite trivially:

SELECT DISTINCT p.maker_id
FROM products AS p
JOIN models AS m ON p.model_id = m.model_id
WHERE m.model_type = 'PC'
  AND NOT EXISTS (
    SELECT p2.maker_id
    FROM products AS p2
    JOIN models AS m2 ON p2.model_id = m2.model_id
    WHERE m2.model_type = 'Laptop'
      AND p2.maker_id = p.maker_id
  );

When designing a query, you should always pay attention to repeated reads from the same tables, since they put an additional load on the database server's disk subsystem.

The query can be rewritten without repeated reads by using aggregate functions:

SELECT p.maker_id
FROM products AS p
JOIN models AS m ON p.model_id = m.model_id
WHERE m.model_type IN ('PC', 'Laptop')
GROUP BY p.maker_id
HAVING COUNT(CASE WHEN m.model_type = 'PC' THEN 1 END) > 0
AND COUNT(CASE WHEN m.model_type = 'Laptop' THEN 1 END) = 0;

However, with a large number of records, the JOIN to the models table can still affect performance.

To speed up the query, the schema can be denormalized by duplicating the model_type column from the models table into the products table:

UPDATE products AS p
JOIN models AS m ON p.model_id = m.model_id
SET p.model_type = m.model_type;
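Note that the UPDATE above assumes the products table already has a model_type column. If it does not, the column would first need to be added; a minimal sketch, with the column definition assumed to mirror models.model_type:

-- Add the denormalized column to products before copying values into it
-- (assumed definition, mirroring models.model_type).
ALTER TABLE products
ADD COLUMN model_type VARCHAR(10);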

Also, note that the model_type column is being filtered on, so it is recommended to index it.
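A minimal sketch of such an index (the index name here is arbitrary):

-- Index the column that the queries filter on.
CREATE INDEX idx_products_model_type ON products (model_type);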

At the same time, the choice of columns included in the primary key should be approached carefully: since primary key columns are included in every secondary (nonclustered) index, a wide primary key can increase database size.

As a result, selection is significantly simplified:

SELECT maker_id
FROM products
GROUP BY maker_id
HAVING COUNT(CASE WHEN model_type = 'PC' THEN 1 END) > 0
AND COUNT(CASE WHEN model_type = 'Laptop' THEN 1 END) = 0;

With this, the execution plan becomes much simpler.

However, the root of performance problems does not always lie in queries – it is also very important to configure the database server settings correctly. For this, the Server Variables tab can be used.

There, for convenience, all the settings can be grouped according to the facilities they affect.
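The same settings can also be inspected from any SQL client; for example, a quick way to review the InnoDB-related variables:

-- List all InnoDB-related server variables.
SHOW VARIABLES LIKE 'innodb%';
-- Or check a single setting, such as the buffer pool size.
SHOW VARIABLES LIKE 'innodb_buffer_pool_size';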

There is no universal advice on how to optimize a query. However, using the profiler embedded in dbForge Studio for MySQL, you have a good chance of detecting the performance bottlenecks.

Optimal MySQL server settings can also vary depending on the nature of the workload. Therefore, the server configuration process will be covered in detail in a separate article.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Synchronize Schema across Multiple Environment – How to Overcome Database Migration Challenge

Let us begin today's blog post with an interesting story, one which all of us database developers go through at least once in a lifetime.

Story of Database Developer

Think about a scenario – you have been out of town on vacation for one week and you are just back to work. You learn that there is a major release scheduled next week for the project you have been working on for quite a long time. Two big challenges are presented in front of you in this kind of scenario.

  1. You want to catch up on the changes your team made to the project while you were gone.
  2. You want to prepare a deployment script for the staging server (or a near-“real” server, where you deploy your script for testing before deploying to the live server).

Now when you think about these two tasks, they look relatively simple – you just have to collect a few scripts together – but if you have ever faced this situation in real life, trust me, it is not easy. The catching-up task can feel never-ending if your team is large and you missed a lot while you were away, and preparing the deployment script is no easier.

My Personal Experience

I have personally faced this situation quite a lot. My earlier job involved traveling and I was on the road quite a few times. Every time I returned, there was a change in the code or a new release, and I had to refresh my database server to catch up with the master server's status. It is easy when we have to sync schema changes from the master server to the local server. The same task gets difficult when we want to move the changes we have made on our local system to the staging or production server. If we miss a single script, the entire deployment could fail. For example, if you have written an alter script for a table, but a co-worker's alter script which renames that table executes before yours, your script will fail. When any script fails on a production server – there is nothing but chaos. The entire release team is under pressure to resolve the issue, as it is crucial to fix it as soon as possible in real time.

What a Smart DBA would do?

Well, everybody can do the labor, but we DBAs are not paid for labor, we are paid for our smarts. We can sit and do the task manually, but where is the fun in that? I would rather do something creative in my free time instead of a task I do not like. Synchronizing schemas across environments is no fun job – I find it very tedious and boring. Thankfully, there are a few tools available which can help us out in this kind of scenario.

Every time I face an issue, I immediately Google it. From one such experience, I found Embarcadero's DB Change Manager and quickly downloaded it. It is a very powerful product and I configured it quickly. With the help of this tool, I was able to quickly detect and deploy changes from one system to another. You can download Embarcadero DB Change Manager and try the experiment along with me.

I strongly encourage all of you to watch the video associated with this blog post to see how Embarcadero's Change Manager performs schema compares and creates jobs which automatically synchronize the database. You can download Embarcadero DB Change Manager and try comparing schemas as described in the video.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Back& Reborn as Premier Back End SaaS Provider

Back&, an international provider of Back Office services, is proud to announce the launch of their revolutionary Zero Entry pricing model and Back End SaaS product. By creating a better Back Office, Back& has removed the barriers of cost and time – enabling companies and small businesses of any size to access first-class Back End services.

If you are using SQL Server or MySQL on premises or in the cloud on Amazon or Azure, you can connect your own database. Back& services include automatic, advanced data entry for your business entities, plus advanced search and filters. You can easily create charts, manage your stored procedures with data-driven triggering, and manage your media content, which you can store on premises or in the cloud, in Amazon S3 or in Azure storage.

With decades of collective programming experience, Back& was formed to give any company access to robust, intuitive Back Office solutions often seen only in large corporations. From 2012, Back& bootstrapped to build on that vision, culminating in their exciting Back& platform, which enables database-driven Back Office functionality through the cloud.

Having secured seed funding from Dov Moran, the well-known entrepreneur and inventor of the disk-on-key/USB stick technology, the company is now proudly able to deliver on their vision with their revolutionary ‘clear pricing’ model and a 60-day free trial period. With a three-tiered pricing plan ranging from Starter to the complete Pro package, Back& truly enables businesses of any size to deploy state-of-the-art, cloud-powered Back End services.

“Back End development is for many companies an expensive, time-consuming endeavor. With Back& it doesn’t have to be that way, allowing businesses to spend more time doing what they do best: building and deploying fantastic services,” says Gal Frenkel, Back& CEO. Back& enables mobile and web app companies to focus on their business objectives by removing the need for expensive, time-consuming in-house Back End development. Back& quickly and automatically analyzes the database schema and content characteristics of any application, generating a feature-rich, customized and secure Back Office. With its plug-and-play, drag-and-drop features and intuitive interface, this revolutionary new platform is the complete Back Office solution every business deserves.

Businesses can now sign up for a free 60-day trial with full functionality at http://www.backand.com/pricing

About Back& – http://www.backand.com

Founded in 2011, Back& is the premier provider of cloud-powered Back End SaaS services. With over 10,000 consoles and counting created for a range of international companies, Back& empowers its customers with powerful plug-and-play Back End functionality. With its flexible UI, powerful data entry and workflow capabilities, and quick integration with third-party services, Back& is the premier Back End SaaS platform.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – How to Survive on Free Tools for a Month – ApexSQL Complete – A Free SQL Advanced Intellisense

Note: This blog post is based on ApexSQL Complete, a SQL intellisense add-in. Download it and play along with this blog post.

ApexSQL Complete is a free add-in for auto-completing SQL code which integrates with SQL Server Management Studio and Visual Studio.

The auto-complete feature recognizes and predicts SQL keywords as well as object, schema, database, parameter and variable names. To auto-complete a SELECT statement in a single click, just type SELECT into the query window and check the column in the hint list. The rest of the statement will be auto-completed.

By highlighting an object in the hint list, you'll get the complete DDL statement of the selected object, even if it is encrypted.

Using the Insert full INSERT statement option, you can get the full INSERT statement for a table without having to remember the order of values you need to pass or to write out all the columns. Simply type INSERT in the query window and select the desired table from the hint list.

The Insert procedure parameters automatically option displays the input parameters of a stored procedure in the correct order, and to schema-qualify object names automatically, use the Insert object owner option.

ApexSQL Complete also has a user-friendly Snippets feature with various variables. I'll show the Snippets feature with a simple example using the $PROMPT$ variable, which offers another way to complete statements in the query window.

Create a snippet or edit an existing one in the Snippets dialog.

From the query window, right-click and choose the ApexSQL Complete Insert Snippet command. Select the snippet from the hint list; in this case, because we're using the $PROMPT$ variable, it will show the Column dialog we created.

Enter the variable value (in this simple example, the column name) and click OK. The rest of the statement will be inserted, and you will be prompted with the hint list for choosing the table to complete the FROM clause.

ApexSQL Complete is a SQL intellisense that has more useful options and commands such as:

  • Use the Error list feature to get a list of all syntax errors, updated in real time
  • Use the Hints feature to expand or narrow the auto-complete hint list
  • Use the Code structure feature to display the internal structure of your code
  • Create your own aliases or use auto-created ones. A custom table alias will replace the auto-created one and will be listed in the hint list
  • Add an alias or object name to the columns in the WHERE clause, even if only one object is specified in the FROM clause, and many more

If you're looking for a tool that can make your coding process faster and easier, and stay within your budget, try out this free SQL assistant from ApexSQL.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL Contest – USD 100 Gift Card and Attractive Discount from Devart

Quick Update: To take part in the contest, click here.

No matter how old we get, we all want Santa to give us gifts during the festive season. As the festive season has been approaching, I have been talking to a few developer tools vendors to see if they could come up with something cool for blog readers. If you are familiar with this blog, you must know about Devart. They are a fantastic team of people and make beautiful, easy-to-use tools.

Here are a few of the blog posts which I have written about them:

If you look back, I have used Devart tools for over 2 years now. I am a big fan of their dbForge Studio for SQL Server. It is like one mega pack which contains the multiple products a DBA or developer will need for their daily work. It is a complete set of database development tools for SQL Server, united into a single IDE, that helps developers and administrators increase productivity and tackle routine database development tasks.

The main dbForge products for SQL Server are:

  • dbForge SQL Complete. Type T-SQL queries faster and format code automatically in Microsoft SQL Server Management Studio and Visual Studio to make your work more effective and productive with this SQL formatter tool.
  • dbForge Schema Compare for SQL Server. Compare and synchronize schemas safely, analyze database structure differences, and propagate changes to the required SQL Server database with this tool.
  • dbForge Data Compare for SQL Server. Compare and synchronize data safely and effectively, manage data differences conveniently in a well-designed interface, and customize synchronization with this tool.
  • dbForge Query Builder for SQL Server. Use this tool to create complex queries without having to type the entire code. Query Builder provides a wide range of options for visual management of query results.
  • dbForge Fusion for SQL Server. A set of SQL GUI tools for Microsoft Visual Studio that brings SQL database development and administration tasks into your favorite IDE.
  • And certainly – dbForge Studio for SQL Server – which incorporates most of the functions of the tools mentioned above.

Christmas Offer

Team Devart has come up with a fantastic Christmas offer which will celebrate Christmas every day from today to January 15th, 2013. You can get anywhere from a 20% to 35% discount on their products by taking part in the contest they are running on G+. The contest is very simple and there will be one winner every day, which means there is a great chance of winning for everybody, every day. Here is something I am going to do. Once you take part in Devart's contest, come here and let me know in a comment that you have purchased the product, and I will enter you in my contest where you can win a USD 100 gift card from ThinkGeek. If you are familiar with ThinkGeek, you know you want those interesting gadgets but do not want to spend your money on them. Well, now you have the opportunity to get them. I am going to give one of you a USD 100 gift card for participating in Devart's contest. I will announce the winner on January 16th, 2013.

Click here to learn how to take part in the Devart Christmas Offer contest.

Reference: Pinal Dave (http://blog.sqlauthority.com)