SQL SERVER – Difference Between Login and User – SQL in Sixty Seconds #070

“What is the difference between Login and User in SQL Server?”

This is a very common question I often receive. Yesterday, when I was browsing Facebook, I noticed this question asked once again in the SQLBangalore group. My very good friends Vinod Kumar and Balmukund Lakhani had already answered it there. However, every time I read this question, I realize that not everyone knows the basics of these two concepts. Explaining the difference in depth would take a long time, but let me try to explain it at a very basic level.

Basic Explanation:

SQL Server Login is for Authentication and SQL Server User is for Authorization. Authentication decides whether we have permission to access the server or not, and Authorization decides which operations we can perform in a database. Logins are created at the SQL Server instance level and Users are created at the database level. Multiple users from different databases can be mapped to a single login on the server.
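
To make this concrete, here is a minimal T-SQL sketch. The login, user, and database names below are only placeholders for illustration:

-- Instance level (authentication): can this principal connect to the server?
CREATE LOGIN MyAppLogin WITH PASSWORD = 'Str0ngP@ssw0rd!';
GO
-- Database level (authorization): what can this principal do inside this database?
USE SampleDB;
GO
CREATE USER MyAppUser FOR LOGIN MyAppLogin;
GO
-- Give the user a specific permission inside the database
GRANT SELECT ON SCHEMA::dbo TO MyAppUser;
GO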

I hope this is clear enough. If not, I strongly suggest you watch the following quick video where I explain this concept in extremely simple words.

Action Item

Here are the blog posts I have previously written on the subject of security. You can read them over here:

You can subscribe to my YouTube Channel for frequent updates.

Reference: Pinal Dave (http://blog.sqlauthority.com)


SQL SERVER – What is Biml and How Can it Help Me with SQL Server BI? – Notes from the Field #036

[Note from Pinal]: This is the 36th episode of the Notes from the Field series. One of the common questions I receive every other day is: I have learned BI, but I feel there is no automation in the BI field – what should I do? How do I go beyond learning BI? How can I fill the gap between BIDS and SSDT? If you have been reading this blog, you know that when it is about BI, I always take help from Linchpin People, who are BI experts. I requested Reeves from Linchpin People to help me answer this unique question.

In this episode of the Notes from the Field series, database expert Reeves Smith explains what Biml is and how it can help you with SQL Server BI. Read the experience of Reeves in his own words.


With all of the new technologies to learn and implement, I wanted to introduce you to some of the benefits of spending the time to learn Business Intelligence Markup Language (Biml). You can use this XML-based language to create and automate Microsoft SQL Server Business Intelligence (BI) objects. Biml currently supports BI objects like SSIS packages and SSAS dimensions, measures, and cubes.

After a quick overview of Biml, I’ll demonstrate how to use it by providing a walkthrough example.

Overview

You can use Biml to create tables, flat files, SSIS packages, and SSAS models. This language is human readable, unlike the XML that is represented within dtsx files and SSAS objects. This readability can help reduce the learning curve required for most programming languages.

Biml will not teach you SQL Server Integration Services (SSIS) or SQL Server Analysis Services (SSAS), but it can enhance and increase your productivity with these tools in certain use cases. Biml is not a silver bullet that can solve every SSIS/SSAS development project problem, but I will discuss some of the scenarios where Biml excels.

The Biml language is supported within two Integrated Development Environments (IDEs): Business Intelligence Development Studio (BIDS) and SQL Server Data Tools (SSDT), with either the free BIDS Helper add-in or the Mist application from Varigence (a purchased product). Currently, BIDS Helper supports SQL Server 2005, 2008, and 2008 R2 BIDS, and SQL Server 2012 SQL Server Data Tools. This article will focus on the BIDS Helper add-in and its capabilities with SSIS.

Simple Walk Through

This walkthrough will show how to create a Biml file that creates an SSIS package to move data from the AdventureWorks database to a staging database. The following script file will create all of the needed objects. If you have the AdventureWorks database installed and a table called HumanResources.Department, you can skip Listing 1.

CREATE DATABASE [AdventureWorks]
GO
USE [AdventureWorks]
GO
CREATE SCHEMA [HumanResources] AUTHORIZATION [dbo]
GO
CREATE TABLE [HumanResources].[Department]
(
[DepartmentID] [smallint] IDENTITY(1,1) PRIMARY KEY NOT NULL,
[Name] [nvarchar](50) NOT NULL,
[GroupName] [nvarchar](50) NOT NULL,
[ModifiedDate] [datetime] DEFAULT (GETDATE()) NOT NULL
)
ON [PRIMARY]
GO
INSERT INTO [HumanResources].[Department] ([Name], [GroupName]) VALUES (N'Engineering',N'Research and Development')
INSERT INTO [HumanResources].[Department] ([Name], [GroupName]) VALUES (N'Tool Design',N'Research and Development')
INSERT INTO [HumanResources].[Department] ([Name], [GroupName]) VALUES (N'Sales',N'Sales and Marketing')
INSERT INTO [HumanResources].[Department] ([Name], [GroupName]) VALUES (N'Marketing',N'Sales and Marketing')
INSERT INTO [HumanResources].[Department] ([Name], [GroupName]) VALUES (N'Purchasing',N'Inventory Management')
INSERT INTO [HumanResources].[Department] ([Name], [GroupName]) VALUES (N'Research and Development',N'Research and Development')
INSERT INTO [HumanResources].[Department] ([Name], [GroupName]) VALUES (N'Production',N'Manufacturing')
INSERT INTO [HumanResources].[Department] ([Name], [GroupName]) VALUES (N'Production Control',N'Manufacturing')
INSERT INTO [HumanResources].[Department] ([Name], [GroupName]) VALUES (N'Human Resources',N'Executive General and Administration')
INSERT INTO [HumanResources].[Department] ([Name], [GroupName]) VALUES (N'Finance',N'Executive General and Administration')
GO

Listing 1 – AdventureWorks Database Objects

CREATE DATABASE [AdventureWorksStage]
GO
USE [AdventureWorksStage]
GO
CREATE SCHEMA [HumanResources] AUTHORIZATION [dbo]
GO
CREATE TABLE [HumanResources].[Department]
(
[DepartmentID] [smallint] PRIMARY KEY NOT NULL,
[Name] [nvarchar](50) NOT NULL,
[GroupName] [nvarchar](50) NOT NULL,
[ModifiedDate] [datetime] NOT NULL
)
ON [PRIMARY]
GO

Listing 2 – AdventureWorksStage Database Objects

To check that you have BIDS Helper installed within your Visual Studio environment, select the Tools menu and then Options. The BIDS Helper option will display on the left if it is installed, as you can see in Figure 1. If you need help installing the BIDS Helper add-in, see Step 2 in the Stairway to Biml: Biml Basics – Level 2 (link).

Figure 1 – Options windows with BIDS Helper installed

After verifying that BIDS Helper is installed, create a new SSIS project: from the File menu, select New, then Project, as Figure 2 shows.

Figure 2 – New Integration Services Project

Due to a Visual Studio limitation, right-clicking the Miscellaneous folder will not bring up the Biml context menu shown in Figure 3. Instead, in the Project window, right-click the project or the SSIS Packages folder and select Add New Biml File. A new Biml file (BimlScript.biml) is added to the Miscellaneous folder.

Figure 3 – Biml Context Menu

Replace all of the code within the Biml File with the code from Listing 3.

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <OleDbConnection Name="AdventureWorks" ConnectionString="Provider=SQLNCLI10;Integrated Security=SSPI;Initial Catalog=AdventureWorks;Data Source=localhost;" />
    <OleDbConnection Name="AdventureWorksStage" ConnectionString="Provider=SQLNCLI10;Integrated Security=SSPI;Initial Catalog=AdventureWorksStage;Data Source=localhost;" />
  </Connections>
  <Packages>
    <Package Name="HumanResources_Department_Biml" ConstraintMode="Linear">
      <Tasks>
        <Dataflow Name="Data Flow Task">
          <Transformations>
            <OleDbSource Name="OLE DB Source" ConnectionName="AdventureWorks">
              <ExternalTableInput Table="[HumanResources].[Department]" />
            </OleDbSource>
            <OleDbDestination Name="OLE DB Destination" ConnectionName="AdventureWorksStage">
              <InputPath OutputPathName="OLE DB Source.Output" />
              <ExternalTableOutput Table="[HumanResources].[Department]" />
            </OleDbDestination>
          </Transformations>
        </Dataflow>
      </Tasks>
    </Package>
  </Packages>
</Biml>

Listing 3 – Sample Biml code

Note: The connection information in the Biml file must point to the SQL Server instances that contain the AdventureWorks and AdventureWorksStage databases. If you are using another version of SQL Server, the provider portion of the connection string (Provider=SQLNCLI10;) might need to be changed.

After replacing the code in the Biml file with the code in Listing 3, right click the file and choose the Check Biml for Errors menu option shown in Figure 4.

Figure 4 – Check Biml for Errors

If everything is configured correctly, you will receive the message shown in Figure 5.

Figure 5 – No errors

If you receive any errors, you will need to correct them before selecting the Generate SSIS Package menu option.

Select the Generate SSIS Package menu option to create a new Integration Services package. A working package named HumanResources_Department_Biml, as defined by the Name attribute of the Package element in the Biml file, will be created within the Packages folder.

Additional Walk-Through Examples

To add an Execute SQL task that truncates the stage table prior to the load, add the following code after the <Tasks> node and before the <Dataflow Name="Data Flow Task"> element:

<ExecuteSQL Name="Truncate Table" ConnectionName="AdventureWorksStage">
 <DirectInput>TRUNCATE TABLE [HumanResources].[Department]</DirectInput>
</ExecuteSQL>

At this point in the walkthrough, Biml has not saved you much effort. But what if you wanted to create additional SSIS packages for all of the HumanResources tables within the AdventureWorks database? How much development effort would it take to update the Biml file to point to each of the tables in the HumanResources schema and select Generate SSIS Package? Would those packages be consistent with the packages built before? Would the testing already done on the previous package benefit the newly created packages?

At this point, copy and paste the Biml code into a new Biml file and move data from other tables. All of the table objects will need to exist, so you might have to add some staging tables to your stage database, as sketched below.
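
For example, if you also wanted to stage the HumanResources.Shift table, the staging table might look like the following. The column list here is an assumption; match it to the actual source table in your copy of AdventureWorks.

USE [AdventureWorksStage]
GO
CREATE TABLE [HumanResources].[Shift]
(
[ShiftID] [tinyint] PRIMARY KEY NOT NULL,
[Name] [nvarchar](50) NOT NULL,
[StartTime] [datetime] NOT NULL,
[EndTime] [datetime] NOT NULL,
[ModifiedDate] [datetime] NOT NULL
)
ON [PRIMARY]
GO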

Practical Use

The first question that comes up after starting a discussion on Biml is, "Why would I use Biml? I can create all of the SSIS packages I need in the current IDE."

One use case for Biml is the automation of similar design patterns, like adding more packages to the walkthrough above. SSIS packages typically move data from a source to a destination. This pattern is repeated for each source and destination and can produce multiple SSIS packages. With some SSIS design patterns, the only items that change are the source table name and the destination table name. What would it be like to spend the time to properly design and test a package that moves data from one table to another, and then duplicate that package with ease, changing only the relevant information? Biml enables this type of development. With Biml, you are able to focus the development effort on design and remove some of the repetitive work.

As the variation between packages increases, Biml can become a less viable option. With Biml Script (a scripting language within Biml), you are able to create designs that vary and can adapt to changing Extract-Transform-Load (ETL) requirements. Biml Script enables you to programmatically change items within the file without using cut and paste. (Biml Script was not demonstrated within this article.)

Looking Ahead

Biml is much more robust than what was demonstrated in this article, but I wanted to start the discussion with a simple example to get you familiar with it. Adding Biml Script can create programmatic solutions that automate SSIS package development. In a future article I will demonstrate how to add Biml Script to the existing code to increase package automation.

To follow along with the upcoming articles, install the AdventureWorks database from the following URL:. This database will also give you additional options to test and experiment with.

Side Note: The Mist IDE is able to extend Biml even further by updating multiple SSIS packages at once. This enables a maintenance option that has never been available within the SSIS development environment. It's really cool, but something I will have to discuss in a future article.

Stay Tuned.

If you want to get started with BIML with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Automate Database Operations for Success – Notes from the Field #035

[Note from Pinal]: This is the 35th episode of the Notes from the Field series. Doing things faster and easier is always our goal. We all want to do the things which generate the maximum return on investment with the least effort. For database administrators, this quite often becomes a catch-22 situation.

In this episode of the Notes from the Field series, database expert Brian Kelley explains how to automate various database administrator tasks for the success of the business and of our own efforts. Read the experience of Brian in his own words.


In the Linchpin People mindset, it’s not about how busy you are, but how valuable you are. You don’t increase your value to your organization by your busyness. You do so by accomplishing the work. There are two parts to this:

  • Accomplish more work.
  • Accomplish more important work.

Initially, if your path follows that of most people, you'll be asked to do more. This is your opportunity to accomplish more work. As you succeed with the additional work you receive, you will likely be given opportunities to take on more important work. If this never happens, you're probably at the wrong organization. Let's assume you are at a good one and you're given more important tasks. Obviously, if you succeed at the more important work, you'll continue to be given more opportunities. And you'll likely be given tasks and projects that are more and more important to your organization. This is how you become a linchpin.

So how do you complete more work? One part of the answer is automation. Since we're IT professionals, automation should be near and dear to us. I recently wrote about being a "lazy" DBA. I used the word lazy to indicate that there are some manual tasks we don't want to repeat. A "lazy" engineer or IT pro tries to automate these tasks in order to reduce the time spent on them. That frees up the IT pro to have more time for the more important work.

Let’s look at some things that we should automate as DB Pros:

Database Administration:

Build scripts that can do the following:

  • Check disk space on servers (a sketch follows this list).
  • Check database available space.
  • Check security on key servers and databases.
  • Verify backups were taken properly.
  • Perform test restores of key backups.
  • Parse the SQL error log on each SQL Server for important information (failed logins, use of sp_configure, etc.).
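
For the first item in that list, checking disk space, a minimal sketch using sys.dm_os_volume_stats (available from SQL Server 2008 R2 SP1 onward) might look like this:

-- Free and total space on every volume that hosts a database file
SELECT DISTINCT vs.volume_mount_point,
vs.available_bytes / 1048576 AS free_mb,
vs.total_bytes / 1048576 AS total_mb
FROM sys.master_files AS mf
CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.file_id) AS vs;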

For instance, if I want to check that a full backup has run on every database within the last 24 hours, I might automate running the following query to report back the databases where I do not have a proper backup:

SELECT D.name,
       BS.database_name,
       ISNULL(CONVERT(VARCHAR, BS.LastBackup, 106), 'No recent backup') AS LastBackup
FROM master.sys.databases AS D
LEFT JOIN (SELECT database_name,
                  MAX(backup_finish_date) AS LastBackup
           FROM msdb.dbo.backupset
           WHERE type = 'D'
           GROUP BY database_name) AS BS
    ON D.name = BS.database_name
WHERE BS.database_name IS NULL
   OR BS.LastBackup < (DATEADD(HOUR, -24, GETDATE()))
ORDER BY D.name;

We should also use automation like policy based management or custom scripts to enforce settings. Some examples that we should consider:

  • database ownership (a sample check follows this list)
  • recovery models
  • membership in key roles (sysadmin, securityadmin, db_owner, etc.)
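
For the first of those settings, database ownership, a hedged check might be as simple as flagging any database whose owner is not the account you expect (adjust 'sa' to your own standard):

-- Databases whose owner is not the expected account
SELECT name, SUSER_SNAME(owner_sid) AS owner_name
FROM sys.databases
WHERE SUSER_SNAME(owner_sid) <> 'sa';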

And if I know that every user database on a given server should be in the full recovery model, I can enforce that by scheduling the following script:

DECLARE cursdbs CURSOR FAST_FORWARD FOR
SELECT name
FROM sys.databases
WHERE state_desc = 'ONLINE'
  AND recovery_model_desc <> 'FULL'
  AND name NOT IN ('master', 'tempdb', 'msdb', 'model');
DECLARE @DBName SYSNAME;
DECLARE @SQL NVARCHAR(MAX);
OPEN cursdbs;
FETCH NEXT FROM cursdbs INTO @DBName;
WHILE (@@FETCH_STATUS = 0)
BEGIN
    PRINT 'ALTERING DATABASE: ' + @DBName;
    SET @SQL = 'ALTER DATABASE [' + @DBName + '] SET RECOVERY FULL;';
    EXEC (@SQL);
    FETCH NEXT FROM cursdbs INTO @DBName;
END
CLOSE cursdbs;
DEALLOCATE cursdbs;

You do want to review that output. After all, if you just switched the DB to full recovery mode, you want to ensure you restart the log backup chain with a full or differential database backup.
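
For example, a full backup taken right after the switch restarts the chain. The database name and file path below are only placeholders:

-- Restart the log backup chain after switching to FULL recovery
BACKUP DATABASE [YourDatabase]
TO DISK = N'D:\Backups\YourDatabase_Full.bak'
WITH INIT, CHECKSUM;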

Database Development:

Encourage continuous integration methods to include database code. This will require tests to validate no new code “breaks the build.” Make sure that these builds come directly from source control.

If you are doing tests that require restores of databases and the application of scripts, write the automation to do these tasks. It makes the tasks repeatable, it reduces the possibility of error, and it frees you up so you don’t have to manually run each step.

With that said, write scripts for anything you will have to repeat when developing a solution. For instance, you might need scripts to:

  • Add, delete, or change data (a sketch follows this list).
  • Change security on a database or server.
  • Encrypt / decrypt setup data.
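
As a hedged sketch of the first item, a repeatable data-change script can wrap the change in a transaction and verify its effect before committing. The table and values here are placeholders:

-- Repeatable, verifiable data change (placeholder table and values)
BEGIN TRANSACTION;
UPDATE dbo.AppSetting
SET SettingValue = N'1'
WHERE SettingName = N'FeatureXEnabled';
-- Commit only if exactly one row changed
IF @@ROWCOUNT = 1
    COMMIT TRANSACTION;
ELSE
BEGIN
    ROLLBACK TRANSACTION;
    RAISERROR('Unexpected row count; change rolled back.', 16, 1);
END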

Can you automate too much?

Yes, you can. Note that in both cases I did include some reporting. If you build automation where you're not doing any checking, that's wrong. Automation frees you from having to do tedious steps; it doesn't remove your responsibility or accountability. If you don't have anything to check, you don't actually know whether the work completed successfully. Don't assume. Check.

If you want to get started with performance tuning and database security with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Windows File/Folder and Share Permissions – Notes from the Field #029

[Note from Pinal]: This is the 29th episode of the Notes from the Field series. Security is a task we should give to the experts. If there is a small oversight or misstep, there is a good chance that the security of the organization is compromised. This is very true, but there are always devil's advocates who believe everyone should know security. As a DBA and administrator, I often see people not taking interest in Windows security, hiding behind the excuse of not being a Windows Server expert. We often miss the important mission statement for the success of any organization – teamwork. In this blog post Brian tells the story in a very interesting, lucid language. Read on!

In this episode of the Notes from the Field series, database expert Brian Kelley explains a very crucial issue DBAs and developers face on their production servers. Linchpin People are database coaches and wellness experts for a data driven world. Read the experience of Brian in his own words.


When I talk security among database professionals, I find that most have at least a working knowledge of how to apply security within a database. When I talk with DBAs in particular, I find that most have at least a working knowledge of security at the server level, if we're speaking of SQL Server. One area that is continually weak, however, is Windows file/folder (NTFS) and share permissions.

The typical response is, “I’m a database developer and the Windows system administrator is responsible for that.” That may very well be true – the system administrator may have the primary responsibility and accountability for file/folder and share security for the server. However, if you’re involved in the typical activities surrounding databases and moving data around, you should know these permissions, too. Otherwise, you could be setting yourself up where someone is able to get to data he or she shouldn’t, or you could be opening the door where human error puts bad data in your production system.

File/Folder Permission Basics:

I wrote about file/folder permissions a few years ago to give the basic permissions that are most often seen. Here’s what you must know as a minimum at the file/folder level:

  • Read - Allows you to read the contents of the file or folder. Having read permissions allows you to copy the file or folder.
  • Write – Again, as the name implies, it allows you to write to the file or folder. This doesn't include the ability to delete; however, nothing stops a person with this access from overwriting a file with an empty one.
  • Delete - Allows the file/folder to be deleted. If you overwrite files, you may need this permission.
  • Modify - Allows read, write, and delete.
  • Full Control - Same as modify + the ability to assign permissions.

File/folder permissions aggregate unless there is a DENY (which trumps, just like within SQL Server). This means that if a person is in one group that gives Read and another group that gives Write, that person has both Read and Write permissions.

As you might expect me to say, always apply the Principle of Least Privilege. This likely means that any additional permission you might add does not need Full Control.

Share Permission Basics:

At the share level, here are the permissions.

  • Read - Allows you to read the contents on the share.
  • Change - Allows you to read, write, and delete contents on the share.
  • Full control - Change + the ability to modify permissions.

Like with file/folder permissions, these permissions aggregate, and DENY trumps.

So What Access Does a Person / Process Have?

Figuring out what someone or some process has depends on how the location is being accessed:

  • Access comes through the share (\\ServerName\Share) – a combination of permissions is considered.
  • Access is through a drive letter (C:\, E:\, S:\, etc.) – only the file/folder permissions are considered.

The only complicated one here is access through the share. Here’s what Windows does:

  1. Figures out what the aggregated permissions are at the file/folder level.
  2. Figures out what the aggregated permissions are at the share level.
  3. Takes the most restrictive of the two sets of permissions.

You can test this by granting Full Control over a folder (this is likely already in place for the Users local group) and then setting up a share. Give only Read access through the share, including to Administrators (if you're creating a share, you likely have membership in the Administrators group). Try to read a file through the share. Now try to modify it. The share-level permission is the more restrictive of the two, and it only allows Read. Therefore, if you come through the share, Read is all you get.

Does This Knowledge Really Help Me?

In my experience, it does. I've seen cases where sensitive files were accessible by every authenticated user through a share. Auditors, as you might expect, have a real problem with that. I've also seen cases where files to be imported as part of the nightly processing were overwritten by files intended for development. And I've seen cases where a process couldn't get to the files it needed because someone changed the permissions.

If you know file/folder and share permissions, you can spot and correct these types of security flaws. Given that there are a lot of database professionals who don't understand these permissions, knowing them sets you apart. And if you're able to help on critical processes, you begin to set yourself up as a linchpin (link to .pdf) for your organization.

If you want to get started with performance tuning and database security with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Say No to DB Data Roles – SQL Security – Notes from the Field #022

[Note from Pinal]: This is the 22nd episode of the Notes from the Field series. Security is very important, and we all realize that. However, when it is about implementing security, we are not always sure of the right path to take. If we do not have enough knowledge, we can end up damaging ourselves. The DB data roles are a very similar concept – when implemented poorly, they can compromise your server security.

In this episode of the Notes from the Field series, database expert Brian Kelley explains a very crucial issue DBAs and developers face on their production servers. Linchpin People are database coaches and wellness experts for a data driven world. Read the experience of Brian in his own words.


I am prejudiced against two fixed database roles: db_datareader and db_datawriter. When I give presentations or talk to customers, some are surprised by my stance. I have two good reasons to recommend against these two roles (and their counterparts, db_denydatareader and db_denydatawriter).

A Violation of the Principle of Least Privilege

The first reason is they violate the Principle of Least Privilege. If you’re not familiar with this security principle, it’s really simple: give permissions to do the job – no more and no less. The db_datareader and db_datawriter roles give access to all tables and views in a given database. Most of the time, this is more access than what is needed. This is a violation of the Principle of Least Privilege.

There are some cases where a user needs such access, but there is always the possibility that a new table or view will be added which the user should not have access to. This creates a dilemma: do I create new roles and remove the user from db_datareader or db_datawriter, or do I start using DENY permissions? The first involves additional work. The second means the security model is more complex. Neither is a good solution.
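
To illustrate the dilemma, here is a hedged sketch of both options. The role, table, and user names are placeholders only:

-- Option 1: replace db_datareader with a role granted only what is needed
CREATE ROLE LimitedReaders;
GRANT SELECT ON HumanResources.Department TO LimitedReaders;
GO
-- Option 2: keep db_datareader but DENY each new sensitive object
DENY SELECT ON HumanResources.JobCandidate TO ReportingUser;
GO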

Failing the 3 AM Test

The second reason is the use of these roles violates what I call the “3 AM test.” The 3 AM test comes from being on call. When I am awakened at 3 AM because of a production problem, is this going to cause me unnecessary problems? If the answer is yes, the solution fails the test. I classify db_datareader and db_datawriter role usage as failing this test. Here’s why: the permissions granted are implicit. As a result, when I’m still trying to wake up I may miss that a particular account has permissions and is able to perform an operation that caused the problem. I’ve been burned by it in production before. That’s why it fails my test.

An Example

To see why this is an issue, create a user without a login in a sample database. Make it a member of the db_datareader role. Then create a role and give it explicit rights to a table in the database. This script does so in the AdventureWorks2012 database:
USE AdventureWorks2012;
GO
CREATE USER TestDBRoleUser WITHOUT LOGIN;
GO
EXEC sp_addrolemember @membername = 'TestDBRoleUser', @rolename = 'db_datareader';
GO
CREATE ROLE ExplicitPermissions;
GO
GRANT SELECT ON HumanResources.Employee TO ExplicitPermissions;
GO

Pick any table or view at random and check the permissions on it. I’m using HumanResources.Employee:

We see the permissions for the role with explicit permissions. We don't, however, see the user who is a member of db_datareader. When first troubleshooting, it's easy to assume that the user doesn't have permissions. This assumption means time is wasted trying to figure out how the user was able to cause the production problem. Only later, when someone thinks to check db_datareader, will the root cause be spotted. This is why I say these roles fail the 3 AM test.
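
To avoid that blind spot at 3 AM, check both explicit permissions and role membership. Here is a hedged sketch using the catalog views:

-- Explicit object-level permissions in the current database
SELECT pr.name AS principal_name,
       pe.permission_name,
       pe.state_desc,
       OBJECT_SCHEMA_NAME(pe.major_id) AS schema_name,
       OBJECT_NAME(pe.major_id) AS object_name
FROM sys.database_permissions AS pe
JOIN sys.database_principals AS pr
    ON pe.grantee_principal_id = pr.principal_id
WHERE pe.class_desc = 'OBJECT_OR_COLUMN';

-- Membership in database roles such as db_datareader
SELECT r.name AS role_name, m.name AS member_name
FROM sys.database_role_members AS drm
JOIN sys.database_principals AS r ON drm.role_principal_id = r.principal_id
JOIN sys.database_principals AS m ON drm.member_principal_id = m.principal_id;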

If you want to get started with performance tuning and database security with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – Auditors, Passwords and Exceptions – Notes from the Field #010

[Note from Pinal]: This is the 10th episode of the Notes from the Field series. Every other day I read online about various security breaches happening worldwide – some have lost passwords, some have lost phone numbers, and some have lost credit card information. Security is the most important aspect of an application. The worst scenario is when thieves come to your door and find the key you kept under your doormat, which they were able to guess. A password is just like a key to get into the database. If you keep your password so easy that everyone can guess it, it is very bad for your system. You need to keep it complex and also follow your organization's policy. In this blog post Brian hits on a very important concept about application security.

In this episode of the Notes from the Field series, database expert Brian Kelley explains a very crucial issue DBAs and developers face on their production servers. Linchpin People are database coaches and wellness experts for a data driven world. Read the experience of Brian in his own words.


Auditors focus on logins, especially the password settings. Why? Because it's easy to audit, and those audits find that too many organizations still get them wrong. With respect to SQL Server, you can't control things like how many days before the password expires or what the complexity rules are. Those are controlled by the operating system, and SQL Server follows the operating system's rules. What can be controlled is whether a login follows the login policy rules. There are actually three options here (a code sketch follows the list):

  • The login doesn’t follow any of the policy rules.
  • The login follows all of the policy rules.
  • The login follows all of the policy rules except for password expiration.
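
These three states map to the CHECK_POLICY and CHECK_EXPIRATION options of CREATE LOGIN and ALTER LOGIN. A hedged sketch, with a placeholder login name and password:

-- Follows all of the policy rules, including expiration
CREATE LOGIN AuditDemoLogin
    WITH PASSWORD = 'Str0ngP@ssw0rd!',
    CHECK_POLICY = ON,
    CHECK_EXPIRATION = ON;
-- Follows the policy rules except password expiration
ALTER LOGIN AuditDemoLogin WITH CHECK_EXPIRATION = OFF;
-- Follows none of the policy rules
ALTER LOGIN AuditDemoLogin WITH CHECK_POLICY = OFF;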

Auditors are interested in exceptions. When something doesn’t follow the rules or doesn’t follow all of the rules, an auditor gets interested. If you’re asked to show the auditors evidence of what a login follows, there are two ways to do this: via the GUI or via a script. In many cases auditors will ask for a screenshot that looks something like:

Auditing SQL Login (GUI)

Note the areas I’ve highlighted. This is the information most auditors are looking for with respect to logins that are created within SQL Server (not domain accounts like domain users or groups). If you only have a small number of logins, then using the GUI is not very time consuming. If you have a lot of SQL logins or SQL Servers to audit, however, you want to script this. While an auditor may ask for a screenshot, what an auditor really wants is the information contained in the screenshot. Clicking through every login and taking a screenshot is a waste of time. Thankfully, there is a particular catalog view, sys.sql_logins, that contains all the information the auditor will want.

The catalog view sys.sql_logins contains all the same information as sys.server_principals plus other information that is applicable only to SQL Server-based logins. What’s more, it only contains the SQL Server-based logins. It doesn’t contain any information on Windows users and groups. Therefore, it’s the right catalog view to use to satisfy the auditors asking about logins and passwords. This query will return the information they need:

SELECT name, is_disabled,
is_policy_checked, is_expiration_checked
FROM sys.sql_logins;

You can still give them a screenshot; however, instead of giving them a screenshot of each and every login, you should give them a screenshot of the query and its results. In the following screenshot, I've flagged two logins that would be of interest to the auditors because they do not follow the password policy (is_policy_checked = 0).

SQL Login info by Query

If you have a lot of SQL Servers, you can simply automate this query with a PowerShell script that queries each server and dumps the results to a text file. That way you save a lot of time providing the results needed, freeing you up for other tasks. Remember, auditors need the information; they don't necessarily need a screenshot of each and every login.
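
If the auditors only care about the exceptions, the same query can be filtered before the results are dumped out. A hedged sketch:

-- Only the logins that do not fully follow the password policy
SELECT name, is_disabled,
       is_policy_checked, is_expiration_checked
FROM sys.sql_logins
WHERE is_policy_checked = 0
   OR is_expiration_checked = 0;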

If you want to get started with performance tuning and database security with the help of experts, read more over at Fix Your SQL Server.

Reference: Pinal Dave (http://blog.sqlauthority.com)

SQL SERVER – What Permissions I have on Database I am Connected to?

Here is the first thing I do when I get access to any new server – I check what permissions I have with respect to the database I am connected to. The fact of the matter is that most database consultants and administrators want to know what permissions they have on a database when they get access to any new system. I personally want to have only the permissions which help me accomplish my work – not a single permission more or less. If I have more permissions than I need, I request the team to take them away so I do not make any mistakes accidentally.

Here is the script which you can execute to get a list of the permissions you have on the database you are connected to.

SELECT * FROM fn_my_permissions (NULL, 'DATABASE');
GO

Here is the result set of the above query when I executed it with sa or admin access.

Here is the result set of the above query when I executed it with public access.
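
The same function also works at the object level if you want to see what you can do against a specific table. A hedged sketch, using a placeholder table name:

SELECT * FROM fn_my_permissions('dbo.SampleTable', 'OBJECT');
GO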

Reference: Pinal Dave (http://blog.sqlauthority.com)