Laerte Junior is my very dear friend and a PowerShell expert. At my request he has agreed to share his PowerShell knowledge with us. Laerte Junior is a SQL Server MVP and, through his technology blog and Simple-Talk articles, an active member of the Microsoft community in Brazil. He is a skilled Principal Database Architect, Developer, and Administrator, specializing in SQL Server and PowerShell programming, with over 8 years of hands-on experience. He holds a degree in Computer Science, has been awarded a number of certifications (including MCDBA), and is an expert in SQL Server 2000 / SQL Server 2005 / SQL Server 2008 technologies. Let us read the blog post in his own words.
I was reading an excellent post from my great friend Pinal about loading data from CSV files into SQL Server, SQL SERVER – Importing CSV File Into Database – SQL in Sixty Seconds #018 – Video, and was honored to write another guest post on SQL Authority about the magic of PowerShell.
The biggest topic at TechEd NA this year was PowerShell. Folks, if you still don’t know about it, it is time to catch up. Remember that Server Core installations are the future for SQL Server, and consequently so is the shell. You don’t want to be left out of this, right?
Save the Out-DataTable and Write-DataTable functions in a module and add it to your profile. In my case, the module is called functions.psm1.
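If you have never set this up before, here is a minimal sketch of what I mean (the module folder and profile location are assumptions; adjust them to your environment). Paste the Out-DataTable and Write-DataTable functions into functions.psm1 under a folder named Functions in your modules path, and import it from your profile so the functions are always available:

# In your profile script ($PROFILE)
# Assumes functions.psm1 lives in ...\WindowsPowerShell\Modules\Functions\
Import-Module Functions -Force -DisableNameChecking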
To have some data to play with, I created 10 CSV files with the same content: I simply exported the SQL Server error log to a CSV file and created 10 copies of it.
# Just create a CSV with data to import, using the SQL Server error log
[reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
$ServerInstance = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $Env:ComputerName
$ServerInstance.ReadErrorLog() | Export-Csv -Path "c:\SQLAuthority\ErrorLog.csv" -NoTypeInformation
for ($Count = 1; $Count -le 10; $Count++) {
    Copy-Item "c:\SQLAuthority\ErrorLog.csv" "c:\SQLAuthority\ErrorLog$($Count).csv"
}
Now, in my path c:\SQLAuthority, I have 10 CSV files.
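If you prefer to confirm that from the console, a quick listing shows them (this check is mine, not part of the original post):

Get-ChildItem "c:\SQLAuthority\*.csv" | Select-Object Name, Length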
Now it is time to create a table. In my case, the SQL Server is called R2D2, the database is SQLServerRepository, and the table is CSV_SQLAuthority.
CREATE TABLE [dbo].[CSV_SQLAuthority](
    [LogDate] [datetime] NULL,
    [Processinfo] [varchar](20) NULL,
    [Text] [varchar](MAX) NULL
)
Let’s play a little bit.
I want to import all the CSV files from the path into the table synchronously:
# Importing synchronously
$DataImport = Import-Csv -Path (Get-ChildItem "c:\SQLAuthority\*.csv")
$DataTable = Out-DataTable -InputObject $DataImport
Write-DataTable -ServerInstance R2D2 -Database SQLServerRepository -TableName CSV_SQLAuthority -Data $DataTable
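When it finishes, a quick row count confirms the load. This is a hypothetical check, not part of the original post, and it assumes Invoke-Sqlcmd is available (from the sqlps snap-in or the SqlServer module):

Invoke-Sqlcmd -ServerInstance "R2D2" -Database "SQLServerRepository" `
    -Query "SELECT COUNT(*) AS TotalRows FROM dbo.CSV_SQLAuthority"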
Very cool, right? Let’s do it asynchronously and in the background using PowerShell jobs:
# If you want to import them all asynchronously
Start-Job -Name 'ImportingAsynchronously' `
    -InitializationScript {Ipmo Functions -Force -DisableNameChecking} `
    -ScriptBlock {
        $DataImport = Import-Csv -Path (Get-ChildItem "c:\SQLAuthority\*.csv")
        $DataTable = Out-DataTable -InputObject $DataImport
        Write-DataTable -ServerInstance "R2D2" `
                        -Database "SQLServerRepository" `
                        -TableName "CSV_SQLAuthority" `
                        -Data $DataTable
    }
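Because the work now runs in a background job, you do not see the output right away. You can watch its state with Get-Job and pull back any output or errors with Receive-Job (standard cmdlets; this snippet is mine, not from the original post):

# Check the job state, then read its output without discarding it
Get-Job -Name 'ImportingAsynchronously'
Receive-Job -Name 'ImportingAsynchronously' -Keep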
But what if I have large CSV files and I want to import each one asynchronously in its own job? In that case, this is what should be done:
Get-ChildItem"c:\SQLAuthority\*.csv" | % { Start-job-Name"$($_)" ` -InitializationScript {IpmoFunctions-Force-DisableNameChecking} ` -ScriptBlock { $DataImport=Import-Csv-Path$args[0] $DataTable=Out-DataTable-InputObject$DataImport Write-DataTable-ServerInstance"R2D2"` -Database"SQLServerRepository"` -TableName"CSV_SQLAuthority"` -Data$DataTable } -ArgumentList$_.fullname }
How cool is that?
Now let’s do the fun stuff: let’s schedule it as a SQL Server Agent job.
If you are using SQL Server 2012, you can use the PowerShell job step. Otherwise, you need to use a CmdExec job step that calls PowerShell.exe. We will use the second option.
First, create a .ps1 file called ImportCSV.ps1 from the script above and save it in a path. In my case, it is c:\temp\automation. Just add these lines at the end:
Get-ChildItem"c:\SQLAuthority\*.csv" | % { Start-job-Name"$($_)" ` -InitializationScript {IpmoFunctions-Force-DisableNameChecking} ` -ScriptBlock { $DataImport=Import-Csv-Path$args[0] $DataTable=Out-DataTable-InputObject$DataImport Write-DataTable-ServerInstance"R2D2"` -Database"SQLServerRepository"` -TableName"CSV_SQLAuthority"` -Data$DataTable } -ArgumentList$_.fullname } Get-Job | Wait-Job | Out-Null Remove-Job -State Completed
Remember, this trick (waiting for the jobs to finish before the script exits) applies to ALL scripts that use PowerShell jobs with any kind of scheduling tool (SQL Server Agent, Windows Task Scheduler).
Create a job called ImportCSV with a step called Step_ImportCSV, and choose the CmdExec step type.
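The command text of that CmdExec step simply calls PowerShell.exe with the script we saved; a minimal example, assuming the path used above:

powershell.exe -File "c:\temp\automation\ImportCSV.ps1"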
Then you just need to schedule or run it.
I made a short video (with good background music to match), and you can see it at:
That’s it guys. C’mon, join me in the #PowerShellLifeStyle.
Reference: Pinal Dave (https://blog.sqlauthority.com)