Laerte Junior is my very dear friend and a PowerShell expert. At my request, he has agreed to share his PowerShell knowledge with us. Laerte Junior is a SQL Server MVP and, through his technology blog and Simple-Talk articles, an active member of the Microsoft community in Brazil. He is a skilled Principal Database Architect, Developer, and Administrator, specializing in SQL Server and PowerShell programming, with over 8 years of hands-on experience. He holds a degree in Computer Science, has earned a number of certifications (including MCDBA), and is an expert in SQL Server 2000, SQL Server 2005, and SQL Server 2008 technologies. Let us read the blog post in his own words.
I was reading an excellent post from my great friend Pinal about loading data from CSV files into SQL Server, SQL SERVER – Importing CSV File Into Database – SQL in Sixty Seconds #018 – Video, and was honored to write another guest post on SQL Authority about the magic of PowerShell.
The biggest news at TechEd NA this year was PowerShell. Fellows, if you still don't know about it, you had better hurry. Remember that Core Servers are the future for SQL Server, and consequently so is the shell. You don't want to be left out of this, right?
Save the Out-DataTable and Write-DataTable functions in a module and add it to your profile. In my case, the module is called functions.psm1.
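If you have never set up a module before, here is a minimal sketch of that step, assuming you saved the two functions together in a file called functions.psm1 (the c:\temp source path is just an example):

# A minimal sketch of the module setup. Assumes Out-DataTable and Write-DataTable were saved
# together in functions.psm1; the c:\temp source path is just an example.
$ModuleFolder = Join-Path ([Environment]::GetFolderPath("MyDocuments")) "WindowsPowerShell\Modules\Functions"
New-Item -Path $ModuleFolder -ItemType Directory -Force | Out-Null
Copy-Item "c:\temp\functions.psm1" (Join-Path $ModuleFolder "Functions.psm1")

# Add the import to your profile so the functions are always available
Add-Content -Path $PROFILE -Value "Import-Module Functions -DisableNameChecking"

Placing the .psm1 in a folder of the same name under your Documents\WindowsPowerShell\Modules path is what lets you later import it simply by the name Functions.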
To have some data to play with, I created 10 CSV files with the same content. I just dumped the SQL Server Errorlog into a CSV file and created 10 copies of it.
# Just create a CSV with data to import, using the SQL Server Errorlog
[reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
$ServerInstance = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $Env:Computername
$ServerInstance.ReadErrorLog() | Export-Csv -Path "c:\SQLAuthority\ErrorLog.csv" -NoTypeInformation
for ($Count = 1; $Count -le 10; $Count++) {
    Copy-Item "c:\SQLAuthority\ErrorLog.csv" "c:\SQLAuthority\ErrorLog$($Count).csv"
}
Now, in my path c:\SQLAuthority, I have 10 CSV files:
Now it is time to create a table. In my case, the SQL Server is called R2D2, the database is SQLServerRepository, and the table is CSV_SQLAuthority.
CREATE TABLE [dbo].[CSV_SQLAuthority](
    [LogDate] [datetime] NULL,
    [Processinfo] [varchar](20) NULL,
    [Text] [varchar](MAX) NULL
)
Let’s play a little bit.
I want to import all the CSV files from the path into the table synchronously:
# Importing synchronously
$DataImport = Import-Csv -Path (Get-ChildItem "c:\SQLAuthority\*.csv")
$DataTable = Out-DataTable -InputObject $DataImport
Write-DataTable -ServerInstance R2D2 -Database SQLServerRepository -TableName CSV_SQLAuthority -Data $DataTable
Very cool, right? Now let's do it asynchronously and in the background using PowerShell jobs:
# If you want to do it all asynchronously
Start-Job -Name 'ImportingAsynchronously' `
    -InitializationScript {Ipmo Functions -Force -DisableNameChecking} `
    -ScriptBlock { `
        $DataImport = Import-Csv -Path (Get-ChildItem "c:\SQLAuthority\*.csv")
        $DataTable = Out-DataTable -InputObject $DataImport
        Write-DataTable -ServerInstance "R2D2" `
            -Database "SQLServerRepository" `
            -TableName "CSV_SQLAuthority" `
            -Data $DataTable
    }
But what if I have large CSV files and I want to import each one asynchronously? In that case, this is what should be done:
Get-ChildItem"c:\SQLAuthority\*.csv" | % { Start-job-Name"$($_)" ` -InitializationScript {IpmoFunctions-Force-DisableNameChecking} ` -ScriptBlock { $DataImport=Import-Csv-Path$args[0] $DataTable=Out-DataTable-InputObject$DataImport Write-DataTable-ServerInstance"R2D2"` -Database"SQLServerRepository"` -TableName"CSV_SQLAuthority"` -Data$DataTable } -ArgumentList$_.fullname }
How cool is that?
Now for the fun stuff: let's schedule it in a SQL Server Agent job.
If you are using SQL Server 2012, you can use a PowerShell job step. Otherwise, you need to use a CmdExec job step that calls PowerShell.exe. We will use the second option.
First, create a .ps1 file called ImportCSV.ps1 from the script above and save it to a path. In my case, it is c:\temp\automation. Just add these lines at the end:
Get-ChildItem"c:\SQLAuthority\*.csv" | % { Start-job-Name"$($_)" ` -InitializationScript {IpmoFunctions-Force-DisableNameChecking} ` -ScriptBlock { $DataImport=Import-Csv-Path$args[0] $DataTable=Out-DataTable-InputObject$DataImport Write-DataTable-ServerInstance"R2D2"` -Database"SQLServerRepository"` -TableName"CSV_SQLAuthority"` -Data$DataTable } -ArgumentList$_.fullname } Get-Job | Wait-Job | Out-Null Remove-Job -State Completed
Remember, this trick applies to ALL scripts that use PowerShell jobs with any kind of scheduling tool (SQL Server Agent, Windows Task Scheduler).
Create a job called ImportCSV with a step called Step_ImportCSV, and choose the CmdExec type.
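The command text for that CmdExec step is just a call to PowerShell.exe pointing at the script. A minimal sketch, assuming the c:\temp\automation\ImportCSV.ps1 path used above (the -NoProfile and -ExecutionPolicy switches are my additions):

REM Command text for the CmdExec job step; the script path is the example used above
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "c:\temp\automation\ImportCSV.ps1"

Keep in mind that the Functions module must be discoverable by the account running the SQL Server Agent service, since the background jobs import it in their InitializationScript.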
Then you just need to schedule or run it.
I made a short video (with good background music to match), and you can see it at:
That’s it guys. C’mon, join me in the #PowerShellLifeStyle.
Reference: Pinal Dave (https://blog.sqlauthority.com)
10 Comments
My friend, many thanks for the opportunity to spread the word about PowerShell and, of course, to be honored again with a guest post on SQLAuthority. Priceless :)
Hi Laerte Junior,
Thank you for this article.
Regard$
Chirag Satasiya
Thank you!
How do I import data from MS Access to SQL Server 2005, and how can we schedule that job?
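Not something covered in this post, but here is a rough sketch of one approach: read the Access table through OLE DB into a DataTable and push it with the same Write-DataTable function. It assumes the Microsoft.ACE.OLEDB.12.0 provider is installed, and the file, table, and server names are examples only. The resulting .ps1 can then be scheduled with a CmdExec job step exactly as shown above.

# A rough sketch only: read an Access table via OLE DB and reuse Write-DataTable.
# Assumes the Microsoft.ACE.OLEDB.12.0 provider is installed; names and paths are examples.
$ConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=c:\temp\sample.accdb"
$Connection = New-Object System.Data.OleDb.OleDbConnection $ConnectionString
$Adapter = New-Object System.Data.OleDb.OleDbDataAdapter("SELECT * FROM MyAccessTable", $Connection)
$DataTable = New-Object System.Data.DataTable
$Adapter.Fill($DataTable) | Out-Null

# Push it to SQL Server with the same Write-DataTable function used in the post
Write-DataTable -ServerInstance "R2D2" -Database "SQLServerRepository" -TableName "FromAccess" -Data $DataTable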
I'm a big fan of SQL Server and also of your blog.
If there's a question, I try to find the answer here.
Good article. But using Import-CSV imports everything as text/strings and doesn’t maintain any object-types. How did you manage to import it with object types?
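That is true: Import-Csv returns every column as a string. A minimal sketch of one way around it (not from the original post) is to cast the columns with calculated properties before handing the objects to Out-DataTable; the column names below follow the Errorlog example used in the post.

# A sketch only: cast the string columns to the types you want before building the DataTable.
# Column names follow the Errorlog example; adjust them to your own CSV.
$DataImport = Import-Csv -Path "c:\SQLAuthority\ErrorLog.csv" |
    Select-Object @{Name = 'LogDate';     Expression = { [datetime]$_.LogDate }},
                  @{Name = 'ProcessInfo'; Expression = { [string]$_.ProcessInfo }},
                  @{Name = 'Text';        Expression = { [string]$_.Text }}
$DataTable = Out-DataTable -InputObject $DataImport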
Thank you, it helps me a lot.
I have a question: how can I skip a row, or get an error in SQL Server, when the data does not match the column's data type? For example, ID is nchar(2) in SQL Server and my CSV file has a row where the ID is '1234'. How can I get an error sign or error report for that data in SQL Server? It is OK if all the errors go to a separate Excel file; I just want to know which data caused an error, and I want the script to keep running even when there is an error.
Some questions :) Please reply if you have some time. Thank you!
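A minimal sketch of one way to do that (assuming a CSV with an ID column and the nchar(2) limit from the question; file and table names are examples): validate the rows in PowerShell first, export the offenders to their own file, and bulk insert only the rows that fit.

# A sketch only: split good and bad rows before the bulk insert so the script never stops.
# The ID column, the 2-character limit, and the file/table names are examples from the question.
$Rows     = Import-Csv -Path "c:\SQLAuthority\Data.csv"
$BadRows  = $Rows | Where-Object { $_.ID.Length -gt 2 }
$GoodRows = $Rows | Where-Object { $_.ID.Length -le 2 }

# Error report with the offending rows
$BadRows | Export-Csv -Path "c:\SQLAuthority\Errors.csv" -NoTypeInformation

# Load only the rows that fit the nchar(2) column
if ($GoodRows) {
    $DataTable = Out-DataTable -InputObject $GoodRows
    Write-DataTable -ServerInstance "R2D2" -Database "SQLServerRepository" -TableName "MyTable" -Data $DataTable
}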
When we start to run it asynchronously, I get an error on the following line.
-InitializationScript {Ipmo Functions -Force -DisableNameChecking} `
The error is saying that it doesn’t recognize the module called “Functions”, but this is the code you use to import from a ScriptBlock. I’m not clear as to why we’re specifying a keyword of “Functions”, but it doesn’t cause an error in your video and you use it all over.
I tried on v2 (2003 Server 32-bit) and v4 (Win7 32-bit) with the same results.
Error:
Import-Module : The specified module ‘Functions’ was not loaded because no valid module file was found in any module directory.
At line:1 char:5
+ Ipmo <<<< Functions -Force -DisableNameChecking
+ CategoryInfo : ResourceUnavailable: (Functions:String) [Import-Module], FileNotFoundException
+ FullyQualifiedErrorId : Modules_ModuleNotFound,Microsoft.PowerShell.Commands.ImportModuleCommand
Full code:
Start-Job -Name 'ImportingAsynchronously' `
    -InitializationScript {Ipmo Functions -Force -DisableNameChecking} `
    -ScriptBlock { `
        $DataImport = Import-Csv -Path (Get-ChildItem "c:\SQLAuthority\*.csv")
        $DataTable = Out-DataTable -InputObject $DataImport
        Write-DataTable -ServerInstance "Development02" `
            -Database "Reporting" `
            -TableName "CSV_SQLAuthority" `
            -Data $DataTable
    }
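For what it is worth, "Functions" there is just the name of the module described at the top of the post (functions.psm1 containing Out-DataTable and Write-DataTable). Because -InitializationScript runs in a separate process, that module has to live under a folder on $env:PSModulePath for the account running the job, or you can import it by full path instead; a sketch, assuming the .psm1 sits in c:\temp\automation:

# A sketch only: import the module by full path inside the InitializationScript,
# so the background job finds it even if it is not on $env:PSModulePath.
Start-Job -Name 'ImportingAsynchronously' `
    -InitializationScript { Import-Module "c:\temp\automation\Functions.psm1" -Force -DisableNameChecking } `
    -ScriptBlock {
        $DataImport = Import-Csv -Path (Get-ChildItem "c:\SQLAuthority\*.csv")
        $DataTable = Out-DataTable -InputObject $DataImport
        Write-DataTable -ServerInstance "Development02" -Database "Reporting" -TableName "CSV_SQLAuthority" -Data $DataTable
    }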
What is Write-DataTable?
https://gallery.technet.microsoft.com/scriptcenter/2fdeaf8d-b164-411c-9483-99413d6053ae