This is a very common request recently: How do I import a CSV file into SQL Server? How do I load a CSV file into a SQL Server database table? How do I load a comma-delimited file into SQL Server? Let us see the solution in quick steps.
CSV stands for Comma Separated Values, sometimes also called Comma Delimited Values.
Create TestTable
USE TestData
GO
CREATE TABLE CSVTest
(ID INT,
FirstName VARCHAR(40),
LastName VARCHAR(40),
BirthDate SMALLDATETIME)
GO
Create a CSV file in drive C: with the name csvtest.txt and the following content. The location of the file is C:\csvtest.txt
1,James,Smith,19750101
2,Meggie,Smith,19790122
3,Robert,Smith,20071101
4,Alex,Smith,20040202

Now run the following script to load all the data from the CSV file into the database table. If any row contains an error, that row will not be inserted, but the other rows will be.
BULK
INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
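BULK INSERT tolerates a limited number of bad rows before it aborts the load. If you want to control that limit or capture the rejected rows for review, the MAXERRORS and ERRORFILE options can be added. A minimal sketch follows; the error-file path c:\csvtest_errors.txt is only an example and must be writable by the SQL Server service account.
BULK
INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
MAXERRORS = 10, -- allow up to 10 bad rows before the load is aborted
ERRORFILE = 'c:\csvtest_errors.txt' -- rejected rows are written here for review
)
GO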
--Check the content of the table.
SELECT *
FROM CSVTest
GO
--Drop the table to clean up database.
DROP TABLE CSVTest
GO

Reference : Pinal Dave (https://blog.sqlauthority.com)




839 Comments
Good one.
It solves my problem.
Keep it up, Pinal.
Can I insert the current datetime when importing from CSV?
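One way to do this (a sketch, not part of the original post): add a datetime column with a GETDATE() default and bulk insert through a view that exposes only the columns present in the CSV file, so the default fills in the load time. The names LoadDate and CSVTestLoad are hypothetical.
ALTER TABLE CSVTest ADD LoadDate DATETIME NOT NULL DEFAULT GETDATE()
GO
CREATE VIEW CSVTestLoad
AS
SELECT ID, FirstName, LastName, BirthDate -- only the columns that exist in the file
FROM CSVTest
GO
BULK INSERT CSVTestLoad
FROM 'c:\csvtest.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
GO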
Dear Sir / All,
I want to insert multiple text files into SQL Server from a single folder (D:\ICEGATE\NewFolder\) WITH (ROWTERMINATOR = ''); my DB is ALL_VALUE and my table is ALL_VAL.
I also have a similar concern. Daily files are dropped into an FTP location. Each is in CSV format and has a file name that is set incrementally based on the date and other constraints. Each file has exactly the same column headings for the row data that follows. Automating this so that the 15 or 20 daily files received each day, each with a unique name, can be loaded into the SQL Server table in one operation rather than one at a time would be a huge time saver. Any thoughts or solutions are greatly appreciated!
Hi Bruzer,
I have the same problem; did you find a workaround?
Thanks in advance :)
Hey! Have you found any solution to it?
SSIS
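Besides SSIS, a pure T-SQL approach is possible. Here is a rough sketch that loops over the files in the folder from the question above and bulk inserts each one; it assumes xp_cmdshell is enabled, that the SQL Server service account can read the folder, and that the target table dbo.ALL_VAL matches the file layout (adjust the terminators to your files).
DECLARE @files TABLE (id INT IDENTITY(1,1), fname NVARCHAR(260))
INSERT INTO @files (fname)
EXEC master..xp_cmdshell 'dir /b "D:\ICEGATE\NewFolder\*.txt"'

DECLARE @i INT = 1, @sql NVARCHAR(MAX)
WHILE @i <= (SELECT MAX(id) FROM @files)
BEGIN
SELECT @sql = N'BULK INSERT dbo.ALL_VAL FROM ''D:\ICEGATE\NewFolder\' + fname
+ N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
FROM @files
WHERE id = @i AND fname IS NOT NULL -- xp_cmdshell output ends with a NULL row

IF @sql IS NOT NULL EXEC sp_executesql @sql
SET @sql = NULL
SET @i = @i + 1
END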
Can anybody help me
export an Excel sheet to a SQL database (remote server)?
INSERT INTO tbL_excel
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=\\pc31\C\Testexcel.xls',
'SELECT * FROM [Sheet1$]')
Rajesh, you have to use the Import/Export Wizard.
Hi Rajesh,
perform the following steps:
1. Open your Excel file.
2. Save the file as [name].csv.
3. Execute the SQL statement given by Pinal.
If anyone can help, please send it to my mail.
Dear Rajesh,
INSERT INTO tbL_excel
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=\\pc31\C\Testexcel.xls',
'SELECT * FROM [Sheet1$]')
The query you posted above has a small mistake; the query follows:
SELECT * INTO TBL_EXCEL
OPENROWSET('MSDASQL', 'DRIVER ={Microsoft Excel Driver (*.xls)}', 'DBQ = PATH_NAME',
'SELECT * FROM [Sheet1$]')
Hi Vishnu,
Thanks for the updated query.
But it has a small mistake and gives the error message 'Invalid syntax near OPENROWSET',
so I used a combination of both of the above queries, as follows:
SELECT * INTO TBL_EXCEL
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=C:\Source\Addresses.xls',
'SELECT * FROM [Sheet1$]')
and it works perfectly.
Thanks and Regards
Try changing "\\pc31\C" to the drive and folder of your server location.
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'Microsoft.Jet.OLEDB.4.0' reported an error. The provider did not give any information about the error.
OLE DB error trace [OLE/DB Provider 'Microsoft.Jet.OLEDB.4.0' IDBInitialize::Initialize returned 0x80004005: The provider did not give any information about the error.].
INSERT INTO dbo.ImportTest
SELECT * FROM OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
'Data Source=C:\ImportData.xls;Extended Properties=Excel 8.0')...[Sheet1$]
--Drop the table to clean up database.
SELECT *
FROM CSVTest
GO
Of course, it won’t.
Server: Msg 4860, Level 16, State 1, Line 1
Could not bulk insert. File 'c:\DAM.txt' does not exist.
(0 row(s) affected)
I am getting this error. I have tried the same query as given by Pinal, so please guide me.
You will need a UNC path instead of "C:\".
Make sure that the file exists in the server's directory.
Hi... where is the server's directory on the PC?
I have the following error when testing this: cannot bulk load because the file CL\Users\Public\Downloads\CSVTest.Txt could not be opened. Operating system error code 3 (the system cannot find the path specified).
Anyone run into this?
.txt: there's your issue!
What if the data is like this:
1,James,Smith,19750101,"dsf,sdf,sdf,"
2,Meggie,Smith,19790122,"dsf,sdf,sdf,"
3,Robert,Smith,20071101,"dsf,sdf,sdf,"
4,Alex,Smith,20040202,"dsf,sdf,sdf,"
Thanks for your great share.
How about CSV files that have double quotes? It seems like a format file will be needed when using BULK INSERT. I would like to see if you have another approach.
bulk insert
from
with
(
FIELDTERMINATOR = '","',
ROWTERMINATOR = '"'
)
Also make sure you don't have an extra carriage return at the end of the file.
Since I'm replying to a post that is more than 3 years old, somebody might benefit from this reply.
Regards,
Bhuvana
Yes this helped. Thanks for posting.
Not all of my fields have double quotes around them, only the text columns.
Any idea for the solution to this?
Many thanks,
Not all of my CSV cells are surrounded by double quotes, only the string columns. Is there another way to do this?
Not all of my CSV cells are surrounded by double quotes, only the string columns; the integer columns aren't delimited, so this method doesn't handle them. Is there a way around this?
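For anyone reading this on a newer version: SQL Server 2017 and later can parse quoted CSV fields natively, which avoids the find-and-replace workarounds. A minimal sketch reusing the table and file from the post:
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FORMAT = 'CSV', -- RFC 4180 style CSV parsing (SQL Server 2017 and later)
FIELDQUOTE = '"', -- quote character used around text fields
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO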
My field terminator is '~' and my row terminator is 'n'. What would my query look like?
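A minimal sketch for that case, assuming the table and file from the post and that 'n' means the newline character:
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = '~',
ROWTERMINATOR = '\n'
)
GO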
Hi all,
I am getting the following error on the bulk insert of the .csv file.
The bulk load failed. The column is too long in the data file for row 1, column 25. Verify that the field terminator and row terminator are specified correctly.
Please help me.
Dnyanesh
It means that the defined length of the column is less than the data available in the CSV file. Increase the size and try again.
Hello Madhivana,
I have the same issue. My code looks like:
CREATE TABLE CSVTest
(msisdn INT,
field2 varchar(40),
fild3 varchar(40),
fild4 varchar(40),
tariff VARCHAR(100),
fild6 varchar(40),
fild7 varchar(40),
fild8 date,
fild9 date,
fild10 varchar(100),
fild11 date,
fild12 date,
service_type nVARCHAR(3000))
GO
BULK
INSERT CSVTest
FROM 'd:\Base_Client_08052016.csv'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n'
)
GO
The error messages are:
Msg 4866, Level 16, State 1, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 13. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Any help is appreciated.
What would be the process to import a CSV file into existing MS SQL database tables? I have a CSV file that contains data which is comma delimited that I need to import into a SQL production database which is being used by another application. The CSV file contains a list of defects that basically needs to be imported into the Defects table in our SQL database.
Thanks in advance for all of your help. This is a really good website for obtaining SQL information.
You have to create a view, then bulk insert into the view. You can leave the auto-increment (identity) field out of the view and it will still be populated for each row, or you can include just the fields you plan on inserting. For example:
USE [DatabaseName]
-- See if the view exists within the database
IF EXISTS ( SELECT name
FROM sys.views
WHERE name = 'YourViewName'
) DROP VIEW YourViewName
GO
CREATE VIEW YourViewName
AS
SELECT Field1, Field2, Field3
FROM PriorTableThatExists
GO
BULK INSERT YourViewName
FROM 'C:\YourFilePath'
WITH ( FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
I had some problems with bulk inserts too. The data set to insert was somewhat large, around a few hundred megabytes, which resulted in some two million rows. Not all of the data was required, so some massaging was needed.
So I wrote a smallish C#/.NET program that reads the input file in chunks of 25,000 rows, picks the appropriate rows with regular expression matching, and sends the results to the database.
String manipulation is so much easier in just about any general-purpose programming language.
Key components: DataTable for data-to-be-inserted and SqlBulkCopy to do the actual copying.
@Dnyanesh: Your input data likely does not have the correct column/row separator characters.
I'd guess you have a line break as the row separator. If that is the case, the problem is that Windows often uses CR LF (0x0d 0x0a), while other systems use only CR or LF. Check your separator characters. Use a hex editor to be sure.
@Branden: Just use sed to change quote characters :-)
@Dragan: Use Import/Export wizard or write a program to do inserting like I did.
A CSV file can contain commas inside a field as well as using the comma as the delimiter. How does this bulk import handle those circumstances? Does it handle all the CSV conventions?
@vonPryz: See the example below (don't just replace quotes).
Example CSV lines with format (Name,Address):
Scott,123 Main St
Mike,"456 2nd St, Apt 5"
More on CSV files can be found here
https://en.wikipedia.org/wiki/Comma-separated_values
Hi Scott, did you get an answer to this question? I have a similar situation where some of the data has " " around it while other data does not.
You can change your delimiter to a pipe (|). Then change your SQL to use '|' as your FIELDTERMINATOR.
1. In Notepad, I did a "find and replace all" for "," (double quote, comma, double quote) to be replaced with a pipe. Then I did a "find and replace all" for " (double quote) to be replaced with nothing.
2. Update your SQL to read:
BULK
INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
GO
Another thing:
When replacing the delimiters in your text file:
1. Start out with a "find and replace all" for "," (double quote, comma, double quote) to be replaced with a pipe.
2. Then remove all of the beginning and ending " (double quote) characters with a "find and replace all" for " (double quote) to be replaced with nothing.
3. Then, to make sure you have accounted for empty columns in the text file:
i. do a "find and replace all" for ,,, (comma comma comma) to be replaced by ||| (pipe pipe pipe)
ii. next do a "find and replace all" for ,, (comma comma) to be replaced by || (pipe pipe)
Don't ever try to do a "find and replace all" for a single comma, or else you may be replacing comma characters within columns.
You should be able to simply use Excel to replace the delimiters and remove the double quotes, then re-save the file. But this is the old-school methodology I've been doing for years.
ANOTHER PROBLEM: HELP
Hey everyone... can anyone help me with this? I read a txt file with the BULK syntax, but after that I found another problem: my txt file has both header and detail rows in it, so my problem now is how to put the header rows into table1 and the detail rows into table2. I created just one field (field1) to hold all my rows (header and detail), with FIELDTERMINATOR = '|', but the header has seven fields and the detail has eleven fields, so I cannot create just one table with the exact fields for both the detail and the header. This is an example of the txt file:
HR|001580/06|RG|11/01/2013 12:00|BG|3573|001580
IT|001580/01|1|00147066||1200|852.3|830.3|1.35|UNIDAD|0|31/12/2014 00:00
IT|001580/02|1|00147066||200|852.3|830.3|1.35|UNIDAD|1|31/12/2014 00:00
IT|001580/03|1|00147066||100|852.3|830.3|1.35|UNIDAD|55|31/12/2014 00:00
IT|001580/04|2|00254276||200|852.3|830.3|1.35|UNIDAD|0|31/12/2014 00:00
IT|001580/05|3|00305359||1700|852.3|830.3|1.35|UNIDAD|0|31/12/2014 00:00
IT|001580/06|3|00305359||300|852.3|830.3|1.35|UNIDAD|1|31/12/2014 00:00
And this is the query I have:
CREATE TABLE #temp
(
campo1 VARCHAR(max)--,
--campo2 VARCHAR(max),
--campo3 VARCHAR(max),
--campo4 VARCHAR(max),
--campo5 VARCHAR(max),
--campo6 VARCHAR(max),
--campo7 VARCHAR(max)
)
BULK INSERT #temp
FROM ''
WITH
(
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n'
)
This first query is for the header:
SELECT *
FROM #temp
WHERE SUBSTRING(campo1,1,2) = 'HR'
This second query is for the detail:
SELECT *
FROM #temp
WHERE SUBSTRING(campo1,1,2) = 'IT'
So then comes my problem: how can I separate the fields for the header and the detail?
Thanks to all in this forum.
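One way to finish that off (a sketch with hypothetical column names): load each whole line into the single-column staging table using only ROWTERMINATOR = '\n' (no FIELDTERMINATOR), then split each line on '|' by converting it to XML and reading the values by position. Note that values containing XML special characters such as & or < would need to be escaped first.
;WITH header_rows AS
(
SELECT CAST('<c>' + REPLACE(campo1, '|', '</c><c>') + '</c>' AS XML) AS x
FROM #temp
WHERE SUBSTRING(campo1, 1, 2) = 'HR'
)
SELECT
x.value('(/c)[1]', 'VARCHAR(50)') AS record_type,
x.value('(/c)[2]', 'VARCHAR(50)') AS document_no,
x.value('(/c)[3]', 'VARCHAR(50)') AS doc_type,
x.value('(/c)[4]', 'VARCHAR(50)') AS doc_date,
x.value('(/c)[5]', 'VARCHAR(50)') AS field5,
x.value('(/c)[6]', 'VARCHAR(50)') AS field6,
x.value('(/c)[7]', 'VARCHAR(50)') AS field7
-- INTO dbo.Header -- uncomment to write the result into a header table
FROM header_rows
The same pattern with the 'IT' filter and as many positions as the detail rows have produces the rows for the second table.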
I have used the BULK INSERT command to load a CSV file into an existing SQL Server table. It executes successfully without giving any error, but my only concern is that it is inserting only every alternate record.
BULK INSERT EDFE_Trg.dbo.[tblCCodeMast]
FROM 'c:\temp\Catalogue.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
This is my code. Please help me with this.
Thanks in advance,
kumar
How did you know it is inserting alternate records?
Run this before and after the import to see if the record count is correct:
select count(*) from table
Thank you very much for this;
it really helped me.
regards
Hi,
I need to import a significant amount of latitude and longitude data into one of my SQL tables (running SQL 2005).
I tried the above, but got "you do not have permission to use the bulk load statement".
I am logged in as administrator... any thoughts?
Martin
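In case it helps someone else hitting the same message, the permission that error refers to can be granted like this (a sketch; [MartinLogin] is a hypothetical login name):
GRANT ADMINISTER BULK OPERATIONS TO [MartinLogin]
GO
-- or, equivalently, add the login to the bulkadmin fixed server role
EXEC sp_addsrvrolemember 'MartinLogin', 'bulkadmin'
GO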
OK... solved that issue... logged in using Windows Authentication, and gave my SQL admin account the rights for BULK INSERT.
The SQL code now runs with no errors... and if I do a SELECT * from the table, it shows the field headings, but no data... 0 rows affected.
Any thoughts? This one has me stumped... no error message leaves me nowhere to look.
Cheers
Did the file have data? Also, did you get any errors/warnings during the bulk insert?
Hi Pinal
Please help me
I am using SQL Server 2005 and I want to do a similar operation.
I have my file at C:\Test.txt. I have also created the table in my DB.
BULK INSERT CSVTest
FROM 'c:\Test.txt'
WITH
(
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n'
)
GO
When I run this, I am getting an error saying:
Cannot bulk load. The file "c:\Test.txt" does not exist.
In fact, the file does exist at the given path.
Thanks,
It might be that this only works when the database is local to the system that contains the text file.
Otherwise the path should be a UNC path like:
'\\<server name>\XXX\csv.txt'
instead of 'c:\Test.txt'
Correct. Actually, the same thing happened with me, so I copied my file to the server location.
Then it worked.
Make sure the given path and file name are correct. I got the same error when I ran that query; then I changed the file name to match the one in the script (csvtest.txt)
and stored it in the same path.
@Siva
SQL Server 2005 must have access rights to the file ("c:\Test.txt"). Make sure the file is accessible to the database service account.
How do I import data from an Excel sheet using the "BULK INSERT" command?
You need the OPENROWSET function.
Refer to this post.
Will this method work for SQL Server Compact Edition?
Pinal, excellent job, congrats.
Please help me with this:
I have one line in a .csv file as follows:
863807129312681,G052360310001
I want to do the bulk insert as:
Column1: 86380 71293 12681
Column2: ;
Column3: G052360310001
please help me…..
Great stuff, the code is perfect, it helped me a lot.
Hi Dave,
I want to iterate over a collection of records without using cursors. How can I achieve this?
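One common pattern (a sketch, assuming a hypothetical table dbo.SomeTable with a unique ID column) is to walk the keys with a WHILE loop instead of a cursor. If the per-row work can be expressed as a single set-based UPDATE or INSERT ... SELECT, that is usually faster than any row-by-row loop.
DECLARE @id INT

SELECT @id = MIN(ID) FROM dbo.SomeTable

WHILE @id IS NOT NULL
BEGIN
-- do the per-row work here
PRINT 'Processing row ' + CAST(@id AS VARCHAR(10))

SELECT @id = MIN(ID) FROM dbo.SomeTable WHERE ID > @id
END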
Hi,
What if my table has 4 columns and the CSV file has 3 columns, and I want the last column of the table to be set from a variable?
Refer to this post; it describes exactly what to do when you are in such a situation.
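A rough sketch of that technique for this case, assuming a hypothetical table dbo.Defects4 with columns Col1, Col2, Col3, Col4 where the CSV supplies only the first three: bulk insert through a view that hides Col4, then set Col4 from the variable afterwards.
CREATE VIEW DefectsLoad
AS
SELECT Col1, Col2, Col3 -- only the columns present in the CSV file
FROM dbo.Defects4
GO
DECLARE @extra VARCHAR(50) = 'SomeValue' -- the "variable" value for the fourth column

BULK INSERT DefectsLoad
FROM 'c:\defects.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

UPDATE dbo.Defects4 SET Col4 = @extra WHERE Col4 IS NULL -- assumes older rows already have Col4 filled in
GO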
THANK YOU!