This is a very common request lately: How do I import a CSV file into SQL Server? How do I load a CSV file into a SQL Server database table? How do I load a comma-delimited file into SQL Server? Let us see the solution in quick steps.
CSV stands for Comma Separated Values, sometimes also called Comma Delimited Values.
Create TestTable
USE TestData
GO
CREATE TABLE CSVTest
(ID INT,
FirstName VARCHAR(40),
LastName VARCHAR(40),
BirthDate SMALLDATETIME)
GO
Create a CSV file on drive C: with the name csvtest.txt and the following content. The location of the file is C:\csvtest.txt
1,James,Smith,19750101
2,Meggie,Smith,19790122
3,Robert,Smith,20071101
4,Alex,Smith,20040202

Now run the following script to load all the data from the CSV file into the database table. If any row contains an error, that row will not be inserted, but the other rows will be.
BULK
INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
--Check the content of the table.
SELECT *
FROM CSVTest
GO
--Drop the table to clean up database.
DROP TABLE CSVTest
GO

Reference : Pinal Dave (https://blog.sqlauthority.com)




839 Comments
Hey there, thanks for the great article.
I have one question: is it possible to read a CSV file line by line without using bulk import?
It's just more convenient in my situation, so please let me know if it's a possible option.
Thank you
Can you give us more information on what you are trying to do?
Please help me with this…
I have one line in the .csv file as…
863807129312681,G052360310001
I want to do a bulk insert as…
Column1: 86380 71293 12681
Column2: ;
Column3: G052360310001
Please help me…
Do you mean to skip column 2?
Refer to this post.
I want to get data from a CSV file into a SQL database. Suppose the CSV file changes, i.e. updates are made to the CSV file every 4 or 8 hours. Then I have to get those updates reflected in my SQL database as well. How is that possible?
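One approach, sketched here as an assumption rather than taken from the post: reload the table on a schedule, for example from a SQL Server Agent job that runs every 4 or 8 hours, truncating it first so the refreshed file replaces the old contents.
-- Hypothetical refresh script; assumes a full reload of CSVTest is acceptable.
TRUNCATE TABLE CSVTest
GO
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'      -- the file that is rewritten every few hours
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO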
You never answered Scott's question above. Thoughts?
His question:
A CSV file can contain commas inside a field as well as using the comma ',' as the delimiter. How does this bulk import handle those circumstances? Does it handle all the CSV standards?
@vonPryz: See Example Below (don’t just replace quotes)
Example CSV lines with format (Name,Address):
Scott,123 Main St
Mike,"456 2nd St, Apt 5"
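For readers on newer versions: SQL Server 2017 and later add CSV-specific options to BULK INSERT that understand quoted fields like Mike's address, so embedded commas do not break the columns. A minimal sketch (file path and quote character are assumptions):
BULK INSERT CSVTest
FROM 'c:\csvtest.csv'
WITH
(
FORMAT = 'CSV',          -- parse the file using CSV quoting rules (SQL Server 2017+)
FIELDQUOTE = '"',        -- fields wrapped in double quotes may contain commas
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
On older versions the usual workaround is the one mentioned further down in the comments: re-export with a delimiter that never appears in the data, such as a tab.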
I am using SQL Express Edition to run the code, but I am getting this error:
Msg 102, Level 15, State 1, Line 3
Incorrect syntax near ‘‘’.
Msg 319, Level 15, State 1, Line 4
Incorrect syntax near the keyword ‘with’. If this statement is a common table expression or an xmlnamespaces clause, the previous statement must be terminated with a semicolon.
Please help me to resolve the error…
Thanks in advance…
It is because single quotes are represented differently on this site. Change them to regular single quotes and try again.
What happens to the rows that fail? I would like to save those off to allow the user to fix them…
You can make use of the method that is explained in this link.
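For completeness, BULK INSERT itself has an ERRORFILE option that copies the rejected rows to a separate file so they can be fixed and re-loaded. A minimal sketch (file paths are placeholders, and the error file must not already exist):
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
ERRORFILE = 'c:\csvtest_errors.txt',   -- failed rows are written here
MAXERRORS = 50                         -- keep loading until 50 rows have failed
)
GO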
To the other Scott:
I have fields with commas as well. Convert the CSV (using Excel) to a tab-delimited text file and then use the '\t' delimiter instead.
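The only change from the original script is the field terminator. A sketch of the tab-delimited variant (file name assumed):
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = '\t',   -- tab instead of comma
ROWTERMINATOR = '\n'
)
GO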
To Abdul:
I have used Excel to break longer fixed-length columns into smaller columns. I believe this will help you. Open the file in Excel, add two columns after the first column, and use the Data -> Text to Columns option. You can break the data into additional columns and then save the file back out to a CSV.
I have one CSV file, and in that file the data looks like this:
CATEGORY,10,10,10,10
ERP Code,Creation date (DD/MM/YYYY),NoProgram,Contract number,Contract position
920,4/8/2010,971156,12914,1
LINES,10,10,10,10
Date (DD/MM/YYYY),Quantity,Horizon,10,10
16/09/2010,20,Frozen,10,10
28/10/2010,20,Flexible,10,10
23/12/2010,11,Flexible,10,10
27/01/2011,13,Flexible,10,10
24/02/2011,14,Flexible,10,10
24/03/2011,19,Flexible,10,10
28/04/2011,13,Flexible,10,10
26/05/2011,13,Flexible,10,10
23/06/2011,16,Flexible,10,10
28/07/2011,23,Flexible,10,10
1/8/2011,13,Forecast,10,10
29/08/2011,7,Forecast,10,10
24/10/2011,19,Forecast,10,10
21/11/2011,14,Forecast,10,10
19/12/2011,13,Forecast,10,10
16/01/2012,18,Forecast,10,10
13/02/2012,12,Forecast,10,10
12/3/2012,12,Forecast,10,10
9/4/2012,19,Forecast,10,10
7/5/2012,16,Forecast,10,10
4/6/2012,14,Forecast,10,10
2/7/2012,17,Forecast,10,10
30/07/2012,12,Forecast,10,10
24/09/2012,8,Forecast,10,10
Please give a solution to insert data like this into one SQL table.
I tried using bulk insert, but everything ends up in one column; I need it field by field.
Please help me.
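Since this file mixes header, category, and detail lines, a plain BULK INSERT into the final table will not work. One possible approach, sketched under assumed table and file names: load each whole line into a one-column staging table, then filter and split the lines you need with string functions.
-- Staging table: one column wide enough for a whole line.
CREATE TABLE CSVStage (Line VARCHAR(1000))
GO
BULK INSERT CSVStage
FROM 'c:\contract.csv'        -- hypothetical file name
WITH
(
ROWTERMINATOR = '\n'          -- no FIELDTERMINATOR, so each line lands in one column
)
GO
-- Keep only the detail lines; splitting them into columns can then be done
-- with SUBSTRING/CHARINDEX or a split function.
SELECT Line
FROM CSVStage
WHERE Line LIKE '%,Frozen,%'
OR Line LIKE '%,Flexible,%'
OR Line LIKE '%,Forecast,%'
GO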
I have over 1,200 separate csv files to import and have two issues where I could use some help.
First, all of the files contain date fields but they are in the standard MM/DD/YYYY format. Is there a way to import them and then change them to an acceptable format for SQL Server?
Next, each file name is unique and I have created a temporary table that has the directory and file name. Is it possible to nest a query something like this:
BULK INSERT MyTable
FROM (Select FileName from Files;)
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
As an addendum, I discovered that some columns are in scientific notation, so I need to change all numeric columns in the CSVs to be numeric with 6 decimal places.
Did it. I used a combination of tools:
1. Lots of brute force – over 20 hours formatting, saving and converting the data. There has to be a more elegant solution but I couldn’t find it.
2. Excel 2007 because it can open a csv file that has over 200,000 rows. I loaded 25 files at a time into a worksheet (max of computer resources), reformatted the columns in bulk and used a nifty VBA script to write each worksheet out to a separate csv with the name of the worksheet.
3. Found a great procedure that uses XP_CMDSHELL and the BCP utility in SQL 2005 that loads all files located in a specified directory. Loaded over 1,300 files in less than 30 minutes.
Now daily maintenance is loading one file a day.
D.
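For anyone else who keeps the file names in a table: BULK INSERT will not accept a subquery in the FROM clause, but a dynamic-SQL loop over that table gives the same effect. A minimal sketch, assuming a Files table whose FileName column holds full paths:
DECLARE @file VARCHAR(260), @sql NVARCHAR(MAX)
DECLARE file_cursor CURSOR FOR
SELECT FileName FROM Files        -- assumed table of full file paths
OPEN file_cursor
FETCH NEXT FROM file_cursor INTO @file
WHILE @@FETCH_STATUS = 0
BEGIN
-- Build and run one BULK INSERT statement per file.
SET @sql = N'BULK INSERT MyTable FROM ''' + @file +
N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
EXEC sp_executesql @sql
FETCH NEXT FROM file_cursor INTO @file
END
CLOSE file_cursor
DEALLOCATE file_cursor
GO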
I'm getting the following error, can you help?
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.
Check the length of the first column of the table.
You need to increase it, as the incoming data is longer than the column allows.
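For example, if the first column were declared too small for the incoming values, it could be widened before re-running the import (column name and size here are only placeholders):
-- Hypothetical fix: widen an undersized column, then run the BULK INSERT again.
ALTER TABLE CSVTest ALTER COLUMN FirstName VARCHAR(100)
GO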
Hi,
Does BULK INSERTing into an existing table that has an index on it take longer than inserting into an existing table without an index?
Cheers
If the table has a clustered index, it may take more time to import.
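When the slowdown comes from nonclustered indexes, one common trick is to disable them for the duration of the load and rebuild them afterwards. A sketch, assuming a nonclustered index named IX_CSVTest_LastName (do not disable a clustered index, as that makes the table inaccessible):
ALTER INDEX IX_CSVTest_LastName ON CSVTest DISABLE   -- skip index maintenance during the load
GO
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
ALTER INDEX IX_CSVTest_LastName ON CSVTest REBUILD   -- bring the index back afterwards
GO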
Hi Pinal,
I was wondering how I would insert thousands of rows from an Excel sheet.
I converted the file to CSV (comma-separated file) and then, using your bulk import statement, I created all of my rows in my table.
This was really wonderful
Thanks,
Keep it up
Happy Coding!!!
Puneet
I’m doing exactly what is said above but when I try the
BULK
INSERT CSVTest
FROM ‘c:\csvtest.txt’
WITH
(
FIELDTERMINATOR = ‘,’,
ROWTERMINATOR = ‘\n’
)
GO
I'm receiving the following error. I was wondering if someone could help me with this, or is it just some tiny error that I can't see? Help would be really appreciated!
Msg 102, Level 15, State 1, Line 3
Incorrect syntax near ‘‘’.
Msg 319, Level 15, State 1, Line 4
Incorrect syntax near the keyword ‘with’. If this statement is a common table expression or an xmlnamespaces clause, the previous statement must be terminated with a semicolon.
It is because single quotes are represented differently on this blog. Change them to regular single quotes and try again.
Hi,
I was using BULK INSERT to import a CSV file into a database table (following the original example by Pinal Dave at the beginning of this thread), but got errors. I have granted "Everyone" full control of this file.
Thanks so much,
Sheldon
============================================
BULK INSERT dbo.CSVTest
FROM 'c:\csvtest.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
Msg 4860, Level 16, State 1, Line 1
Cannot bulk load. The file “c:\csvtest.csv” does not exist.
Please try to change the data type of the destination column.
regards,
Ahmad Elayyan
The file should exist on the server, not on your own machine.
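If the file lives on your own machine rather than on the SQL Server box, one option is to share the folder and point BULK INSERT at a UNC path that the SQL Server service account is allowed to read. Server and share names below are placeholders:
BULK INSERT dbo.CSVTest
FROM '\\MyWorkstation\Shared\csvtest.csv'   -- UNC path visible to the SQL Server service account
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO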
I have a text file that is not comma or tab delimited. The data needs to be separated using fixed-width counts, i.e. characters 1-3 go into column 1, characters 4-12 go into column 2, and so on. Can this method be adapted to handle positional files?
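One way to adapt the method, sketched under the assumption that every line has the same layout: bulk insert each whole line into a single wide column, then cut it apart by position with SUBSTRING.
-- Staging table with one column wide enough for a whole line (width is an assumption).
CREATE TABLE FixedWidthStage (Line VARCHAR(400))
GO
BULK INSERT FixedWidthStage
FROM 'c:\fixedwidth.txt'      -- hypothetical file name
WITH
(
ROWTERMINATOR = '\n'          -- no field terminator, so the whole line is one value
)
GO
-- Slice each line by position: characters 1-3, then 4-12, and so on.
SELECT
SUBSTRING(Line, 1, 3) AS Column1,
SUBSTRING(Line, 4, 9) AS Column2
FROM FixedWidthStage
GO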
Hi,
I just want to say thanks; this code really helped me a lot.
I also want to know how to import a .csv file through the import command.
I ran the code in SQL 2000 and it worked fine. During cut and paste it had a problem with the quote characters; I had to re-enter them from my keyboard. I would also like to suggest, for those for whom it doesn't run, to name the extension .csv rather than .txt for SQL 2000.
Thanks it works!
Also, I just wanted to point out that the query uses " ` " instead of " ' "; shouldn't you change it?
Hi Pinal
Please help me
I am using SQL Server 2000 and I want to do a similar operation.
I have my file at C:\Test.txt, and I have created the table in my DB.
BULK INSERT CSVTest
FROM 'c:\Test.txt'
WITH
(
FIELDTERMINATOR = '\t',
ROWTERMINATOR = ‘\n’
)
GO
When I run this, I am getting an error saying:
Cannot bulk load. The file "c:\Test.txt" does not exist.
In fact, I do have the file at the given path.
Thanks,
As I replied to the other person, the file should exist on the server, not on your own machine.