This is a very common request recently: How do I import a CSV file into SQL Server? How do I load a CSV (comma-delimited) file into a SQL Server database table? Let us see the solution in quick steps.
CSV stands for Comma Separated Values, sometimes also called Comma Delimited Values.
Create TestTable
USE TestData
GO
CREATE TABLE CSVTest
(ID INT,
FirstName VARCHAR(40),
LastName VARCHAR(40),
BirthDate SMALLDATETIME)
GO
Create a CSV file in drive C: with the name csvtest.txt and the following content. The location of the file is C:\csvtest.txt
1,James,Smith,19750101
2,Meggie,Smith,19790122
3,Robert,Smith,20071101
4,Alex,Smith,20040202

Now run the following script to load all the data from the CSV file into the database table. If any row contains an error, that row will not be inserted, but the other rows will be.
BULK
INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
--Check the content of the table.
SELECT *
FROM CSVTest
GO
--Drop the table to clean up database.
DROP TABLE CSVTest
GO

Reference : Pinal Dave (https://blog.sqlauthority.com)




839 Comments
Hi Pinal,
I have a text file with fixed-length data values in each row and no delimiters. As the number of data rows is huge, I cannot convert it to a CSV. How do I import the data from the text file to the database table?
Table structure is
Number Char 16
Name Char 15
Amount Decimal 15,2
Ticker Char 3
date datetime 8
The txt file row is of the form
0001CAQMJ60010001-00265890-01Abraham Jones EC01-02-07 18449.09USD
Desired result:
Number = 0001-00265890-01
Name = Abraham Jones
Amount = 18449.09
Ticker = USD
date = 01-02-07
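One way to handle a fixed-width file like this (a sketch, not from the original post; the staging table name RawLines, the target table name TargetTable, and the column start positions are all assumptions you would adjust to the real layout) is to bulk insert each line into a single wide column and then split it with SUBSTRING:

```sql
-- Stage each raw line as one wide column; no FIELDTERMINATOR, so the
-- whole line lands in a single field.
CREATE TABLE RawLines (Line VARCHAR(200));

BULK INSERT RawLines
FROM 'c:\fixedwidth.txt'
WITH (ROWTERMINATOR = '\n');

-- Split the fixed-width line into typed columns.
-- SUBSTRING(Line, start, length): the positions below are illustrative.
INSERT INTO TargetTable (Number, Name, Amount, Ticker, [date])
SELECT
    SUBSTRING(Line, 1, 16),                           -- Number  Char 16
    SUBSTRING(Line, 17, 15),                          -- Name    Char 15
    CAST(SUBSTRING(Line, 33, 15) AS DECIMAL(15, 2)),  -- Amount
    SUBSTRING(Line, 48, 3),                           -- Ticker  Char 3
    CAST(SUBSTRING(Line, 51, 8) AS DATETIME)          -- date
FROM RawLines;
```

Alternatively, a BCP format file with fixed-length fields lets BULK INSERT split the columns directly, with no staging table.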
I want to use the following stored procedure for bulk insert from CSV to table; I have about 20 files and 20 tables.
set ANSI_NULLS ON
--set QUOTED_IDENTIFIER OFF
GO
ALTER PROCEDURE [dbo].[ps_Bulk_Import]
@tableName varchar(10),
@PathFileName varchar(100)
AS
DECLARE @SQL varchar(2000)
BEGIN
SET @SQL = 'BULK INSERT '"+@tableName+"' FROM '"+@PathFileName+"' WITH (FIELDTERMINATOR = ',',ROWTERMINATOR = '\n') '
END
EXEC (@SQL)
When I tried the above code, it gives me this error:
Msg 102, Level 15, State 1, Line 1
Incorrect syntax near 'Testtable'.
Msg 319, Level 15, State 1, Line 1
Incorrect syntax near the keyword 'with'. If this statement is a common table expression or an xmlnamespaces clause, the previous statement must be terminated with a semicolon.
Can anybody help me achieve this task? Thanks.
Shoukat
The part
SET @SQL = 'BULK INSERT '"+@tableName+"' FROM '"+@PathFileName+"' WITH (FIELDTERMINATOR = ',',ROWTERMINATOR = '\n') '
should be
SET @SQL = 'BULK INSERT '+@tableName+' FROM '+@PathFileName+' WITH (FIELDTERMINATOR = '','',ROWTERMINATOR = ''\n'') '
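Written out fully, a corrected version of the procedure might look like this (a sketch; using QUOTENAME around the table name is my addition, not part of the original code, and the doubled single quotes produce literal quotes inside the generated statement):

```sql
ALTER PROCEDURE [dbo].[ps_Bulk_Import]
    @tableName sysname,
    @PathFileName varchar(100)
AS
BEGIN
    DECLARE @SQL varchar(2000);
    -- Build: BULK INSERT [table] FROM 'path' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
    SET @SQL = 'BULK INSERT ' + QUOTENAME(@tableName)
             + ' FROM ''' + @PathFileName + ''''
             + ' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';
    EXEC (@SQL);
END
```

Note that EXEC (@SQL) belongs inside the BEGIN…END block, and QUOTENAME guards against a table name breaking the dynamic SQL.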
Hi all,
I want to import a CSV file into a table, excluding the first row of the file. Could any one of you give me a sample query?
Thanks in advance
WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'') '
Hi all,
I want to import a csv file into a table, and I want to exclude the first row of the CSV file while importing.
Can any one of you help me with a sample query?
Thanks in Advance
Use the option FIRSTROW = 2
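Putting that together with the article's example (a sketch; the table and path come from the post above, so adjust them to your file), skipping a header row looks like this:

```sql
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
    FIRSTROW = 2,            -- skip the header line
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
```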
Hi Madhivanan,
When I tried this:
SET @SQL = 'BULK INSERT '+@tableName+' FROM '+@PathFileName+' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
it seems good. I mean, when I tried @tableName as test (which exists in my database) and @PathFileName as temp (a file that does not exist), it gives me the error that the file does not exist. But when I pass D:temp.csv, it gives me an error like:
Msg 102, Level 15, State 1, Line 1
Incorrect syntax near 'D'.
Msg 319, Level 15, State 1, Line 1
Incorrect syntax near the keyword 'with'. If this statement is a common table expression or an xmlnamespaces clause, the previous statement must be terminated with a semicolon.
(1 row(s) affected)
I will be grateful for your help. Thanks.
Shoukat
D:temp.csv
should be
D:\temp.csv
Also, the file should exist on the server's file system, not on the local system you are working from.
Hi Pinalkumar,
Thanks for this awesome blog, great source of info.
I used your code to import data from a csv file, which worked great; I just need some help automating this using a SQL Agent job.
Code:
--------------------------
USE CSVTestData
GO
BULK
INSERT CSVTest
FROM 'c:\Test\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
--------------------------
Questions:
– What is the best way to automate the above code, and how?
– How do I use SQL Server Agent to automate this? Can I create a stored procedure with this code and then schedule it from SQL Server Agent?
Thanks much,
Warraich
Yes. Put that code in a procedure and schedule the procedure as a job to execute periodically.
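A sketch of what that scheduling could look like (the procedure name LoadCSVTest, the job name, and the 01:00 schedule are all illustrative; this wraps the BULK INSERT in a procedure and creates a daily SQL Server Agent job through the msdb system procedures):

```sql
-- Wrap the load in a procedure.
CREATE PROCEDURE dbo.LoadCSVTest
AS
BULK INSERT CSVTest
FROM 'c:\Test\csvtest.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
GO

-- Create a daily Agent job that runs the procedure at 01:00.
EXEC msdb.dbo.sp_add_job @job_name = 'Load CSVTest';
EXEC msdb.dbo.sp_add_jobstep @job_name = 'Load CSVTest',
    @step_name = 'Run load', @subsystem = 'TSQL',
    @database_name = 'CSVTestData',
    @command = 'EXEC dbo.LoadCSVTest;';
EXEC msdb.dbo.sp_add_jobschedule @job_name = 'Load CSVTest',
    @name = 'Daily 1am',
    @freq_type = 4, @freq_interval = 1,   -- daily, every day
    @active_start_time = 10000;           -- HHMMSS -> 01:00:00
EXEC msdb.dbo.sp_add_jobserver @job_name = 'Load CSVTest';
```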
Hi,
I have a 45 GB file and I am using bulk insert to import it. It is working fine but taking a lot of time (9-10 hours). Any ideas to improve its performance would be highly appreciated.
Thanks,
Kusum
Does the table have any indexes?
If so, try removing them, insert the data, and then recreate the indexes.
No, no indexes, no constraints, nothing; it is a heap table we created specially to import all the data.
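For a load that size, two BULK INSERT options are worth trying (a sketch; the file path and batch size are illustrative): TABLOCK, which takes a table lock and allows minimally logged inserts into a heap under the simple or bulk-logged recovery model, and BATCHSIZE, which commits in chunks instead of one enormous transaction:

```sql
BULK INSERT CSVTest
FROM 'c:\bigfile.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK,              -- table lock; enables minimally logged inserts into a heap
    BATCHSIZE = 100000    -- commit every 100,000 rows
);
```

Splitting the file and running several BULK INSERT statements in parallel against the heap (each with TABLOCK) is another common approach for very large files.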
–Drop the table to clean up database.
SELECT *
FROM CSVTest
GO
====================Q==================
The DROP TABLE syntax is not written in the given code…
This helped, but when I run it the next time it repeats the records. What query do I need to remove the existing records?
Regards,
Khalid
You need to delete the existing data from the table before using bulk insert
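For example (a sketch using the article's CSVTest table), clearing the table first avoids the duplicates:

```sql
TRUNCATE TABLE CSVTest;       -- fast, minimally logged removal of all rows
-- or: DELETE FROM CSVTest;   -- slower, but needs fewer permissions

BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
```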
What should I do to move table1 in my database, which is at the last position; how can I move it upward among the tables? Secondly, it will be highly appreciated if anyone gives me a query for this scenario: when I create a table, it should get created even if it already exists, dropping the old table first.
Regards.
Khalid
IF EXISTS (SELECT * FROM information_schema.tables WHERE table_name = 'your_table')
DROP TABLE your_table
GO
CREATE TABLE your_table (
...
)
Thanks Pinal, that was helpful to me.
Hi ,
The solution works great; however, any suggestions if the data has a comma as part of it? The data is between two double quotes "".
example:
1,"a,b"
2,a
3,b
4,c
The condition FIELDTERMINATOR = ',' causes the data "a,b" to be treated as having a third field.
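On SQL Server 2017 and later, BULK INSERT can parse quoted CSV fields natively (a sketch; the file path is illustrative):

```sql
BULK INSERT CSVTest
FROM 'c:\quoted.csv'
WITH
(
    FORMAT = 'CSV',        -- RFC 4180-style parsing (SQL Server 2017+)
    FIELDQUOTE = '"',      -- fields such as "a,b" keep their embedded comma
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```

On older versions, a common workaround is to import into a staging table and strip the quotes with REPLACE, or to use a format file.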
Hi Pinal,
Is it possible to import an xls file using BULK INSERT?
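BULK INSERT itself reads only text files, but OPENROWSET with an OLE DB provider can read Excel directly (a sketch; it assumes the Microsoft ACE OLE DB provider is installed on the server, 'Ad Hoc Distributed Queries' is enabled, and the sheet name Sheet1$ matches your workbook):

```sql
-- Requires the ACE OLE DB provider and 'Ad Hoc Distributed Queries' enabled.
INSERT INTO CSVTest (ID, FirstName, LastName, BirthDate)
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=c:\csvtest.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');
```

The simpler route is usually to save the worksheet as CSV and use the article's BULK INSERT as-is.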
Hi.
I have some large txt files that contain about 3 million records. I used BULK INSERT, and one of them was inserted into my table in about 2 minutes.
Because the txt file format and my table format are different, I created an SP that creates a temp table (#t), copies the data into it with bulk insert, and then inserts it all into my table.
But from the time the insert completed until now (about 6 hours), the SQL Server process memory usage has been very large (about 300 MB), and it returns to that amount even after a restart. Why? I drop the temp table at the end of the SP. Why is SQL Server's memory usage so big? What should I do?
@Pooneh
Just my guess; try doing this.
Method 1: Create a clustered index on the temp table in the stored procedure before loading data into it.
Method 2: Instead of a temp table, use a permanent (staging) table, create a clustered index on it, and see the performance impact.
Check your join conditions if you have used any; if you can make the join columns integer by using CONVERT(INT, column_name), that could improve performance. If you are joining any other table with the temp table, check that indexes exist on that table.
Let us know if this was helpful. Or if you solved the issue with another method, please share it here so that we can all benefit from the information.
~ IM.
Hi
I have a .dat file with 700+ records, but I am only getting 410 rows of data inserted. I have not noticed any difference in the format of the data files. What could be the problem?
Thanks,
Can you check the data on line 411 to see if there is any mismatch?
Hi!
Excellent work, thanks! :)
But i have a problem.
Hungarian characters are not imported correctly into the table.
:(
Anybody can help?
BR, M
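Accented characters often get mangled because BULK INSERT assumes the server's default code page for the data file. Two options to try (a sketch; the code page and file paths are illustrative and must match how your file was actually saved):

```sql
-- For an ANSI file saved in the Central European code page 1250:
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', CODEPAGE = '1250');

-- For a UTF-16 (Unicode) file loading into NVARCHAR columns:
BULK INSERT CSVTest
FROM 'c:\csvtest_unicode.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', DATAFILETYPE = 'widechar');
```

The target columns also need to be NVARCHAR (or a collation that covers the characters) for them to survive the insert.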
Hi,
Thanks for your article,
It's very useful.
Regards,
Venkata
Thank you for the article. Unfortunately, as my app's database resides on a server that is also used by other developers' apps, I don't have the right permissions.
From the Microsoft website:
"Only members of the sysadmin and bulkadmin fixed server roles can execute BULK INSERT."
https://www.microsoft.com/en-us/download/details.aspx?id=51958
Apparently the bulkadmin role is a server-wide role, so bulk access means bulk access to *every database*.
Thanks for the query.
I have a table with 6 columns but a csv with 5 columns; I need to save a default value to that extra column. Is it possible with this statement?
For example, if the file contains 50,000 lines, then the first 10,000 records should have the value 1, the next 10,000 records the value 2, and so on….
Thanks in advance.
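One way to do this (a sketch; the table names Staging5 and Target6 and the column names are assumptions) is to load the 5 CSV columns into a staging table, then compute the sixth column while copying, numbering the rows in blocks of 10,000:

```sql
-- Stage only the 5 columns that exist in the file.
CREATE TABLE Staging5 (C1 INT, C2 VARCHAR(40), C3 VARCHAR(40),
                       C4 VARCHAR(40), C5 VARCHAR(40));

BULK INSERT Staging5
FROM 'c:\fivecols.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Copy into the 6-column target, deriving the extra column:
-- rows 1-10000 get 1, rows 10001-20000 get 2, and so on.
INSERT INTO Target6 (C1, C2, C3, C4, C5, BlockNo)
SELECT C1, C2, C3, C4, C5,
       (ROW_NUMBER() OVER (ORDER BY C1) - 1) / 10000 + 1
FROM Staging5;
```

Note that BULK INSERT does not preserve file order guarantees, so ORDER BY in the ROW_NUMBER() needs a column that reflects the order you want.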
Refer to this post; it does exactly what you want.