This is a very common request recently: How do I import a CSV file into SQL Server? How do I load a CSV file into a SQL Server database table? How do I load a comma-delimited file into SQL Server? Let us see the solution in quick steps.
CSV stands for Comma Separated Values, sometimes also called Comma Delimited Values.
Create TestTable
USE TestData
GO
CREATE TABLE CSVTest
(ID INT,
FirstName VARCHAR(40),
LastName VARCHAR(40),
BirthDate SMALLDATETIME)
GO
Create a CSV file named csvtest.txt in drive C: with the following content. The full path of the file is C:\csvtest.txt
1,James,Smith,19750101
2,Meggie,Smith,19790122
3,Robert,Smith,20071101
4,Alex,Smith,20040202

Now run the following script to load all the data from the CSV file into the database table. If any row contains an error, that row will not be inserted, but the other rows will be.
BULK
INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
--Check the content of the table.
SELECT *
FROM CSVTest
GO
--Drop the table to clean up database.
DROP TABLE CSVTest
GO
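If you want explicit control over how many bad rows are tolerated and where they are logged, BULK INSERT supports the MAXERRORS and ERRORFILE options. Here is a sketch of the same load with those options added; the error-file path is just an example location:

```sql
-- Same load as above, with explicit error handling.
BULK INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
MAXERRORS = 10,                       -- allow up to 10 bad rows before aborting
ERRORFILE = 'c:\csvtest_errors.txt'   -- rejected rows are written here
)
GO
```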

Reference : Pinal Dave (https://blog.sqlauthority.com)




839 Comments
BULK
INSERT raja_csvtest
FROM '\\192.168.73.47\D\IndusInd\rrr.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
I used this code and got this error:
Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "\\192.168.73.47\D\IndusInd\rrr.csv" could not be opened. Operating system error code 53 (The network path was not found.).
That IP is my machine's, and SQL Server is on another machine. Please help.
Make sure that the file is accessible from the server. BULK INSERT opens the path from the SQL Server machine, so the SQL Server service account must be able to reach that network share.
Hi,
thanks for this; it is a great help. Keep it up.
Ok, so I am getting the following error, which seems odd.
Msg 4864, Level 16, State 1, Line 3
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 129, column 4 (BirthDay).
My CSV is formatted the same way as yours, except that I have added one field:
1,Aaron,Aaberg,19700926,MALE
But the following works fine:
INSERT INTO Test.BulkInsert (FirstName, LastName, Birthday, Gender)
VALUES ('Test', 'Me', 19851118, 'Male')
What gives? I have tried everything under the sun. If I use INT as the data type for Birthday, the import works fine, but when I then try to CAST or CONVERT the INT to a date, I get an arithmetic overflow, which is expected.
What is the data type of Birthday?
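A common workaround for this kind of conversion error is to bulk insert into a staging table where the date column is plain text, then convert during the final insert. The sketch below assumes the commenter's table layout; the staging table name and file path are illustrative:

```sql
-- Hypothetical staging table: load the date as raw yyyymmdd text first.
CREATE TABLE BulkInsertStage
(ID INT,
FirstName VARCHAR(40),
LastName VARCHAR(40),
BirthDate VARCHAR(8),   -- raw yyyymmdd text
Gender VARCHAR(10))
GO

BULK INSERT BulkInsertStage
FROM 'c:\csvtest.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
GO

-- CONVERT style 112 interprets yyyymmdd strings as dates.
INSERT INTO Test.BulkInsert (FirstName, LastName, Birthday, Gender)
SELECT FirstName, LastName, CONVERT(SMALLDATETIME, BirthDate, 112), Gender
FROM BulkInsertStage
GO
```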
H~23500~I~16052011~002~16052011095015
D~1~1202350000205178~INE040H01021~751504~500.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090150~0~~~~0~0~~~~N~0~~500.000~0.000~0.000~0.000~2~30~2215~0~829803/9009867~~~DPADM~2~16052011090156~~~~~~~
D~1~1202350000002200~INE040H01021~751482~500.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090150~0~~~~0~0~~~~N~0~~500.000~0.000~0.000~0.000~2~30~2215~0~829854/273427~~~DPADM~2~16052011090156~~~~~~~
D~1~1202350000230405~INE040H01021~751384~100.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090149~0~~~~0~0~~~~N~0~~100.000~0.000~0.000~0.000~2~30~2215~0~829765/9009867~~~DPADM~2~16052011090156~~~~~~~
D~1~1202350000222453~INE040H01021~751362~200.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090149~0~~~~0~0~~~~N~0~~200.000~0.000~0.000~0.000~2~30~2215~0~829743/9009867~~~DPADM~2~16052011090156~~~~~~~
D~1~1202350000176665~INE275A01028~751434~25.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090146~0~~~~0~0~~~~N~0~~25.000~0.000~0.000~0.000~2~30~2215~0~829830/9009867~~~DPADM~2~16052011090159~~~~~~~
D~1~1202350000167614~INE275A01028~751413~50.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090146~0~~~~0~0~~~~N~0~~50.000~0.000~0.000~0.000~2~30~2215~0~829794/9009867~~~DPADM~2~16052011090159~~~~~~~
D~1~1202350000091621~INE275A01028~751408~50.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090146~0~~~~0~0~~~~N~0~~50.000~0.000~0.000~0.000~2~30~2215~0~829789/9009867~~~DPADM~2~16052011090159~~~~~~~
D~1~1202350000169704~INE275A01028~751506~20.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090146~0~~~~0~0~~~~N~0~~20.000~0.000~0.000~0.000~2~30~2215~0~829805/9009867~~~DPADM~2~16052011090159~~~~~~~
D~1~1202350000140254~INE275A01028~751350~100.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090146~0~~~~0~0~~~~N~0~~100.000~0.000~0.000~0.000~2~30~2215~0~829731/9009867~~~DPADM~2~16052011090159~~~~~~~
D~1~1202350000118127~INE015A01028~751518~20.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090200~0~~~~0~0~~~~N~0~~20.000~0.000~0.000~0.000~2~30~2215~0~829817/9009867~~~DPADM~2~16052011090203~~~~~~~
D~1~1202350000000028~INE124G01033~751526~300.000~102~13052011124623~16052011000000~S~~291~1110001112029~~~16052011090200~0~~~~0~0~~~~N~0~~300.000~0.000~0.000~0.000~2~30~2215~0~829858/272746~~~DPADM~2~16052011090204~~~~~~~
D~1~1202350000226667~INE124G01033~751414~100.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090200~0~~~~0~0~~~~N~0~~100.000~0.000~0.000~0.000~2~30~2215~0~829795/9009867~~~DPADM~2~16052011090204~~~~~~~
D~1~1202350000000028~INE124G01033~751527~250.000~102~13052011124623~16052011000000~S~~291~1110001112029~~~16052011090200~0~~~~0~0~~~~N~0~~250.000~0.000~0.000~0.000~2~30~2215~0~829858/272746~~~DPADM~2~16052011090204~~~~~~~
D~1~1202350000002200~INE330H01018~751480~1.000~102~13052011124514~16052011000000~S~~291~1110001112029~~~16052011090201~0~~~~0~0~~~~N~0~~1.000~0.000~0.000~0.000~2~30~2215~0~829854/273427~~~DPADM~2~16052011090204~~~~~~~
D~4~1202350000000028~INE111B01023~851339~102000.000~409~16052011000000~16052011000000~~1100001000013802~291~1110001112030~~~16052011000000~0~~~~0~0~~~~~0~~0.000~0.000~0.000~0.000~1~79~2277~0~900-9868~~~DPADM~1~16052011092010~~~~~~~
D~1~1202350000245006~INE775B01025~851360~50.000~101~16052011092800~16052011092800~S~~291~1110001112029~~~~0~~~~0~0~~~~N~0~~0.000~0.000~0.000~0.000~0~0~4449~0~829872/305811~~~DPADM~2~16052011092800~~~~~~~
T~187~16052011~16052011095015
where H is for header, D for details, and T for summary. I have to create a stored procedure to import a file in the above format. How can I import it? Please help. Thanks in advance.
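One common pattern for header/detail/trailer files like this is to bulk insert each whole line into a single-column staging table, then route the rows by their record-type prefix. This is a sketch; the table name and file path are illustrative:

```sql
-- Staging table holds one raw line per row.
CREATE TABLE RawLines (Line VARCHAR(MAX))
GO

-- No FIELDTERMINATOR: each line is loaded intact.
BULK INSERT RawLines
FROM 'c:\tradefile.txt'
WITH (ROWTERMINATOR = '\n')
GO

-- Detail rows start with 'D~'; split them further on the '~'
-- delimiter with string functions or a split function.
SELECT Line
FROM RawLines
WHERE Line LIKE 'D~%'
GO
```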
Thanks for sharing; that is all really very helpful.
Hi Pinal, I am a developer from Sri Lanka. Thanks for this SQL code. I tried to upload two CSV files (different data, same CSV structure) into one table. The first time it worked, but the second time it gave me this error:
"Violation of PRIMARY KEY constraint 'PK__Summarised_Tradi__7C8480AE'. Cannot insert duplicate key in object 'dbo.Summarised_Trading_Statics_Monthly'"
I need to solve this because I have daily CSV files to import into my table. Please help. Thank you, Pinal.
As the error says, it is not possible to add duplicate key values to the table. You may need to update the existing data instead, or filter out the rows that are already present. Please be clear about what you want to do.
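For daily loads, one sketch is to bulk insert into a staging table and then insert only the keys that are not already in the target. The column names below are assumptions, since the real table definition is not shown:

```sql
-- Insert only rows whose key is not already in the target table.
INSERT INTO dbo.Summarised_Trading_Statics_Monthly (ID, TradeDate, Amount)
SELECT s.ID, s.TradeDate, s.Amount
FROM StagingTable s
WHERE NOT EXISTS
    (SELECT 1
     FROM dbo.Summarised_Trading_Statics_Monthly t
     WHERE t.ID = s.ID)
GO
```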
Hi, what if I want to exclude some SQL column value?
For example, the SQL table is like this:
no|name|tlf
—————
If there is no value for tlf, what should the CSV file look like?
1,john,738283
2,carla,NULL
3,alex,838384
Is that right?
Yes. If there is no value for the column, the absence of a value will result in NULL.
Hi,
Please help me out, as this is a very urgent issue and it is taking me a long time to solve.
I want to import all records from an Excel or CSV file into a table, but I don't want to create the table beforehand; my requirement is that the columns should match the file's columns.
It should be something like "SELECT * INTO tblname FROM csvfile".
Please answer as soon as possible, as it is highly urgent.
Thanks in advance.
Refer to this post.
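For reference, one way to get SELECT * INTO behavior directly from a file is OPENROWSET with the ACE OLE DB provider, which creates the destination table with columns taken from the source. This is a sketch: it requires the provider to be installed and 'Ad Hoc Distributed Queries' to be enabled, and the file path and sheet name are examples:

```sql
-- Create tblname on the fly with the workbook's columns.
SELECT *
INTO tblname
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\Data\MyFile.xlsx;HDR=YES',
    'SELECT * FROM [Sheet1$]')
GO
```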
Really helped me. Thank you!
Nice one, it's very useful.
Hi Pinal,
Is there any way to create table dynamically from BULK INSERT statement?
Good Article
I am trying to import employee attendance (.csv) data into a table using phpMyAdmin. My code is below, but I get the following error: "#1060 – Duplicate column name 'In'".
CREATE TABLE IF NOT EXISTS `CSV_DB`.`TBL_NAME` (
`No.` int(3),
`User ID` varchar(4),
`Name` varchar(21),
`Weekday` varchar(9),
`Date` varchar(10),
`In` varchar(8),
`Out` varchar(8),
`In` varchar(8),
`Out` varchar(8),
`Total Working Hours` decimal(5, 2)
) ENGINE = MYISAM DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci;
Can anybody help me insert those 'In'/'Out' values several times into a table?
You cannot use the same column name twice in a table.
@madhivanan thanks for the comment, but I want to track all the In/Out times. How can I do this without duplicate column names? Can you give me some idea?
You have to use single In and Out columns and add a new row for each In/Out pair.
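In other words, normalize the punches into rows instead of repeated columns. A minimal sketch of such a table, reusing the thread's MySQL conventions (table and column names here are assumptions):

```sql
-- One row per punch event instead of repeated In/Out columns.
CREATE TABLE IF NOT EXISTS `CSV_DB`.`TBL_ATTENDANCE` (
`User ID` varchar(4),
`Date` varchar(10),
`Punch` enum('In','Out'),   -- which kind of event this row records
`Time` varchar(8)
) ENGINE = MYISAM DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci;
```

A user who clocks in and out twice a day then simply gets four rows for that date.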
Can I change the column name in the CSV file on insert? Meaning, if I did a normal bulk insert and one of the columns was called fName, could I change it to FirstName so it would map correctly to my current table?
It does not matter. Only the data will be copied to the table; BULK INSERT matches columns by position, not by name.
Excellent knowledge sharing.
When I execute the query in this program, it displays this message:
Msg 4834, Level 16, State 4, Line 8
You do not have permission to use the bulk load statement.
How can I get permission to do the bulk upload?
You need to grant the bulkadmin role to this login. Do it in the login's properties under Security > Logins > username.
One folder contains n Excel sheets. Is it possible, using SQL bulk insert, to iterate through each Excel sheet and insert the data into one table?
You can use the OPENROWSET command, explicitly specifying the sheet name, or try SSIS.
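As a sketch of that suggestion, each sheet can be read with its own OPENROWSET call and appended to one table; for many sheets you would generate these statements dynamically or use SSIS. It assumes the ACE provider is installed and ad hoc distributed queries are enabled; the path, sheet names, and target table are examples:

```sql
-- Append Sheet1 and Sheet2 of one workbook into a single table.
INSERT INTO TargetTable
SELECT * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\Data\Book1.xlsx;HDR=YES',
    'SELECT * FROM [Sheet1$]')

INSERT INTO TargetTable
SELECT * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\Data\Book1.xlsx;HDR=YES',
    'SELECT * FROM [Sheet2$]')
GO
```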
helpful, thanks
Hi friends,
I want to bulk insert the data from a TEXTAREA (the web page was developed using C# with ASP.NET).
Please, can anyone help me?
It is easy to do once you have the data in a text file, so save the textarea contents to a file first and then bulk insert that file.
Hello, I have a SQL Server 2008 instance with an Access 2007 ADP as the front end. If I import the CSV data from Access, then both Access and SQL Server can see and interact with the file. However, when I import it with the BULK statement from SQL Server, Access doesn't recognize the file. I am certain the naming conventions are the same, as the same table is overwritten from both sources. Does anyone know why Access cannot see the file when SQL Server does the import?
Make sure that the file has the .csv extension.