Here is a question I received from a user. The email was long and contained multiple requests from the reader to resolve this error.
Scenario:
The user was trying to import data from Excel into tables in a SQL Server database. Every time he attempted to import the data, he faced the following error. He tried using an SSIS package as well as the Import and Export Wizard (which creates an SSIS package under the hood), but he kept facing the same error. He could not figure out the reason behind it. I have modified the error to make it readable.
Error:
– Executing (Error)
Messages
Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column “Col1” (18) to column “Col1” (51). The conversion returned status value 2 and status text “The value could not be converted because of a potential loss of data.”.
(SQL Server Import and Export Wizard)
Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. [Error Detail]. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component “Data Conversion 0 – 0” (39) failed with error code 0xC0209029 while processing input “Data Conversion Input” (40). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
Solution:
Well, there can be many reasons for this error to show up, but the major one is a data type mismatch. I have observed that the following two are the most common causes of the error.
1) NULLs are not properly handled, either in the destination database or during SSIS package creation. It is quite possible that the source contains NULL values but the destination does not accept NULL data, which generates the above error.
2) This is the most common issue I have seen in the industry: the data types of the source and destination do not match. For example, the source column holds varchar data while the destination column has an int data type. This can easily generate the above error. Certain data types will convert to another data type automatically and will not raise the error, but incompatible data types will generate the above error.
The best practice is to make sure that the data types and their properties match between the source and the destination. If the data types of the source and destination are the same, there is very little chance of the above error occurring.
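If you are not sure which of the two causes you are hitting, one option is to land the Excel data in an all-text staging table first and query it before loading the real destination. Here is a minimal sketch of that idea; the table name dbo.StagingImport and the column Col1 are only placeholders (they are not from the reader's email), and TRY_CONVERT requires SQL Server 2012 or later.

-- Rows whose Col1 value cannot be converted to the destination's INT type
-- (these are the rows that would raise the "potential loss of data" error)
SELECT *
FROM dbo.StagingImport
WHERE Col1 IS NOT NULL
  AND TRY_CONVERT(INT, Col1) IS NULL;

-- Rows that are NULL in the source but would violate a NOT NULL destination column
SELECT *
FROM dbo.StagingImport
WHERE Col1 IS NULL;

Once the offending rows are identified, you can either fix them in the source or adjust the destination column's data type and nullability so that the two sides match.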
Reference: Pinal Dave (https://blog.sqlauthority.com)
8 Comments
Hello sir, I am facing one problem.
We are 5 people working on a network, and we have installed SQL Server on each individual PC.
We work with our domain logins. When I log in to the OS with my domain account, I can access my SQL Server database in Windows Authentication mode.
But when I try to access another user's database in Windows Authentication mode, I am able to access it as well. I want to prevent others from accessing my database. What do I have to do?
Hi Pinal,
I got the same error because of one particular column that existed in some of the files but not in all of them. I have to process several flat files and insert their data into one table in one database. But there is one particular column (LastPromotionDate) that may exist in some of the files but not in all of them. For the files where it exists, its value should be inserted into the corresponding LastPromotionDate column in the destination table. For the files where it does not exist, I have to use some variable value as the input for the LastPromotionDate column in the destination table. Is it possible? If YES, then how, and which type of transformation should be used?
Regards
Sachit
I had this issue importing data from a CSV. Because I had never created the table in SQL Server myself, but instead used the import to create a table with the same name as the file, it kept the original column width settings from my first table. When I changed the column widths to suit the data, the table retained the original settings, so it kept failing.
I solved this by going into the table and setting all the columns that hold text (rather than codes) to a width of 200, and it worked.
You could also delete the table, set the correct column widths in the import settings, and then run it again; that should work.
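If you would rather keep the table that the wizard created and simply widen the text columns as described above, an ALTER statement like the following would do it. The table and column names here are hypothetical stand-ins for your own objects.

-- Widen the text columns so the imported values fit
ALTER TABLE dbo.MyImportedTable
    ALTER COLUMN CustomerName VARCHAR(200) NULL;

ALTER TABLE dbo.MyImportedTable
    ALTER COLUMN AddressLine1 VARCHAR(200) NULL;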
I kept having the same problem with a CSV file. I changed it to a TXT file and, bingo, it worked.
Thanks for sharing the trick, Jonathan.
I was getting this same issue, and it was related to the handling of NULL values. The issue was that my source Excel spreadsheet had empty rows that were being included in the import process. I got around it by using a custom query that specifically eliminated the NULL values (e.g. select * from `myTable$` where ID is not null). I had to do this for all the tables I was importing.
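For readers who want to try the same workaround, the query goes into the “Write a query to specify the data to transfer” option of the wizard. A slightly fuller sketch with explicit columns (the sheet and column names below are only examples, not from the original comment):

-- Skip the empty rows that the Excel driver would otherwise pick up
SELECT ID, FirstName, LastName, HireDate
FROM [Sheet1$]
WHERE ID IS NOT NULL;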
I had the same issue, and mine was because the first few rows of one of my columns had NULLs at the beginning. I resolved it by cutting some of the rows with non-NULL values and pasting them into the top rows, and that resolved my issue.
I had the same problem due to NULL values in some columns that were being converted to a numeric data type.
I went to the Flat File Source editor in my data flow.
Then I checked the box “Retain null values from the source as null values in the data flow”.
Of course, this will pass those NULL rows along to your table.
But once they are in your table, you can decide how you would like to handle them.
This solved my issue.
Thanks, Mark Killion
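For anyone following Mark's approach, once the NULLs have been retained into the destination table they can be handled with plain T-SQL afterwards. A minimal sketch, assuming a hypothetical table dbo.ImportedOrders with a numeric Amount column where NULL should be treated as zero:

-- Replace the NULLs permanently in the table
UPDATE dbo.ImportedOrders
SET Amount = 0
WHERE Amount IS NULL;

-- Or leave the NULLs in place and substitute a default only when reading
SELECT OrderID, ISNULL(Amount, 0) AS Amount
FROM dbo.ImportedOrders;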