
BCP (Bulk Copy Program) is failing for a large file


VIJSQL01

Hi,

I am facing the following issue when I try to transfer a large volume of data into a SQL Server 2008 table using the BCP utility.
The file format is .CSV and it is about 180 MB.

Code:
Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 2081, column 18 (SEASON_ADJ_MIDDLE_HIGH_BREAK_IN_1K).
Msg 4863, Level 16, State 1, Line 1

BCP Command
Code:
BULK INSERT [TABLE A]
FROM '\\10.48.164.48\Datamove1\REAS\Rogue Index\CLREAS_2013Q2\csv format\CLREAS_MTH.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)

I tried the following ways to fix this issue.

1. Converted the data type from char to varchar - Result: no use.
2. The file originally didn't have a header; I added one and tried again - Result: no use.

Please note that, in my file, there is no data in the last column. Is this causing a problem?

Can anybody please tell me where I am going wrong?

Thanks in advance
 
I have seen similar errors before. It might be that whatever is in that field (row 2081, column 18) is too long for the column you are trying to insert it into.

At 180 MB, you might be able to get a text editor for large files, open it, and check that line.

Otherwise, if you do any programming, you could write a small program to open the file and show you the contents of that line.
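You can even do it with BULK INSERT itself. A minimal sketch along these lines might work (the scratch table name is made up, and the pipe terminator is assumed not to appear anywhere in your data, so the whole line lands in one column instead of being split on commas):

Code:
-- Scratch table to hold the single raw line for inspection (name is just an example)
CREATE TABLE dbo.RawLineCheck (LineText VARCHAR(MAX));

-- FIRSTROW/LASTROW restrict the load to just the row reported in the error message
BULK INSERT dbo.RawLineCheck
FROM '\\10.48.164.48\Datamove1\REAS\Rogue Index\CLREAS_2013Q2\csv format\CLREAS_MTH.csv'
WITH
(
    FIRSTROW = 2081,
    LASTROW = 2081,
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n'
);

SELECT LineText FROM dbo.RawLineCheck;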

Simi
 
You can specify an errorfile

Code:
BULK INSERT [TABLE A]
FROM '\\10.48.164.48\Datamove1\REAS\Rogue Index\CLREAS_2013Q2\csv format\CLREAS_MTH.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    ERRORFILE = 'C:\Errors.txt'
)
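By default the load stops after 10 errors, so if you also want it to keep going past the bad rows and collect them all in the error file, you can add MAXERRORS (the 1000 below is just an example threshold):

Code:
BULK INSERT [TABLE A]
FROM '\\10.48.164.48\Datamove1\REAS\Rogue Index\CLREAS_2013Q2\csv format\CLREAS_MTH.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    ERRORFILE = 'C:\Errors.txt',
    MAXERRORS = 1000
)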
 
Make your destination table all fields varchar(300) or something like that and load the file; all the cleaning and transformations you can do after that.
 
What gk53 said, and query MAX(len) on each field after that. Compare the results to your current table schema to find the mismatches (i.e. field definition too short for the value).
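Something along these lines, for example (the table name and the second column name are placeholders for whatever your staging table actually has; the first column is the one named in your error):

Code:
-- Longest value actually present in each suspect column of the staging table
SELECT
    MAX(LEN(SEASON_ADJ_MIDDLE_HIGH_BREAK_IN_1K)) AS MaxLen_Col18,
    MAX(LEN(SomeOtherColumn))                    AS MaxLen_SomeOther
FROM dbo.Staging_CLREAS_MTH;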

-----------
With business clients like mine, you'd be better off herding cats.
 
When we have issues with BCP and truncation or data conversion issues, we make a staging table with the columns being VARCHAR(MAX). We BCP to the staging table, then review all the data to see where there might be issues. Finding the MAX Length of a column will help with truncation issues. Then once we know which rows will cause a problem, we create the solution and insert the data from the staging table to the real table.
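A rough sketch of that workflow, using BULK INSERT as in the original post (table names, column names, and the final sizes are all made up here just to show the shape; only two columns are written out):

Code:
-- 1. Staging table: every column VARCHAR(MAX) so nothing is rejected on load
CREATE TABLE dbo.Staging_CLREAS_MTH
(
    COL_01 VARCHAR(MAX),
    SEASON_ADJ_MIDDLE_HIGH_BREAK_IN_1K VARCHAR(MAX)
    -- remaining CSV columns omitted for brevity
);

-- 2. Bulk load the raw file into the staging table
BULK INSERT dbo.Staging_CLREAS_MTH
FROM '\\10.48.164.48\Datamove1\REAS\Rogue Index\CLREAS_2013Q2\csv format\CLREAS_MTH.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- 3. After checking MAX(LEN()) and fixing or filtering the problem rows,
--    move the clean data into the real table with explicit conversions
INSERT INTO dbo.CLREAS_MTH (COL_01, SEASON_ADJ_MIDDLE_HIGH_BREAK_IN_1K)
SELECT CAST(COL_01 AS VARCHAR(50)),                               -- sizes chosen from the length check
       CAST(SEASON_ADJ_MIDDLE_HIGH_BREAK_IN_1K AS VARCHAR(100))
FROM dbo.Staging_CLREAS_MTH;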

-SQLBill

The following is part of my signature block and is only intended to be informational.
Posting advice: FAQ481-4875
 
I personally do not like to use BCP to load data. I prefer to set up an SSIS package to load the data; in this case you can validate your input, load only the good records, and move the "bad" records to a separate location, and it actually works much faster.
 
Agreed! BCP is quite an old methodology and it doesn't work all the time. Also, thanks very much for providing all these suggestions. In fact, as one of you suggested, I figured out the maximum length of the string, adjusted the column size accordingly, and it worked fine.

But I have a few questions regarding SSIS packages:
a. Can we really create SSIS packages on the fly?
b. Does this have any limitations?
c. Would there be any performance issues while loading a huge amount of data?


 
Create an SSIS package on the fly? A simple package is no more complicated than BCP, but you need the tools, usually Visual Studio 2008, which comes with SQL Server 2008.
Limitations? It has plenty of tools built in; you can even write your own VB or C# code and run it against the data.
Performance is better than BCP. I just set up a job which loads two data files of about 300,000 records each, validates the data format, converts some data from mainframe to SQL format, and updates tables with about 15,000,000 records each, and it takes about 2 minutes. I used to have a BCP package and execute a number of SQL commands after that, and the process time was 25 minutes. The big advantage is that you can run multiple processes in parallel and synchronize them, validate data before loading, load only the valid data, create an output file with the bad records, etc.
 
BCP may be old, but it is still in use. Believe it or not, in the right circumstances SSIS uses it under the covers.

-----------
With business clients like mine, you'd be better off herding cats.
 
I know that, but my point is that with BCP you have no control over the process. SSIS gives you tools to validate and massage data at load time, and lets you add logic to find any bad records and decide whether or not to process the data depending on the audit results.
 