
Bulk Import Issues with large files

Status: Not open for further replies.

ehenry (Programmer), Sep 18, 2009, US
I am attempting to bulk insert from a CSV file into a table. The CSV is a 6GB file with 5 columns; I am unsure how many rows are in the data, but it is over 10 million. Every bulk insert runs into the error "Attempting to grow LOB beyond maximum allowed size". Is there any other way to import this file aside from splitting it into multiple files? I don't want to write a program to loop through every row in the file and create multiple smaller files, and I don't want to write a program that does millions of insert statements. Any help would be appreciated. Thanks.
 
From what I was able to learn, LOB refers to Large Object data types.

I think your error refers to the size of the data going into one of your fields and not the length of your input file.

Do you have a large text or image field that is being populated?

Perhaps it is not large enough.

What are your field types and lengths?

Simi
 
Your error message indicates that you are trying to store more than 2 gigabytes of data in a single row. Since I doubt this is your intention, my best guess is that you have a problem with your file layout.

For example, if you assume that the data is separated by commas, but it's actually separated by the TAB character, SQL will keep looking for the comma to end the column. If it doesn't find the comma, it will attempt to put the entire file into a single column.

I would encourage you to grab the first dozen or so rows from the file and attempt to import it. You will probably notice that even a dozen rows do not import correctly. Fix that problem and you will likely fix the problem with the 6GB file too.
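(For a quick test like this, BULK INSERT's LASTROW option can load just a sample without creating a separate file; `tblName` and `filename` are the placeholder names used later in this thread.)

```sql
-- Sketch: import only the first dozen rows of the big file
-- to test the layout, using the LASTROW option.
BULK INSERT tblName
FROM 'filename'
WITH (LASTROW = 12);
```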


-George
Microsoft SQL Server MVP
My Blogs
SQLCop
"The great thing about standards is that there are so many to choose from." - Fortune Cookie Wisdom
 
There are 5 columns: ID, Date, and 3 columns with a decimal value. No large fields. I think in this instance each column is treated as a LOB, meaning the column object would have a max size of 2GB?
 
Can you post the first 10 lines from the file and the code you are using to bulk load it?

-George
 
221997,1973-12-01,0,2298,0,1
221997,1974-01-01,0,881,0,1
221997,1974-02-01,0,322,0,1
221997,1974-03-01,0,1036,0,1
221997,1974-04-01,0,591,0,1
221997,1974-05-01,0,500,0,1
221997,1974-06-01,0,527,0,1
221997,1974-07-01,0,508,0,1
221997,1974-08-01,0,448,0,1
221997,1974-09-01,0,467,0,1
221997,1974-10-01,0,1105,0,1
 
I have a table with 6 columns, all varchar(max). Using a straight bulk insert with default parameters.
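(As a side note: varchar(max) is a LOB type in SQL Server, which is why a terminator mismatch that makes SQL read the whole file as one value produces the "grow LOB" error. A hedged sketch of a non-LOB schema matching the sample rows above; the column names here are hypothetical:)

```sql
-- Hypothetical schema matching the posted sample data:
-- an ID, a date, and four numeric values per row.
-- Fixed-size types avoid LOB columns entirely.
CREATE TABLE tblName (
    ID     int,
    [Date] date,
    Val1   decimal(18, 4),
    Val2   decimal(18, 4),
    Val3   decimal(18, 4),
    Val4   int
);
```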
 
I wrote a quick program to read through the first lines of the file and write them to a new file, which I posted above. When I try to bulk import that file I don't get any errors, but it says 0 rows imported.
 
Using the import wizard works fine on the small file and imports all rows.
 
It's just

BULK INSERT tblName
FROM 'filename'

The file is on the server, so it's a regular directory path.
 
The default field terminator during a bulk insert is the tab character. According to Books Online:

FIELDTERMINATOR = 'field_terminator'
Specifies the field terminator to be used for char and widechar data files. The default field terminator is \t (tab character). For more information, see Specifying Field and Row Terminators.

Since you are using a comma, you must include the field terminator, like this:

Code:
BULK INSERT tblName 
FROM 'filename'
WITH (FIELDTERMINATOR=',')

You may still have problems with the row terminator, but I suggest you try importing the small file with the SQL I show above. If it works, you're done. If it doesn't, then we probably need to look at the row terminator.


-George
 
I added a row terminator '\n' to the bulk insert and it worked with the smaller file. It seems to be working with the larger file as well. I thought the default would have worked.
 
Yeah, I added both the field and row terminators and it seems to be working... my mind slipped; I was thinking ',' was the default for field. Thanks for the help guys.
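(For reference, the working statement described here would look something like the following; `tblName` and `filename` are the placeholders used throughout the thread:)

```sql
-- Both terminators spelled out explicitly, as described above.
BULK INSERT tblName
FROM 'filename'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'
);
```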
 
No problem. Glad you got it working.

-George
 