I'm trying to import a file containing 19 million records into a database.
Each record consists of 20 fields, each an nvarchar(20).
I have 115 GB of free space on the drive holding the data file, and 115 GB on the log drive.
The log and data are on separate drives.
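For reference, here is a rough back-of-the-envelope estimate of the raw data size, assuming nvarchar stores 2 bytes per character and every field is fully populated, and ignoring row, page, and index overhead:

    # Rough size estimate for the import.
    # Assumptions: nvarchar uses 2 bytes per character (UTF-16), all 20
    # characters of every field are used, and storage overhead is ignored.
    records = 19_000_000
    fields_per_record = 20
    chars_per_field = 20
    bytes_per_char = 2

    bytes_per_record = fields_per_record * chars_per_field * bytes_per_char
    total_bytes = records * bytes_per_record

    print(f"Bytes per record: {bytes_per_record}")                      # 800
    print(f"Estimated raw data size: {total_bytes / 1024**3:.1f} GB")   # ~14.2 GB

So the raw data should come to roughly 14 GB, well under the 115 GB of free space on the data drive.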
When I DTS the file into the database, it moves all 19 million records and then, at the very end of the transfer, fails with the error "Primary File Group full".
Note that it doesn't fail in the middle of the transfer, only at the very end.
I've been told the problem is tempdb: that it isn't expanding fast enough.
How do I get this to work?
mongr1l