cricketer1
Technical User
I have a huge fixed-length text file (1,500 bytes per record, over a million records). I parse and load this file into a first-level staging table, and from there into some normalised tables. The load into the staging table goes fine, but when I run my stored procedures to transfer data from staging into the other tables I hit performance problems. These procedures use cursors, and whenever I run them the transaction log balloons to over 5 GB; about an hour into the query I get a Windows message saying "due to constraint on system resources some records will not be committed".
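One pattern that tends to keep the transaction log under control is to replace the row-by-row cursor with a set-based transfer committed in batches, so the log only ever holds one batch's worth of uncommitted work. A minimal sketch, assuming SQL Server and hypothetical table and column names (`stage_table`, `target_table`, `cust_id`, `cust_name` are placeholders, not from the original post):

```sql
-- All table and column names below are illustrative placeholders.
-- Move rows from staging to the target in 50,000-row batches; each
-- batch is its own transaction, so the active log stays small.
DECLARE @batch INT = 50000;
DECLARE @rows  INT = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION;

    -- Set-based: delete one batch from staging and route the deleted
    -- rows straight into the target table, instead of a cursor loop.
    DELETE TOP (@batch) s
    OUTPUT deleted.cust_id, deleted.cust_name
    INTO dbo.target_table (cust_id, cust_name)
    FROM dbo.stage_table AS s;

    SET @rows = @@ROWCOUNT;  -- 0 when the staging table is empty

    COMMIT TRANSACTION;
    -- Under the FULL recovery model, a log backup between batches (or
    -- switching to SIMPLE for the load) keeps the log from growing.
END;
```

The same batching idea works with separate `INSERT ... SELECT` / `UPDATE` statements if the staging rows must be preserved; the key point is committing in chunks rather than holding a million rows in one transaction.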
My question is: what is the best way to handle uploading bulk loads of data? How can I make this process more efficient and less time-consuming, and what are the best ways to deal with cursors?
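For the file-load side of the question, `BULK INSERT` (or the `bcp` utility) with a `BATCHSIZE` is generally the most efficient route in SQL Server, since each batch commits separately and the load can be minimally logged under the SIMPLE or BULK_LOGGED recovery model. A hedged sketch; the file path, format file, and table name are placeholders:

```sql
-- All names and paths are illustrative placeholders.
-- BATCHSIZE commits every 100,000 rows, so a failure rolls back only
-- the current batch; TABLOCK allows minimal logging where eligible.
BULK INSERT dbo.stage_table
FROM 'C:\data\fixed_length_file.txt'
WITH (
    FORMATFILE = 'C:\data\fixed_length_file.fmt',  -- fixed-width column layout
    BATCHSIZE  = 100000,
    TABLOCK
);
```

A format file describing the fixed-width column offsets lets the server split each 1,500-byte record into columns during the load, avoiding a separate parsing pass.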
Thanks
cricketer1