I finished a project where all the information from a flat file is loaded into a staging database, and from staging it is inserted/updated into the production tables using stored procedures.
There were two options here, since the file was small:
1. Mass update and insert in a single transaction
2. Row-by-row processing using a cursor, committing the transaction at each row
We are using option 1 and it works fine with no issues and very little processing time, though the error handling is hard.
Now the file is changing, and the number of records coming in will be huge: around half a million records each day.
I am really confused. If I use the first option, it may cause locking and blocking, since it is a mass update in one single transaction.
If I choose option 2, the time taken to process that many records will be huge.
Is there any other way I can process these efficiently? Can I do it in batches, and if so, how? Please help me with this.
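To make the question concrete, this is roughly the kind of batching I have in mind: process the staging table a fixed number of rows at a time, committing after each batch so locks are released between batches. This is only a sketch I put together, not tested at the new volume, and all table and column names (dbo.Staging, dbo.Production, StagingId, BusinessKey, Amount, Processed) are placeholders for my real schema.

```sql
-- Sketch only: batch-wise upsert from staging to production,
-- committing each batch so no single transaction touches all 500k rows.
-- All object names below are placeholders.
DECLARE @BatchSize int = 10000;
DECLARE @Done bit = 0;
DECLARE @Batch TABLE (StagingId int PRIMARY KEY);

WHILE @Done = 0
BEGIN
    DELETE FROM @Batch;  -- clear the previous batch's keys

    -- Pick one batch of unprocessed staging rows
    INSERT INTO @Batch (StagingId)
    SELECT TOP (@BatchSize) StagingId
    FROM   dbo.Staging
    WHERE  Processed = 0
    ORDER BY StagingId;

    IF @@ROWCOUNT = 0
        SET @Done = 1;  -- nothing left to process
    ELSE
    BEGIN
        BEGIN TRANSACTION;

        -- Update production rows that already exist
        UPDATE p
        SET    p.Amount = s.Amount
        FROM   dbo.Production AS p
        JOIN   dbo.Staging    AS s ON s.BusinessKey = p.BusinessKey
        JOIN   @Batch         AS b ON b.StagingId   = s.StagingId;

        -- Insert staging rows that have no matching production row
        INSERT INTO dbo.Production (BusinessKey, Amount)
        SELECT s.BusinessKey, s.Amount
        FROM   dbo.Staging AS s
        JOIN   @Batch      AS b ON b.StagingId = s.StagingId
        WHERE  NOT EXISTS (SELECT 1
                           FROM   dbo.Production AS p
                           WHERE  p.BusinessKey = s.BusinessKey);

        -- Mark this batch as done
        UPDATE s
        SET    s.Processed = 1
        FROM   dbo.Staging AS s
        JOIN   @Batch      AS b ON b.StagingId = s.StagingId;

        COMMIT TRANSACTION;
    END;
END;
```

Would something along these lines avoid the blocking of option 1 without the per-row cost of option 2, or is there a better pattern for this volume?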