Dear All,
I currently have a problem requiring that I insert approximately 2 million rows into a database, one record at a time. I know that inserting a single row at a time is a very inefficient approach to this problem, given that such operations should be set-based; however, I do not have an alternative at this time.
I am currently inserting the data into a table that has an identity column with a clustered index on it. Would I see a performance gain if I removed the clustered index and then rebuilt it after all the data had been inserted?
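For reference, the approach I have in mind looks roughly like the sketch below. The table, index, and column names are placeholders, and it assumes the clustered index is the primary key on the identity column; I am not certain this is the right pattern, which is partly why I am asking.

```sql
-- Hypothetical names (dbo.MyTable, PK_MyTable, Id) for illustration only.
-- 1. Drop the clustered index (here assumed to back the primary key) before the load.
ALTER TABLE dbo.MyTable DROP CONSTRAINT PK_MyTable;

-- 2. Perform the ~2 million single-row inserts against the resulting heap.

-- 3. Recreate the clustered index once all rows are in.
ALTER TABLE dbo.MyTable
    ADD CONSTRAINT PK_MyTable PRIMARY KEY CLUSTERED (Id);
```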
Any assistance would be very much appreciated.
Many Thanks,
John