Hi all,
Yes, I know 'cursors are bad'...blah blah. But I have a set of data in a large table whose processing must happen in order: the logic for, say, record 100 absolutely depends on what record 99 was and on what records 100 and 101 are. (This is just an example; the real case is slightly more complex than that.)
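To make the shape of the dependency concrete, here's a toy version against a hypothetical dbo.SourceData table (made-up GroupKey, SeqNo, and Amount columns, assuming SeqNo is contiguous within a group -- not my real schema or rules). Each row's result depends on the row before it and the row after it:

-- Toy example only: classify each row based on its neighbours within a group.
SELECT  cur.GroupKey,
        cur.SeqNo,
        CASE
            WHEN prv.SeqNo IS NULL                   THEN 'FIRST'   -- no previous row
            WHEN cur.Amount > prv.Amount
             AND cur.Amount > ISNULL(nxt.Amount, 0)  THEN 'PEAK'    -- higher than both neighbours
            ELSE 'OTHER'
        END AS RowStatus
FROM       dbo.SourceData AS cur
LEFT JOIN  dbo.SourceData AS prv
       ON  prv.GroupKey = cur.GroupKey AND prv.SeqNo = cur.SeqNo - 1
LEFT JOIN  dbo.SourceData AS nxt
       ON  nxt.GroupKey = cur.GroupKey AND nxt.SeqNo = cur.SeqNo + 1;

(On a newer version you could presumably use LAG/LEAD instead of the self-joins, but the shape is the same.)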
So, I'm sure there's probably some guru who could come up with a single, cursor-free SQL statement, chock full of CASE and IF/AND/OR logic, that would do the real job but be completely unreadable by normal humans. (I applaud you; take your gold star; I am but a mere mortal.) The process I have requires top-down, procedural processing, and my question is this:
The set of 'source' data is about 40 million rows. The individual sets of interdependent rows can themselves be broken down into smaller sets of about 1,000 records each (based on field values, keys, etc.). So, is it more efficient to use a cursor and do the logic row by (yes...agonizing) row, or is the memory overhead, etc., of a cursor high enough that a 'hand-made' cursor is the better choice--i.e., load each set of data into a table variable (or temp table) with an identity key, then use a WHILE loop over that identity (or whatever ordered, sequential row ID I choose)?

I don't want to do the processing directly against the large table because of the repeated reads, checking the previous row, and so on--it just seems more efficient to break the work into smaller chunks, and that ripping back and forth through copied data in a table variable would beat reading that giant table back and forth.
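In case it helps to see what I mean by a 'hand-made' cursor, here's a rough sketch with the same made-up names (again, not my real schema, and the real decision logic is more involved), processing one ~1,000-row set at a time:

-- Copy one dependent set into a temp table with its own sequential row id.
DECLARE @CurrentGroup int;
SET @CurrentGroup = 42;          -- whichever set is being processed

CREATE TABLE #Chunk
(
    RowId    int IDENTITY(1,1) PRIMARY KEY,
    GroupKey int,
    SeqNo    int,
    Amount   money
);

INSERT INTO #Chunk (GroupKey, SeqNo, Amount)
SELECT GroupKey, SeqNo, Amount
FROM   dbo.SourceData
WHERE  GroupKey = @CurrentGroup
ORDER  BY SeqNo;                 -- IDENTITY assigns RowId in this order

-- Walk the chunk row by row, peeking at the previous and next rows.
DECLARE @i int, @max int, @Prev money, @Cur money, @Next money;
SET @i = 1;
SELECT @max = MAX(RowId) FROM #Chunk;

WHILE @i <= @max
BEGIN
    SET @Cur  = (SELECT Amount FROM #Chunk WHERE RowId = @i);
    SET @Prev = (SELECT Amount FROM #Chunk WHERE RowId = @i - 1);  -- NULL on the first row
    SET @Next = (SELECT Amount FROM #Chunk WHERE RowId = @i + 1);  -- NULL on the last row

    -- ...row-by-row decision logic would go here...

    SET @i = @i + 1;
END;

DROP TABLE #Chunk;

That's the general shape, anyway--the question is whether looping like that really buys anything over a straightforward cursor on the same chunk.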
Thanks very much,
--Jim