
Batch processing into SQL Server, with errors


SteveDingle (Programmer), Jul 26, 2004
Heya All,

I am currently running processes which insert/update/delete records into SQL Server 2K from VFP 9.

Since I was logging success/failure, I was committing each record as opposed to batch processing. The client is now saying the process is taking too long, so I was going to move to batch processing; however, I am not sure how this would affect my ability to capture the success/failure on an individual record basis. So I have a coupla questions:

1) Will batch processing improve the speed?

2) If I commit say 50 changes, and 1 or 5 fails, what will happen? Will I be able to get record details of which ones failed?

Toodles,
Steve Dingle
D&S Business Solutions Ltd
 
Hi Steve,

What exactly do you mean by "batch processing"? Are you talking about adjusting the batch count property within VFP? Or are you thinking of something like a prepared query (using SQLPREPARE())? Or do you mean that you simply want to send a whole bunch of commands in one SQLEXEC()?

Mike


__________________________________
Mike Lewis (Edinburgh, Scotland)

My Visual FoxPro site: www.ml-consult.co.uk
 
Heya Mike,

Hope all is going well.

>> What exactly do you mean by "batch processing"?

Well, ya got me there. :) To be honest, I think I was thinking there was some way I could process more than a single insert/update. Something like a series of SQLEXEC() commands, then after 50 or so send a SQLCOMMIT(). I suppose I coulda investigated more before posting, but figured ya'll would know right off and save me going in the wrong direction.

At the end of the day I need to process about 100K records each day, which will include updates/inserts/deletes into over 100 tables, and I need to get the most from the SQL Server side of things.

Thanks in advance for any thoughts

Toodles,
Steve Dingle
D&S Business Solutions Ltd
 
Hi Steve

>> To be honest I think I was thinking there was someway I could process more than a single insert/update

There are several options.

First, you will certainly improve your performance if you don't commit each insert as you do it.
To do this, you must set your connection to use "manual" transactions (using SQLSETPROP()). This will ensure that when you do an insert, the transaction remains open.

Then run your inserts as usual (I assume you have implicit transactions enabled, so the first insert will start a transaction). After a number (I usually do about 500), issue a SQLCOMMIT() to close the transaction. The next insert will start another transaction, and so on.

Experiment with the number of inserts before you commit and you will find that you can speed up your insert significantly.
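
In code, the shape of it might be something like this (just a sketch; the connection handle, cursor and table names here are invented for illustration):

Code:
*-- Sketch of the above: manual transactions, commit every 500 inserts.
SQLSETPROP( nCon, "Transactions", 2 )     && 2 = manual transactions

lnCount = 0
SELECT csrSource
SCAN
   lcInsert = "INSERT INTO MyTable (pk, descrip) " + ;
              "VALUES (?csrSource.pk, ?csrSource.descrip)"
   lnRes = SQLEXEC( nCon, lcInsert )
   IF lnRes < 1
      *-- error handling goes here (see the code below)
   ENDIF
   lnCount = lnCount + 1
   IF MOD( lnCount, 500 ) = 0
      SQLCOMMIT( nCon )   && close the open transaction; the next insert starts a new one
   ENDIF
ENDSCAN
SQLCOMMIT( nCon )                         && commit whatever is left over
SQLSETPROP( nCon, "Transactions", 1 )     && back to automatic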

As for error handling, check the result of the insert command and, if you get an error, roll back the current transaction. You can use code along these lines to handle this:

lnRes = SQLEXEC( nCon, lcInsert )
IF lnRes < 1
   AERROR( laErr )           && get the error details from the driver
   MESSAGEBOX( laErr[2] )    && laErr[2] holds the error message text
   SQLROLLBACK( nCon )       && undo the current (open) transaction
ENDIF

If this is still not fast enough, then you will need to look at doing a bulk insert. To do this you have to use the BCP utility and generate your data in a specially formatted text file. This is very fast, but it is very difficult to trap errors (it is a bulk process and either succeeds or fails). But I wouldn't expect you to need to go this route.
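
For reference, the shell-out to BCP might look roughly like this (a sketch only; it assumes the data has already been exported to a text file in a format BCP understands, and the server, database, table and file names are invented):

Code:
*-- -c = character mode, -S = server, -T = trusted connection (use -U/-P
*-- for a SQL login), -e = write rejected rows/errors to a file.
lcCmd = 'bcp MyDatabase.dbo.MyTable in c:\temp\mytable.txt ' + ;
        '-c -S MyServer -T -e c:\temp\bcp_errors.txt'
RUN &lcCmd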


----
Andy Kramek
Visual FoxPro MVP
 
Steve,

Sounds like you need SQLPREPARE(). Essentially, you send your INSERT, UPDATE or DELETE once, via SQLPREPARE(), using parameters for the variable information.

You then call SQLEXEC(), as many times as necessary, but you just pass the connection handle. Each call will tell the back end to process the same INSERT, UPDATE or DELETE, substituting the new parameter information.

Here's a very quick example:

Code:
*-- Prepare the parameterised UPDATE once, then execute it repeatedly,
*-- changing only the parameter value each time.
lc = 'UPDATE tours SET COST = 1000 ' + ;
     'WHERE catalog = ?up_cat'
? SQLPREPARE( nconn, lc )
up_cat = 'HIKEYO'
? SQLEXEC( nconn )
up_cat = 'CROCFL'
? SQLEXEC( nconn )

This is very much quicker than calling SQLEXEC() with the full command each time. However, each command is seen as a separate transaction, so it won't affect your ability to log individual failures or successes.
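
To tie that back to your logging requirement, something along these lines might work (a sketch only; the cursor csrChanges and the LogResult routine are invented for illustration):

Code:
*-- The prepared UPDATE executed once per record, logging each
*-- success/failure individually.
up_cat = ""                      && make sure the parameter variable exists
lc = 'UPDATE tours SET COST = 1000 ' + ;
     'WHERE catalog = ?up_cat'
SQLPREPARE( nconn, lc )

SELECT csrChanges
SCAN
   up_cat = csrChanges.catalog
   IF SQLEXEC( nconn ) < 1
      AERROR( laErr )
      LogResult( up_cat, .F., laErr[2] )   && which row failed, and why
   ELSE
      LogResult( up_cat, .T., "" )
   ENDIF
ENDSCAN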

Does that answer your question?

Mike


__________________________________
Mike Lewis (Edinburgh, Scotland)

My Visual FoxPro site: www.ml-consult.co.uk
 
Heya Andy,

>> you can use code along thesae lines to handle this:

Cool, for some reason I had thought that constraint/trigger failures only happened when you committed, dunno where I got that from.

As for the SQLROLLBACK(), since Help says "current transaction", I assume that if I have a problem with my 100th insert (before a commit) and I roll back, it will roll back everything and I'll have to reprocess the other 99.

Thanks!
 
Heya Mike,

Your suggestion might be very useful; I'll have to think about it. I would say the common portion of my UPDATEs would be "where pk field = ?pkValue", though the list of fields being updated and the values would be completely different each time.

Thanks!
 
>> As for the SQLROLLBACK(), since Help says "current transaction", I assume that if I have a problem with my 100th insert (before a commit) and I roll back, it will roll back everything and I'll have to reprocess the other 99.


Yes. It's a balance - the more records you commit in one transaction, the faster it goes, but the more re-processing you will need to do if something fails. Personally, I prefer to validate the data BEFORE sending it to the server - IOW, nothing should ever fail on the server because of bad input.
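
One possible compromise, sketched roughly here (the laBatch array and all names are invented for illustration): commit in batches for speed and, if a batch fails, roll it back and re-run just that batch one record at a time so the offending records can be identified and logged.

Code:
*-- "Retry the failed batch one record at a time", under manual transactions.
llOK = .T.
FOR lnI = 1 TO ALEN( laBatch )
   IF SQLEXEC( nCon, laBatch[lnI] ) < 1
      llOK = .F.
      EXIT
   ENDIF
ENDFOR

IF llOK
   SQLCOMMIT( nCon )
ELSE
   SQLROLLBACK( nCon )
   *-- Re-run the batch record by record, each in its own transaction,
   *-- so the individual failures can be identified and logged.
   FOR lnI = 1 TO ALEN( laBatch )
      IF SQLEXEC( nCon, laBatch[lnI] ) < 1
         AERROR( laErr )
         *-- log laBatch[lnI] and laErr[2] here
         SQLROLLBACK( nCon )
      ELSE
         SQLCOMMIT( nCon )
      ENDIF
   ENDFOR
ENDIF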

----
Andy Kramek
Visual FoxPro MVP
 
Thanks Andy,

>> Personally I prefer to validate the data
>> BEFORE sending it to the server

Agreed. Currently working on functionality to capture assorted rules/constraints out of SQL Server to pre-test, but I don't have it fully implemented yet, so... I do that for VFP data but am still learning the real nitty-gritty stuff of SS.
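
(For reference, the sort of query involved might be something along these lines - just a sketch; the connection handle and cursor name are assumptions:)

Code:
*-- Pull check constraints and their clauses out of SQL Server's
*-- INFORMATION_SCHEMA views into a VFP cursor for pre-validation.
lcSQL = "SELECT tc.TABLE_NAME, cc.CONSTRAINT_NAME, cc.CHECK_CLAUSE " + ;
        "FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc " + ;
        "JOIN INFORMATION_SCHEMA.CHECK_CONSTRAINTS cc " + ;
        "  ON tc.CONSTRAINT_NAME = cc.CONSTRAINT_NAME " + ;
        "WHERE tc.CONSTRAINT_TYPE = 'CHECK'"
IF SQLEXEC( nCon, lcSQL, "csrConstraints" ) > 0
   SELECT csrConstraints
   *-- feed these into the pre-validation routine
ENDIF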

Toodles,
Steve
 