cj92713696
Programmer
I've written a VB6 application that checks for the existence of simple ASCII-based transaction files in a specific hard-coded directory and parses each file it encounters. After a file is parsed, it is moved to a "processed" directory.
My ASCII-based files look something like this:
Start->TableName=Table1
"Field1", "Field2", "StoreCode"
1,"Jack",AIR
2,"Bill",AIR
3,"John",AIR
End->TableName=Table1
Start->TableName=Table2
"Field1", "Field2", "Field3"
1,"Jack","4.00"
2,"Bill","5.00"
End->TableName=Table2
... etc. I might have 25 tables' worth of data in one transaction file. After I read each table identifier to determine which table is being updated, I issue a delete request. This saves me the headache of having to calculate the changes to the table since the last update, because I can simply append the data read from the transaction file back to the SQL Server table.
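For reference, here is a minimal sketch of the delete-then-append step as I do it through ADO. The connection variable cn, the recordsAffected variable, and the literal table/field values are placeholders for illustration, not my actual code:

```vb
' Sketch of one per-table update, assuming an open ADODB.Connection
' named cn. Table and field names come from the Start->TableName= line.
Dim recordsAffected As Long

cn.BeginTrans                       ' group the delete and the appends

' Clear out the table's existing rows
cn.Execute "DELETE FROM Table1", recordsAffected, adExecuteNoRecords

' Append one parsed row back to the same table
cn.Execute "INSERT INTO Table1 (Field1, Field2, StoreCode) " & _
           "VALUES (1, 'Jack', 'AIR')", , adExecuteNoRecords

cn.CommitTrans                      ' both changes commit together
```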
My workaround: I had to insert a 10-second pause after issuing the delete request to each table to ensure the records were updated properly. Without this pause, it seemed that SQL Server wasn't finished deleting when I began appending new records to the same table.
My problem: I want to speed up this process, because as you can see, a 10-second delay after each table causes heavy delays when 50 transaction files are in the queue.
Is there a way (or a function) to determine whether SQL Server is ready to handle table appends? Or perhaps a way to determine that the last command issued has completed?
Thanks for your help,
CJ