Long SQL transactions in ADO client app not committing fully.

Ruairi

Programmer
May 14, 2000
I originally posted this question in the SQL Server forum, but as I do more testing the problem seems to be on the client end (an ADO application written in VB). I can't seem to make any headway tracking down this bug and am hoping someone has seen a similar situation. Below are the posts from the SQL forum. Through further testing I have discovered that if I step through the whole process in VB (Shift-F8) the problem does not occur. I am not stepping over any code, just executing each line one at a time in the original order. TIA for any help or thoughts.
----------------------------------------------------------
I am having problems with long transactions not committing fully. I am working on an accounting system for sawmills. One of the functions of this system is to import data that is sent in the form of text files from scaling bureaus. These files contain information about loads and logs (each log record refers back to a load record via a foreign key) that the mill receives from vendors. There are conversion tables in the system to relate text information (trucker names, logger names, species, etc.) to primary key values of the related tables. During this import there can be errors (the system cannot find a conversion for a piece of information and does not have a default available) or warnings (the system does not have a conversion, but can set the field to a default value).

The loads and logs are added to the DB inside a transaction. In the case of an error the entire transaction is automatically rolled back. If there are warnings the user can choose to roll back the transaction or commit it based on a report of the warnings. In the case of warnings but no errors there is always enough data to commit the transaction (all the foreign key values in the load and log tables relate to a valid record, although some may just be a default).

The problem is that SQL Server seems to lose some of the transaction. I will issue the commit command and not all the loads will be in the database. I know the loads were all imported during the transaction because, as they are imported, I run a stored procedure on each load to set its cost based on prices in the system. I am reading the connection's error object after each run of the stored procedure and printing any errors to the error/warning report, so I know there are no errors occurring at the server as these loads are added.

I started with a bureau file containing 88 loads. Whenever I would go through the process and choose to commit the loads, only the last 15 would be in the database. I then cut and pasted these loads into 6 separate text files. The first had 14 loads in it and all 14 were imported. The next had 19 loads, but after committing it to the database only 17 of these showed up. I cut the remaining 2 out of the text file and pasted them into their own file. After processing this file the 2 loads showed up. That tells me that there is not actually a problem with the data, or the import would have failed when these 2 loads were imported alone.

Am I missing some setting or settings on the server that would allow these transactions to fully commit? Any help would be greatly appreciated. I have been at this a long time trying to figure out what's wrong, but everything I see points to SQL Server 'losing' part of the transactions. TIA
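For reference, the import is structured roughly like the sketch below. This is a simplified outline, not the real code: the DSN, the procedure name, and the helper routines (ParseNextLoad, ImportHadErrors, UserWantsToCommit) are placeholders standing in for the actual application logic.

' Simplified sketch of the import loop; names marked as placeholders are invented
Dim cn As ADODB.Connection
Dim errSQL As ADODB.Error
Dim lngLoadRec As Long

Set cn = New ADODB.Connection
cn.Open "DSN=SawmillDSN"              ' placeholder ODBC DSN pointing at SQL Server 6.5

cn.BeginTrans                         ' one transaction for the whole bureau file

Do While ParseNextLoad()              ' placeholder: reads the next load and its logs from the file
    ' ... INSERT the load row and its log rows here ...

    ' price each load as it is imported (placeholder procedure and key names)
    cn.Execute "EXEC usp_SetLoadCost " & lngLoadRec

    ' read the connection's Errors collection after each call and report anything found
    For Each errSQL In cn.Errors
        Debug.Print errSQL.Number & " - " & errSQL.Description
    Next
    cn.Errors.Clear
Loop

' the only commit/rollback in the whole process, outside the loop
If ImportHadErrors() Or Not UserWantsToCommit() Then   ' placeholders for the error/warning report
    cn.RollbackTrans
Else
    cn.CommitTrans
End If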
Ruairi
-----------------------------------------------------------
You refer to loads and logs. Do these come in separate files? You indicate that the logs relate to the loads via a foreign key. Is there a one-to-one or one-to-many relationship?

You also say that SQL fails to import some of the loads. Does it also fail to import some of the logs or all of the logs related to the loads not imported?

Is it possible that something was missing in one or more tables during the 1st import and that prevented the records from loading? And then perhaps the necessary piece was added during the 1st import, which is the reason the 2nd import succeeded?

This is really a stretch. Is there any additional info that could help narrow the problem?
Terry
-----------------------------------------------------------
Terry,
Thanks for your response. All logs are imported for the loads that are imported; no logs are imported for loads that aren't imported. There is a one-to-many relationship between loads and logs. Think of a load as a logging truck driving into the mill yard. There can be many logs in that load. They don't have to worry about weighing or measuring the logs on the truck because they already received this text file that tells them who sold them those logs, who logged them, and the species, grade, length, diameter, etc. of each log. A log cannot exist in the database without referencing a valid load. There is one text file, laid out like this:

1
4
6
7
7
7
7
The 1, 4, and 6 lines carry information for the load; each 7 line carries information for one log.
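A minimal sketch of walking a file laid out that way, branching on the record-type digit (file handling only; the actual field parsing is omitted, and the file path and variable names are made up):

Dim strBureauFile As String
Dim strLine As String
strBureauFile = "C:\bureau\loads.txt"     ' placeholder path

Open strBureauFile For Input As #1
Do While Not EOF(1)
    Line Input #1, strLine
    Select Case Left$(strLine, 1)
        Case "1", "4", "6"
            ' load header lines: accumulate fields for the current load
        Case "7"
            ' log detail line: one log belonging to the most recent load
        Case Else
            ' unrecognised record type: write a warning to the report
    End Select
Loop
Close #1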
There is no possibility that any of the data changed between the import attempts, with the exception of the loads and logs that imported successfully. As I said, I am reading the connection's error object throughout this entire process. If there were errors returned during the process they would be printed on the error report.

The only information I can think of that I haven't already stated is that the identity column is incrementing the correct number of times. For example, if the last value of 'rec' (the identity in the loads table) was 219,000 and I process 15 loads, only the last one may show up in the database, but that load's 'rec' value will be 219,015. That tells me that each load was added to the database during the transaction.

One other thing I noticed is that if I monitor the number of locks in Performance Monitor, it grows at a rate of approximately 100 locks for each load being imported. At some seemingly arbitrary point during the transaction the locks will drop to 0, then start building again. This drop seems to coincide with the part of the transaction that is 'lost'. I have checked and double-checked, and the only place I issue a rollback or commit is at the very end of the process, outside of the loop. I also ran it with breakpoints at the commit and rollback; they were never reached until the entire process was finished. In my tests the number of locks has never been anywhere near the configured limit.

Thanks again for your input, I hope for some more.
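One way to make that gap visible after a commit is to count the committed rows inside the identity range the import used, for example (using the 219,000/219,015 example above; the table name 'loads' and column name 'rec' come from the description, everything else is an assumption about the schema, reusing the connection cn from the sketch earlier):

' How many of the 15 'rec' values handed out during the import actually committed?
Dim rs As ADODB.Recordset
Set rs = cn.Execute("SELECT COUNT(*) FROM loads WHERE rec BETWEEN 219001 AND 219015")
Debug.Print rs.Fields(0).Value & " of 15 loads are in the table"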
Ruairi



What software do you use (DB, OS, MDAC)?
What transaction techniques do you use? The built-in methods of the connection object, or MS Transaction Server?

Does this occur every time you process the data, and if so, is the error (number of logs that aren't imported, type of logs, etc.) constant?

Have you put a breakpoint in the error handler?

-Mats
 
Mats,
Thanks for your response. I am using SQL Server 6.5 with an ODBC connection to a VB client that uses ADO. I am using the .BeginTrans, .CommitTrans and .RollbackTrans methods of the ADO connection. This does not occur every time I process data; in fact, the same data may fail to process one time and successfully process the next. I am reading the error object of the connection after each stored procedure is executed, and there are no errors occurring.

If I have a file of, say, 15 loads and I process it, the first 6 might be missing from the DB after the transaction commits. I can then cut these 6 loads and their associated logs out of the text file and paste them into a new file. Now I process that file and those 6 loads are successfully imported. So this means there is nothing wrong with the data, or it would never import successfully. Nothing about this problem seems to be constant.

By watching the number of locks using SQL Performance Monitor I can see that during the transaction the number of locks will sometimes drop to zero. Any loads that were part of the transaction before SQL dropped the locks are not committed to the DB. I don't know exactly what that tells me. The number of locks may be at 50 or it may be at 6,000 when this occurs; again, there is no constant. I have SQL configured for 10,000 locks, but even if the problem was that it was running out of locks I would see that error message. I'm still scratching my head. Any insight you can provide would be helpful.

Ruairi
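If the locks drop to zero mid-transaction, one check that can be run from the same connection is @@TRANCOUNT: if it ever comes back 0 before the final commit, something on the server side has ended the transaction that BeginTrans opened. A diagnostic sketch only (reusing the connection cn from the earlier sketch):

' Diagnostic only: check the server-side transaction depth during the import loop
Dim rsTran As ADODB.Recordset
Set rsTran = cn.Execute("SELECT @@TRANCOUNT")
If rsTran.Fields(0).Value = 0 Then
    Debug.Print "Transaction is no longer open on the server!"
End If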

What version of MDAC are you using?

Can't say I'm very knowledgeable in SQL 6.5, but what happens if you run the stored procedures as standard SQL queries?
This could be a problem with running stored procedures (which run on the server) while controlling the transactions via the connection object (which works through the provider).

-Mats
 
I last downloaded an MDAC update about 2 months ago. As for running the SPs as queries, that isn't an option; there is a lot of complex logic and conditional execution in the stored procedures.
Ruairi

Then I suggest you migrate a sample object to Transaction Server. The transaction handling there should be superior to ADO's built-in transaction handling.

-Mats
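For what it's worth, moving the import into an MTS component would look roughly like the sketch below: a VB class compiled into an MTS package with its transaction property set to require a transaction, voting with SetComplete/SetAbort instead of calling CommitTrans/RollbackTrans on the connection. The class and method names here are invented.

' Class module ImportComponent.cls, MTSTransactionMode = RequiresTransaction
' Requires a reference to the Microsoft Transaction Server Type Library (MTxAS)
Public Sub ImportBureauFile(ByVal strFile As String)
    Dim ctx As ObjectContext
    Set ctx = GetObjectContext()
    On Error GoTo Failed

    ' ... open the connection, parse the file, insert loads/logs, run the costing proc ...

    ctx.SetComplete                   ' vote to commit the MTS transaction
    Exit Sub
Failed:
    ctx.SetAbort                      ' vote to roll it back
    Err.Raise Err.Number, , Err.Description
End Sub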
 