Hi Rob;
O.K. ... here goes...
I've written this application over the last year or so at the request of BlueCross BlueShield of Tennessee. My original app was written to run on a single machine: I started the process, went to my PDA, input data, then came back to the machine and printed out the forms. As long as I was the only user, there were no problems with either timing or concurrency. BCBS liked it so much that they suggested I market it. But honestly, while it may have worked for them, it did not work the way I wanted it to. So before going any further with it, I changed the structure.
The current structure uses one machine -- Windows XP, at the front desk -- as a 'server' for my PDA, while my receptionist uses that same machine as a 'client' to both print and input data. In addition to the machine at the front desk, there are at least two other 'client' machines on the network.
I've rewritten the DataEnvironment of my main app so that it uses the database on the server machine directly (done through the cursors in the DataEnvironment of the main form). I have not yet gone to views, as they would appear to incur both more programming and more machine cycles. Having all the machines use the same database on the server has made things very easy, quick, and reliable.
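To give you an idea, each client just opens the shared tables on the server over the network; stripped down, it amounts to something like this (the paths and table names are simplified for illustration, not the actual code):

    * Sketch only: each client opens the server's tables shared
    SET EXCLUSIVE OFF                 && nothing opened exclusively
    SET REPROCESS TO 5 SECONDS        && retry locks instead of failing at once
    USE \\FRONTDESK\APPDATA\PATIENTS.DBF SHARED ALIAS patients IN 0
    USE \\FRONTDESK\APPDATA\VISITS.DBF   SHARED ALIAS visits   IN 0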
However, since data is now being input into the server at random times from the client machines, I constructed three job queues to handle different problems. My initial design had the server polling the client machines on a timed basis for data updates. I found that to be relatively slow and inefficient. So the next design opened the job queues on the server as shared and let the applications on the client machines both update the database directly and notify the 'server' by way of the job queues when new data was ready.
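The queues themselves are nothing fancy: in essence a client appends a row to a shared queue table after it writes its data, and the server works through anything not yet flagged as done. Roughly along these lines (the table, field, procedure, and variable names here are made up for illustration):

    * Client side: post a job after the data has been written (sketch)
    USE \\FRONTDESK\APPDATA\JOBQUE.DBF SHARED ALIAS jobque IN 0
    SELECT jobque
    INSERT INTO jobque (jobtype, patid, added, done) ;
        VALUES ("PRINTRX", m.lnPatId, DATETIME(), .F.)

    * Server side: work through anything pending
    SELECT jobque
    SCAN FOR !done
        DO ProcessJob WITH jobque.jobtype, jobque.patid
        IF RLOCK()                    && lock the row before flagging it done
            REPLACE done WITH .T.
            UNLOCK
        ENDIF
    ENDSCAN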
In addition to the client machines inputting data to the database and the job queues, my PDA (via wireless 802.11b) is serviced by a separate stay-resident program on the 'server', such that when I transmit record changes or additions from the PDA to the server, the data is moved into specific files -- not part of the database -- on the server.
When all is said and done, the single 'timer' event (which I actually modified from cppTimer (freeware)) then reads the PDA files for changes, moves the data from the PDA into the database... and also checks the job queues for pending data to be moved to the PDA. One function which is needed immediately is prescription printouts, since this has to be done in real time and can't wait until the end of the day or any other time. It must be done while the patient is still here, and it must be 100% perfect 100% of the time.
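For what it's worth, the timer is essentially a Timer subclass whose Timer event calls the routines in sequence; boiled down, it looks something like this (the procedure names and interval are made up for illustration, not the real ones):

    * Bare-bones sketch of the server-side timer
    DEFINE CLASS SyncTimer AS Timer
        Interval = 2000               && fire every couple of seconds

        PROCEDURE Timer
            THIS.Enabled = .F.        && don't re-enter while we're working
            DO ImportFromPdaFiles     && pull new/changed records out of the PDA files
            DO ProcessJobQueues       && handle pending jobs, incl. prescription printing
            DO ExportToPdaFiles       && stage data going back to the PDA
            THIS.Enabled = .T.
        ENDPROC
    ENDDEFINE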
Hence, in a nutshell, the timer event is needed to force the system to read input from the PDA files and to update the PDA files for transfer back to the PDA. On top of all the other problems, the VFP app runs over a thousand times faster than the file sync does from the PDA, and it seems (I don't have access to the source code) that the PDA is using a record-lock scheme via ODBC. The result is that I can't figure out when the PDA is syncing with the server, so VFP can (and does) read from and write to the PDA files WHILE the PDA is syncing. That produces totally messed-up indices and lost data. I have figured out ways around most of it, but I clearly still have problems whenever VFP is working with the same file the PDA is syncing... but that is a different story. The report file in question here does not use the PDA files at all.
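Just to illustrate the kind of guard involved: the timer can attempt a file lock on a PDA transfer file and simply skip that pass if the sync already has it. This is only a sketch, and it assumes the transfer files are plain DBFs opened under a made-up alias:

    * Sketch of one possible guard: back off if the sync holds the file
    SELECT pdatrans                   && the PDA transfer table (name made up)
    IF FLOCK()                        && try to lock the whole file
        DO ImportFromPdaFiles         && safe to read/update it this pass
        UNLOCK
    ELSE
        * the PDA sync presumably has it -- wait for the next timer tick
    ENDIF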
I can't think of a better way to force the 'server' machine to update or be updated by the pda.
If you know of a better trigger, I'm open to suggestions.
As it is, I'm getting about 97% error-free run-time... I won't be happy with anything less than 100%.
If I haven't totally bored you by now, you get a medal!
I hope I've explained the problem,
Thanks again,
Alan