
Large VFP and SQL app Best Practice issue (Over 75 remote sites)

CFPR (MIS)
May 9, 2003
Please be aware that this is not a purely how-to technical question but a system architecture/design one.

I am building a fairly large VFP 8.0 and SQL 2000 app. The app will be accessed by over 500 users, of which about 450 are remote. There are about 75 remote sites. Each remote site will have a server running the latest version of the app. This is necessary as there are issues with the communication lines and some of the users will have to use dial-up. Most will be on a frame-relay type 512k line. The transactions are not monetary.

About 15 to 25 times per day, each remote site has to transmit a 'waiting to be approved or assigned' record to the central office. At the central office, the operator will need to see the record together with its supporting child tables, images, and so on. If approved, the central office will assign a record number to the record in the queue and transmit the decision back to the remote site: the record number, who approved it, etc.

My questions and issues are the following.
1) What is the best way to send the records from the remote site to the central site? Should I connect directly to the central site's DB and append a new record, with all its child records, to a 'duplicate' DB structure? Or should I just send a text file through FTP and let a form with a timer at the central site handle it?

2) I need to be 100% positive that the record and images that were transmitted were received properly at the central server, and vice versa: that the answer was transmitted and received exactly on the remote site. I have never used SQL Server transactions, and I am not sure if I should or need to use them. Any comments?

Thanks!

MW


 
I have done some work on this type of problem.

Is all the information on the central site anyway? Is SQL Server being used to replicate the data to all the sites?

If the answer to the above is yes, then transmit a reference to the data that is already on the central site.

Otherwise it gets more complicated!

To be 100% sure that all the data is transmitted, you will need to calculate some sort of checksum.

I would suggest writing a function that converts all of the data for that record to a string, then calculates the length. The images can be dealt with using FILETOSTR().

Transmit the data, then create a function to check the incoming data. Sounds tedious, but it won't take too long.
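A minimal sketch of that idea, in Python for illustration (the real code here would be VFP): a length check works, but a hash over the same string is a stronger variant of the same approach. The field names and image paths are hypothetical; the read of raw image bytes is the analogue of FILETOSTR().

```python
import hashlib

def package_digest(record: dict, image_paths: list) -> str:
    """Concatenate the record's field values and raw image bytes,
    then return a digest that sender and receiver can compare."""
    h = hashlib.md5()
    # Sort keys so both ends serialize the fields in the same order.
    for key in sorted(record):
        h.update(str(record[key]).encode("utf-8"))
    for path in image_paths:
        with open(path, "rb") as f:   # FILETOSTR() analogue: raw bytes
            h.update(f.read())
    return h.hexdigest()
```

The sender transmits the digest alongside the data; the receiver recomputes it over what arrived and rejects the package on a mismatch.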


Richard



 
Hi clarkrj. Thanks for the reply. Just to follow up.

Each remote site works on its own. They just need to connect to the central office to 'update' and wait for an approval. I would say that 95% of the communication is from the remote site to the central office, with the other 5% going back.

My biggest concern is that the lines go down and the system cannot connect for an update.
What I have done so far is create a routine that tries to establish a connection to the central site first by opening a DBC and trapping the error. If successful, it's then simple: I just need to open 6 remote views in the central office and insert anywhere from 6 to 10 records max. It should be quick, as the lines are 512k frame-relay. That should take about 1-2 seconds at most, I hope... :)
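That try-connect-then-queue pattern can be sketched like this (Python for illustration; `connect` and `insert` are hypothetical stand-ins for opening the central DBC and the remote-view inserts, and the retry pass over the queued file is left out):

```python
import json
import os

PENDING = "pending_uploads.json"  # local spool for records that couldn't be sent

def send_or_queue(records, connect, insert):
    """Try to reach the central office; on failure, append the
    records to a local queue file so a later run can retry them."""
    try:
        conn = connect()  # analogue of opening the central site's DBC
    except OSError:
        queued = []
        if os.path.exists(PENDING):
            with open(PENDING) as f:
                queued = json.load(f)
        with open(PENDING, "w") as f:
            json.dump(queued + records, f)
        return False
    for rec in records:
        insert(conn, rec)  # analogue of inserting via the 6 remote views
    return True
```

The point is that the error trap does not just give up: the pending records survive the outage and go out on the next successful connection.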

Anyway, I have been considering SQL replication but have not had time to look into its functionality. Can you get me started with some pointers on this?


 
Hi,
SQL replication is a black art.

Things that you will need to consider:

Decide who originates what information on a record. When I was looking at this, the remote sites originated most of the information. If a record has to be authorised, then that field would be "owned" by the centre.

You will need to add fields to say how many records in related tables, or pictures, make up a complete record. This can be checked before the centre opens the master record.
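A sketch of that completeness check (Python for illustration; the column names are hypothetical, and the real check would be a query against the replicated tables): the master record carries the expected counts, and the centre compares them with what replication has actually delivered before showing the record to an operator.

```python
def record_is_complete(master: dict, child_rows: list, image_rows: list) -> bool:
    """Compare the counts stamped on the master record against
    the rows that have actually arrived so far via replication."""
    return (len(child_rows) == master["child_count"]
            and len(image_rows) == master["image_count"])
```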

The frequency of replication will need to be "tuned". Although there is quite a lot of thought and planning involved, I strongly suggest that you hire a SQL Server guy to set up the replication; you will have quite a lot of work reprogramming the VFP side.

As I remember, the major problem is the direction of data transfer. It does have major advantages, though, as all the sites will have copies of all the data. Backups become almost irrelevant.

We had problems with remote sites doing backups (or not).


Rich
 
I think you should use the VFP client, and (in my opinion) the VFP back end (SQL if you must), but use Terminal Server to host the remote users. VNC if you are strapped for cash, PCAnywhere if you are only slightly short of cash.



Regards

Griff
Keep [Smile]ing
 
Well, actually, I would use SQL for darn near everything.
There are better and more numerous choices using SQL for remote stuff.
When you code anything distributed in VFP, you have to (re)invent the separate/join protocol. This is a time waster.
SQL has 3 forms of replication.
Install MSDE 2000 on each server.
It's free to distribute with VFP!
That's right, 75 copies, one per site, all free!
Then read up on replication.
I would recommend Merge, but you decide that.

Or you can use a combo of FTP/text files and DTS/BCP/Bulk Insert.
But this isn't too great for images.
In fact, shouldn't your images only be uploaded after approval?

I think you will find the flexibility of DTS worth your while.
I designed the same kind of ETL consolidation in VFP in 1999, then learned how much more powerful DTS is.
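For the FTP/text-file route, the remote side would dump each pending record to a delimited file that BCP or BULK INSERT can load at the centre. A minimal sketch in Python (the column order and pipe delimiter are assumptions; they would have to match the format file on the SQL Server side):

```python
import csv

# Hypothetical column order; must match the BULK INSERT format file.
COLUMNS = ["site_id", "record_id", "status", "submitted_at"]

def write_bulk_file(records: list, path: str) -> None:
    """Write pending records as pipe-delimited text, one row per
    record, in a fixed column order for bulk loading at the centre."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f, delimiter="|")
        for rec in records:
            writer.writerow([rec[c] for c in COLUMNS])
```

The file would then be pushed over FTP, and a scheduled DTS package (or a timer form, as the original poster suggested) would pick it up and load it.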

Then we can list the benefits of SQL:
1) DTS
2) XML
3) Bitmapped indexes, like Rushmore
4) Huge table sizes
5) Task Scheduler
6) Security
7) Compiled stored procedures
8) Indexed views
9) A real query processor

Fun fact: SQL 2000 wouldn't have bitmap indexes without VFP.
Fun fact: SQL 2000 is a better server than VFP.

It's worth it. Get the SQL.
Use VFP for what it's good at: forms, reports, back-end connections.

 
Hello bytehd. Thanks for your reply.

I agree with you that SQL is the way to go. I have spent a good amount of time going over the benefits and came up with a very similar list to yours.

One of the things that did worry me about VFP is that I once wrote an app that managed about 50k records over a wireless LAN (building-to-building bridge). When the communication broke down, so did the indexes. The boss and the owners weren't very happy. I wonder why?? Besides, it's almost done and I can't go back :)...

Just a couple of points.
1) The images have to be sent as part of the record 'package', as they are required for the approval process. (Fingerprints!) Luckily they are under 350k.

2) Thanks for the note that MSDE is free. Luckily for us, licensing is not an issue, so it's SQL 2000 all the way.

3) I have not done any 'real' work with DTS or replication, so I will read up on them. Why Merge, versus what?

4) Finally, will replication do full synchronization? I just learned that some of the central office transactions have to be synchronized back to the remote sites. The central office may add to or edit some of the info received. Keeps getting complicated...

Thanks

MW
 
DTS is a very powerful data-importing tool.
Scriptable and programmable out the wazoo.
Usable for cleaning up data as it comes in.
Uses VBScript.
I use it to clean up bad date fields from a Sybase ASA.

Merge replication: copies are all made equal, eventually.
Snapshot replication: click, it's refreshed, but on a schedule.
Transactional replication: it's always in sync, everywhere, at all times.

Now, within these modes, you CAN force an update.

DTS and replication are their own universes, but once you learn them, they are very powerful tools in your box.

Replication takes your idea of in-sync data and lets the DB servers keep themselves talking.
Once it's set up... you just have to monitor it.

Merge is recommended over WAN links because it picks up again when links go down.

Major replication caveat: you will need a GUID field in certain replicated tables in order for the sync to take place. Don't replicate major, in-use tables, because the GUID field will clash with your PKs or identity fields.

That's why I replicated small "transaction" tables, which spilled into the main tables after they were approved, etc.
And all this can be done in a mesh:
Replication can kick off a stored procedure.
A DTS task can kick off an SP.
An SP can force a replication refresh.

It's quite powerful, actually.

One more thing:
you are right about VFP indexes getting hammered over WANs.

:)
George
 
Thanks George. This is the way to go.

MW
 
Get the Wrox red book on DTS.
Replication is covered in just about any SQL Server book.
Good luck and have fun!
 