ralphcarter
Programmer
I have a FoxPro application that opens 200 tables. About 1/3 of the tables are read and all of their records are written out to a report. The other 2/3 of the tables are referenced only for certain match conditions. Processes that run in 20 seconds on a standalone machine are taking 3 minutes over the network. There are fewer than 5000 records in any of the tables. We chose not to use a database container, just free tables.
My question is: what are the general ways I can use to speed up access to the data?
Yes - we do use LOCATE in different situations where we are trying to find records that match conditions on fields we have not identified as indexes.
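
To make that concrete, here is roughly the pattern we use (the table and field names are made up for illustration):

USE customers SHARED
* cust_note has no index, so this LOCATE cannot be
* Rushmore-optimized and scans every record over the network:
LOCATE FOR UPPER(cust_note) = "OVERDUE"
IF FOUND()
   * ... process the matching record
ENDIF

My understanding is that if we added an index with a matching key expression, e.g. INDEX ON UPPER(cust_note) TAG note, the same LOCATE could be Rushmore-optimized, but I would like confirmation on that.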
Would using the SET FILTER command slow things down?
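
For instance, would something like the first form below be slower over the network than pulling the matches into a local cursor once? (Again, the names are invented for illustration.)

* Filtered navigation - the filter expression is re-evaluated
* on every record movement (GO TOP, SKIP, etc.):
SET FILTER TO state = "TX"
GO TOP

* Alternative: query the matching rows once into a local cursor:
SELECT * FROM customers WHERE state = "TX" INTO CURSOR csrTx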
What else should I look for to speed things up?
What are some general optimization tricks?
I know this is an ambiguous description of the problem, and I can narrow it down by answering questions. I will admit to some ignorance in this situation.
Thanks
Ralph