
Theoretical Question 1


georgesOne (Technical User), Jul 2, 2004
Hi everybody,

I have a table A which includes, among other fields, a date field and a hierarchy field.
I want to add new records to this table A. They are generated dynamically from the input of a form, under the following conditions:

1. ADD THE NEW RECORD ONLY if table A does not contain a record with a higher hierarchy within a certain (actually varying) time range.
2. If the record is added, then check table A for records with a lower hierarchy within that specific time range and DELETE THOSE RECORDS.
3. Table A is a backend table on the server, so I would like to minimize network traffic.
4. Table A is pretty large, so I need to restrict the relevant records with a query, which unfortunately is not updatable and will yield on average approximately 250 - 500 records.
5. The new data amount to about 50 - 100 records.

My idea was to first add all the new records to table A and then run through the records of the query, identify the matching records and, where appropriate, delete the lower-hierarchy match in table A. However, it runs slowly.
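For reference, the per-record logic behind conditions 1 and 2 looks roughly like the sketch below (all table, query and field names are placeholders, and the +/- 1 day window just stands in for the varying time range); each pass through the loop fires its own lookup and action queries against the backend, which is presumably why it crawls:

    Dim db As DAO.Database
    Dim rsNew As DAO.Recordset
    Dim lo As Date, hi As Date

    Set db = CurrentDb
    Set rsNew = db.OpenRecordset("tblNew", dbOpenSnapshot)   ' the form-generated new rows

    Do While Not rsNew.EOF
        lo = rsNew!EventDate - 1    ' placeholder for the varying time range
        hi = rsNew!EventDate + 1

        ' 1. Only proceed if no higher-hierarchy record exists in that range.
        If DCount("*", "qryRelevant", _
                  "HierLevel > " & rsNew!HierLevel & _
                  " AND EventDate BETWEEN #" & Format(lo, "yyyy-mm-dd") & "# AND #" & Format(hi, "yyyy-mm-dd") & "#") = 0 Then

            ' Add the new record to TableA ...
            db.Execute "INSERT INTO TableA (EventDate, HierLevel) " & _
                       "SELECT EventDate, HierLevel FROM tblNew WHERE ID = " & rsNew!ID, dbFailOnError

            ' 2. ... and delete the lower-hierarchy records in the same range.
            db.Execute "DELETE FROM TableA WHERE HierLevel < " & rsNew!HierLevel & _
                       " AND EventDate BETWEEN #" & Format(lo, "yyyy-mm-dd") & "# AND #" & Format(hi, "yyyy-mm-dd") & "#", dbFailOnError
        End If
        rsNew.MoveNext
    Loop
    rsNew.Close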

Now I think it might be better not to add the new records right away, but instead to construct two local 'helper' tables: one containing the new records, the other containing the relevant existing records from table A. I would then compare them, mark the appropriate deletions, and finally delete all matching records in table A and add the new data remaining in one of the helper tables.
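In outline, I imagine the helper-table variant working something like this (again all names are placeholders; Skip and MarkDel are Yes/No flag columns I would add to the local copies):

    Dim db As DAO.Database
    Set db = CurrentDb

    ' One read across the network: copy the relevant existing rows locally.
    db.Execute "SELECT * INTO tblOldLocal FROM qryRelevant;", dbFailOnError

    ' ... compare tblNewLocal against tblOldLocal here and set the Skip / MarkDel flags ...

    ' Then only two statements touch the backend: one delete, one append.
    db.Execute "DELETE FROM TableA " & _
               "WHERE ID IN (SELECT ID FROM tblOldLocal WHERE MarkDel = True);", dbFailOnError
    db.Execute "INSERT INTO TableA (EventDate, HierLevel) " & _
               "SELECT EventDate, HierLevel FROM tblNewLocal WHERE Skip = False;", dbFailOnError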

However, I wonder whether I will actually get better performance, and whether there isn't a better way altogether...
Any good ideas?

Thanks, Georges
 
You mentioned a query that was not updatable. You may want to read:
Harnessing the Power of Updatable Queries

Network issues: the Jet database you're using is a file-server type database engine. It simply stores and retrieves data, so ALL of TableA must be transferred across the network to your workstation to be worked on by the query.
Obviously, if this is a large table, say 1 GB, that'll take some time. Also, you didn't say how many people are hitting the database over the network, which is another factor.

You may want to look at using the MSDE engine (a mini SQL Server engine), which is a client/server engine. The "query" request is sent to the server, processed there, and only the result is sent back, which cuts down on network traffic.
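As a rough illustration of the difference, with a server engine you could send the whole delete across as a pass-through query, so only the SQL text travels over the wire (the connect string and SQL below are only placeholders, not a drop-in for your setup):

    Dim db As DAO.Database
    Dim qdf As DAO.QueryDef

    Set db = CurrentDb
    Set qdf = db.CreateQueryDef("")   ' temporary, unsaved QueryDef
    qdf.Connect = "ODBC;DRIVER=SQL Server;SERVER=MyServer;DATABASE=MyDb;Trusted_Connection=Yes;"
    qdf.ReturnsRecords = False
    qdf.SQL = "DELETE FROM TableA WHERE HierLevel < 3"   ' runs entirely on the server
    qdf.Execute dbFailOnError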

Your problem seems to be at the workstation.
First thing I'd do is to see if your tables are normalized.

What are the physical attributes of your computer - RAM size, cache size, defragmented storage, etc.?

There are some Jet registry settings you can play with, usually found at HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Jet 4.0
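You can also override those settings for the current session from code instead of editing the registry; the values here are just examples to experiment with, not recommendations:

    ' Session-only overrides of the Jet registry settings (DAO 3.6 / Jet 4.0)
    DBEngine.SetOption dbMaxBufferSize, 4096      ' Jet read/write buffer, in KB
    DBEngine.SetOption dbMaxLocksPerFile, 50000   ' raises the lock limit for big batch operations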

New queries must be compiled by Access, which takes time. Compile them first.
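One way to take that compile hit only once is to save the statement as a parameterised query and reuse it (names are placeholders again):

    Dim db As DAO.Database
    Dim qdf As DAO.QueryDef

    Set db = CurrentDb
    Set qdf = db.CreateQueryDef("qryDeleteLower", _
        "PARAMETERS pHier LONG, pLo DATETIME, pHi DATETIME; " & _
        "DELETE FROM TableA WHERE HierLevel < pHier AND EventDate BETWEEN pLo AND pHi;")

    ' Later, each call reuses the saved query:
    qdf.Parameters("pHier") = 3
    qdf.Parameters("pLo") = #7/1/2004#
    qdf.Parameters("pHi") = #7/3/2004#
    qdf.Execute dbFailOnError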

And, as you're suggesting, actually try the different ways of doing the work you want done. Unless it's actually tried, no one can say which way is quicker or more efficient.
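A crude but effective way to settle it is to time each version, for example:

    Dim t As Single
    t = Timer                                   ' seconds since midnight
    ' ... run one version of the routine here ...
    Debug.Print "Elapsed: " & Format(Timer - t, "0.00") & " seconds"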

But if your tables are not normalized, then there's no point in moving on to anything else.
 
Hi fneily,

thanks for the valuable input.
Sometimes a question becomes clearer while you are formulating it... that's what happened here.
I went with the (slightly modified) first solution and it works reasonably well.
The tables are normalized, maybe even somewhat 'over-normalized', which had led me to write some unnecessary code that I could now get rid of.

Thanks again, Georges
 