
Query runs slow, run locally??

Status
Not open for further replies.

cppiston (Programmer)
Feb 27, 2002
I have an Access DB run off of the server. All users run a shortcut to access the DB. One of my queries runs slow. Is it possible to run a query locally? When I copy the DB to my system, it runs fast. Any suggestions?
 
Normally, you'd use a stored procedure to have the server side execute the query, but you can't do that with an Access database, only with SQL Server, Oracle, or the like.

In fact, you are already executing the query locally, but I assume you mean, could you make the data local. Well, you could, but it would be quite a bit of work. First, you'd have to split the database into a front end/back end pair, with the tables in the back end and everything else (plus links to the tables) in the front end. Then, when you want to execute a local query, you'd copy all of the table data needed by the query into temporary local tables, execute the query against the local tables, and then delete the local tables. You'd need to do all of this in code, giving the user a form or something to start it running. Note that this would only work for Select queries, not action queries.
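A rough VBA sketch of that copy-then-query workflow (the table name "Orders" and the names "LocalOrders" and "qryLocalReport" are made up for illustration; substitute your own schema):

```vba
' Sketch of the "copy to local temp table, then query" approach.
' Assumes Orders is a linked back-end table; LocalOrders and
' qryLocalReport are hypothetical names.
Sub RunQueryLocally()
    On Error Resume Next
    DoCmd.DeleteObject acTable, "LocalOrders"    ' drop any stale local copy
    On Error GoTo 0

    DoCmd.SetWarnings False
    ' Pull only the rows and columns the query actually needs
    DoCmd.RunSQL "SELECT OrderID, CustomerID, OrderDate " & _
                 "INTO LocalOrders " & _
                 "FROM Orders WHERE OrderDate >= #1/1/2002#;"
    DoCmd.SetWarnings True

    DoCmd.OpenQuery "qryLocalReport"             ' written against LocalOrders

    ' When finished, delete the local table again:
    ' DoCmd.DeleteObject acTable, "LocalOrders"
End Sub
```

Restricting the SELECT INTO to just the needed columns and rows keeps the one unavoidable network transfer as small as possible.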

Before you go to all that trouble, though, it may be that you can dramatically improve the query performance with some tuning. The efforts most likely to pay off are:
1. Create indexes on any columns used to join tables. If a join involves more than one column, create an index on all the columns used to join. (It's ok if these are not unique indexes.)
2. Create indexes on columns used in WHERE clause criteria (i.e., the Criteria row in the query grid). If for some reason you're reluctant to index all such columns, choose the ones that will go farthest in eliminating unwanted data. For example, if you're selecting by gender and geographical location, the location will typically eliminate far more rows than the gender, so index the location.
3. Compact the database, especially after you create the indexes above. This causes Jet to reorganize the data to reduce fragmentation, and to recalculate the statistics it uses to decide how to execute a query, which can have a significant impact on performance. Then, after you compact, open each query in design view and save it without making changes. Then open it in datasheet view to cause Jet to recompile it and take advantage of the improved statistics.
4. If the above methods aren't enough and your query contains several joins, try to choose the join that will result in the fewest matches, and create a separate query with just those tables joined in that way. Then replace those tables in the original query with the new query. This will force Jet to execute the most restrictive join first, which (if the other joins are indexed as explained above) will minimize the amount of data that has to be transferred over the network.
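For steps 1 and 2, the indexes can be created either in table design view or in code. A minimal sketch, with invented table and column names, might look like:

```vba
' Step 1/2 above: index the join columns and the criteria columns.
' Orders.CustomerID and Customers.Region are example names only;
' non-unique indexes are fine here.
Sub CreateTuningIndexes()
    With CurrentDb
        .Execute "CREATE INDEX idxOrdersCustomerID ON Orders (CustomerID);"
        .Execute "CREATE INDEX idxCustomersRegion ON Customers (Region);"
    End With
End Sub
```

Remember to compact the database afterward (step 3) so Jet's statistics reflect the new indexes.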

If possible, you might also want to consult your network specialists about whether another server would give you better speed, or whether they have any other ideas to increase the transmission rate.

There are many other tuning possibilities that can be explored if these don't help enough.

Rick Sprague
 
A couple of things to consider. First of all, you should have two .mdb files: one on the server containing all of the shared data, and one on each local computer containing the forms, queries, reports, and temp tables. This also makes it much cleaner to upgrade later to, say, SQL Server for storing your data. Unless you are running SQL Server (or a similar server engine), there is no way to run queries anywhere except locally.

Here's what happens when you run a query on the database you described: each record is transferred from the server to the PC, and if it matches the conditions, it is kept in memory. Once all the records have been processed, the results are displayed. If you are running multiple queries against the same data, you can run a make-table query first to copy the data to a local table, then run your queries against it. This is much faster, but it does not reflect any changes made to the data by other users in the meantime.
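The make-table step is a single SQL statement run from code; for example (table names here are purely illustrative):

```vba
' Make-table query: copies the matching rows into a local table in one
' pass, so subsequent queries never touch the network. "Parts" and
' "tmpLocalParts" are hypothetical names.
DoCmd.SetWarnings False
DoCmd.RunSQL "SELECT * INTO tmpLocalParts " & _
             "FROM Parts WHERE Discontinued = False;"
DoCmd.SetWarnings True
```

The trade-off, as noted above, is that tmpLocalParts is a snapshot: edits made by other users after the copy won't appear in it.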

If you are starting to get large numbers of records, then you really should consider changing to SQL Server. The response time on queries in the setup described above will degrade much faster than the data grows. One extreme case I saw was a VB/Access-based application whose query time increased by roughly a factor of 10 for each doubling of the data size. Response times were roughly:
1000 records < 1 second
5000 records ~ 3 seconds
10000 records ~ 30 seconds
20000 records ~ 5 minutes
40000 records ~ 1 hour
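For what it's worth, a "doubling the data multiplies the time by 10" pattern corresponds to steep polynomial growth in query time:

```latex
t(2N) = 10\,t(N) \quad\Longrightarrow\quad t(N) \propto N^{\log_2 10} \approx N^{3.3}
```

The figures above roughly follow this from 5,000 records upward (3 doublings from 5,000 to 40,000 records, and a roughly 1,000-fold increase in time), which is why moving the data to a real server engine pays off so quickly at scale.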
 
