Slow Program

aw23

Programmer
Nov 26, 2003
544
IL
I have a huge Access program with a database with several tables, one of which has 30,000 records. The program has slowed down significantly and I'm trying to figure out how to speed it up. I just added 20,000 of the 30,000 records. Would this be the cause? What can I do to figure out the cause and then speed it up? I am looking for a starting point, as I am not sure what to look at.
Thanks in advance for any ideas.
 
It is very likely. I was told (although you are obviously proving this wrong) that MS Access was only good for about 10,000 records.

Do you have the tables in the same DB as the code & forms ?

Do you have access to a SQL Server? It's fairly easy to import an Access DB into SQL Server and then point the code MDB file at the SQL tables instead.
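
If you do go that route, you don't even have to import anything by hand - you can link the SQL Server tables into the MDB over ODBC. Something like this (the DSN name and table names are made up, substitute your own):

    Sub LinkSqlTable()
        ' Link one SQL Server table into the front-end via ODBC.
        ' "MyDSN" and "dbo.Orders" are placeholders.
        DoCmd.TransferDatabase acLink, "ODBC Database", _
            "ODBC;DSN=MyDSN;Trusted_Connection=Yes;", _
            acTable, "dbo.Orders", "Orders"
    End Sub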

Have you done a compact/repair on the DB to see if that helps?
 
30 000 isn't huge. Very small, IMO.

But hey.

Have you compacted the DB? Do you regularly create temporary tables? These can create large file bloat resulting in slow execution times.
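
If you'd rather not remember to compact by hand, Access 2000 and later can compact automatically every time the database closes - either tick Tools > Options > General > Compact on Close, or set it once from code:

    ' Turn on "Compact on Close" for the current database.
    ' Equivalent to Tools > Options > General > Compact on Close.
    Application.SetOption "Auto Compact", True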
 
Access is good for 2GB! That's a lot bigger than 10 000 records as a general rule.
 
Are all your tables properly indexed?

Hope This Helps, PH.
Want to get great answers to your Tek-Tips questions? Have a look at FAQ219-2884 or FAQ222-2244
 
yep, if you suddenly increase your records from 10k to 30k, then a slowdown is quite likely...

at what point is the slowdown most noticeable?
when you first start the db?
when you run queries?
when you save changes?
when you compact/repair?
...

--------------------
Procrastinate Now!
 
I'm glad the record size thing has been cleared up - I did kinda worry when I asked a programmer and they said 10k.

All records are in SQL now anyhow, but at least I know for future reference - cheers guys!
 
Thanks for all the replies!
Craig, I don't create any temporary tables, and the database is compacted.
PHV, I think indexing may be the problem. I need to look over that. Would you be able to direct me to an article or something which explains proper indexing?
Crowley, the problem seems to be when queries are run.

Thanks!
 
PHV,
I have just done a couple of searches and have read a little on indexing. On all my primary key fields the index property is set to yes. What more constitutes proper indexing?
 
Where are your queries being run on?

Index fields (and combinations of fields) which are commonly searched/joined on.
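
To make that concrete: an index is one line of Jet DDL per field (or field combination). Table and field names below are invented - substitute the ones your queries actually filter and join on:

    Sub IndexCommonFields()
        ' Index a foreign key that gets joined/searched a lot.
        CurrentDb.Execute "CREATE INDEX idxCustomerID ON Orders (CustomerID)"
        ' Composite index for a search that always filters on both fields.
        CurrentDb.Execute "CREATE INDEX idxCustDate ON Orders (CustomerID, OrderDate)"
    End Sub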

Are you pulling the minimum amount of data in each query?
 
I have 2 separate files, a BE and an FE.
When I run queries I don't do anything with the index. Am I supposed to?
What do you mean by where are the queries being run on?
I believe I am pulling in the min amount of data. I will check though.
 
Just some background info, I don't know if it will help or not:
I have several users (although under 10) all hitting the database at the same time. The FE and BE are both stored on my computer in the same folder, and all the users are running off of that.
 
You don't deal with the index. Jet knows what to do with it. But you do need to index the right field.

Front end should be on the users own PC. You wouldn't run Word from someone else's machine, would you?

And the BE should be on a network shared drive. Your PC is not optimised as a server, so it will be fairly slow to respond to requests.
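
Repointing the FE at a BE on the share only takes a few lines of DAO. A rough sketch - the UNC path is just an example:

    Sub RelinkTables(strBackEnd As String)
        ' Point every linked table at the back-end on the network share.
        ' Call it like: RelinkTables "\\server\share\MyData_be.mdb"
        Dim db As DAO.Database
        Dim tdf As DAO.TableDef
        Set db = CurrentDb
        For Each tdf In db.TableDefs
            If Len(tdf.Connect) > 0 Then    ' only touch linked tables
                tdf.Connect = ";DATABASE=" & strBackEnd
                tdf.RefreshLink
            End If
        Next tdf
    End Sub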
 
The problem is that I'm constantly updating the program, so I am always putting up a new version - I simply save it to the shared drive on my computer, usually once a day. If it were to run locally on every machine, the users would constantly have to update it themselves.
 
Then the consequence is low speed.

You can't have things all ways.
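
That said, there's a common compromise: leave your master FE on the share and give each user a tiny launcher MDB that copies the master down to their PC and then opens the local copy, so they always get your latest version without doing anything. A sketch, untested, with invented paths (adjust the msaccess.exe path to your Office version):

    Sub LaunchLatestFrontEnd()
        ' Copy the master front-end local, open the copy, close the launcher.
        Const MASTER As String = "\\server\share\MyApp.mdb"
        Const LOCALCOPY As String = "C:\MyApp\MyApp.mdb"
        FileCopy MASTER, LOCALCOPY      ' fails if the local copy is open
        Shell """C:\Program Files\Microsoft Office\Office10\msaccess.exe"" """ _
            & LOCALCOPY & """", vbNormalFocus
        Application.Quit                ' close the launcher stub
    End Sub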
 
Ok, I just tried it. I put it on the users desktop and the performance seemed to be even slower!
 
What version of Access are you running and what is the O/S of the shared drive?

I only ask because I previously encountered speed problems with an A2000 database that resided on a server running an older version of NT. These problems were resolved by moving the database to a server running Windows 2000. There is a Microsoft KnowledgeBase article on this problem, but I forget the reference number.

Other things to try:

- Use Access to analyse the structure of your database (Tools, Analyse, Performance) - this will give you some tips for things to look at (it may even suggest some candidate table indexes) but take it all with a pinch of salt - these are only suggestions, and you should consider them carefully.

- Compact/repair your db REGULARLY. You can easily set this up in a scheduled manner using something like HAL, a freeware Access scheduling utility available from
(A pure-VBA alternative is sketched after this list.)

- Do you use a lot of macro-driven actions? If these are slow you might find a performance improvement by changing everything to VBA code, especially if you subsequently compile the VBA code into an MDE database.

- Do you use graphics in forms? Are they necessary? They're big and slow down form operations, so take such visual niceties out if they're not essential.
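
On the scheduled compact: if you're on Access 2002 or later, you can also drive it from code in a little stub MDB kicked off by Windows Task Scheduler. A rough sketch - the paths are made up, and the temp file must not already exist:

    Sub CompactBackEnd()
        ' Compact the back-end into a new file, then swap it into place.
        ' Run this from a stub MDB - you can't compact the database you're in.
        Const SRC As String = "\\server\share\MyData_be.mdb"   ' example path
        Const TMP As String = "\\server\share\MyData_be_tmp.mdb"
        If Application.CompactRepair(SRC, TMP, True) Then
            Kill SRC            ' remove the bloated original
            Name TMP As SRC     ' rename the compacted copy into place
        End If
    End Sub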

Other than these suggestions, it's difficult to know what else to recommend without actually seeing your db....

...hope this helps anyway.
 
Thanks so much. I am running Access 2002 on XP.
I have tried all the suggestions, to no avail.
 
I have downloaded the frontend onto all users' PCs. It is slightly faster, but I am still looking for more improvement.
 
It sounds like perhaps you are using all of your memory when going to the larger number of records. When performing joins and complex queries, the computer loads as much data as possible into memory and performs the required action. If you are out of memory, the system is constantly reading from disk, performing the action, writing to disk, and repeating until everything is done.

Also, if you are using the DISTINCT keyword you will find your query times grow on the order of N^2, which is incredibly high when you are having to read and write to disk. Make sure you only use DISTINCT if you must.

Make sure you are not performing calculations for each record if possible. For example, if you are trying to find only items between the first and last day of January, then instead of having SQL do the calculation to find the first and last days of the month for every record, pass the exact dates (see the sketch below).

If you are setting the query source of items on your form manually on load or open, be sure you do not also have the form bound to the data, or it will run the query twice.

Lastly, if you are performing a lot of joins between different tables, you may denormalize a step to remove some of the joins; though your database size will most likely grow larger, you won't have to join the tables.
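
To put the date advice in code terms - the table and field names below are invented examples:

    Sub JanuaryOrders()
        Dim strSlow As String, strFast As String
        ' Slow: Jet must evaluate Month()/Year() on every record,
        ' so an index on OrderDate can't be used.
        strSlow = "SELECT OrderID FROM Orders " & _
                  "WHERE Month(OrderDate) = 1 AND Year(OrderDate) = 2004;"
        ' Faster: literal dates compare directly, so an index on OrderDate helps.
        strFast = "SELECT OrderID FROM Orders " & _
                  "WHERE OrderDate BETWEEN #01/01/2004# AND #01/31/2004#;"
        Dim rs As DAO.Recordset
        Set rs = CurrentDb.OpenRecordset(strFast)
    End Sub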

There's always a trade off somewhere, so I hope you find it!
 