mdProgrammer
I have a question about optimizing a query for MS Access. The database is in Access 2002/2003 format (though everyone opens it with Access 2007) and links to SQL Server 2000 (we're in the process of upgrading to SQL Server 2008 now).
This database is a work review database, with local tables (only a few thousand records) and linked tables into the main work database. One query collects all kinds of information (clients, counselors, staff, office locations, etc.) and joins it to the local tables. When the database was first built 4 or 5 years ago, the query was manageable (maybe 30 seconds at most), but it now returns over 140,000 records and can take 5 minutes or more. Note that the SQL Server database is a third-party product, so we can't add indexes to it ourselves.
As it stands now, the main query works like this:
Query1 or a table (fewer than 3,000 records) joins to the main query (140,000 records, nine tables joined in total). The reason is that when a review is done, I don't know in advance which record will be accessed. A pass-through query wouldn't work, since the parameters can change.
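To be clear about what I'm ruling out: the usual pass-through pattern is a saved pass-through query whose SQL gets rewritten at runtime through DAO. A minimal sketch of that pattern, where qptReviewDetail, dbo.Reviews, and ClientID are all made-up names for illustration:

' Rewrite a saved pass-through query's SQL at runtime via DAO.
' qptReviewDetail must already exist with its ODBC Connect string set.
Public Sub RunPassThroughFor(ByVal lClientID As Long)
    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.QueryDefs("qptReviewDetail")
    ' The statement is sent to SQL Server verbatim, so it must be T-SQL.
    qdf.SQL = "SELECT * FROM dbo.Reviews WHERE ClientID = " & lClientID & ";"
    qdf.ReturnsRecords = True
    Set qdf = Nothing
End Sub

That's the setup I couldn't make fit, since the criteria aren't fixed to a single parameter.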
For now, I have a SQL Job that populates a worktable in the Access database (yes, I know 140,000 records in MS Access isn't pretty...). This works fine when I pull a field from the small table joined to the large table, but as soon as I pick a field from the large table, the query goes from taking 2 seconds to 2 minutes.
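One thing worth noting: unlike the linked third-party tables, the local worktable can be indexed, so the join and criteria fields can at least get indexes after each repopulation. A rough sketch, assuming the worktable is named tblWorkReview with a ClientID join field (both hypothetical names):

' Index the local worktable's join field after each repopulation;
' if the SQL Job drops and recreates the table, re-run this each time.
Public Sub IndexWorkTable()
    ' Jet DDL works on local tables even when the linked source is off-limits.
    CurrentDb.Execute "CREATE INDEX idxClientID ON tblWorkReview (ClientID);", dbFailOnError
End Sub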
The other option I thought of was to put foreign keys into the small table, but then it would only be snapshot data, which may or may not match the live data at a later date.
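To make that option concrete: the snapshot keys could in principle be re-synced periodically with an UPDATE joined to the linked table. A rough sketch, where tblReviewLocal, lnkClients, CaseNumber, and ClientID are all placeholder names:

' Re-sync the snapshot foreign keys from the linked source table.
' Every table and column name here is a placeholder.
Public Sub RefreshSnapshotKeys()
    CurrentDb.Execute _
        "UPDATE tblReviewLocal INNER JOIN lnkClients " & _
        "ON tblReviewLocal.CaseNumber = lnkClients.CaseNumber " & _
        "SET tblReviewLocal.ClientID = lnkClients.ClientID;", dbFailOnError
End Sub

But that's still only as fresh as the last refresh, which is the drawback I mentioned.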
Any suggestions? I'm just looking to speed this thing up a bit. (I'm not including any code, as I believe it's proprietary.)