The set-up: an Oracle-based HR system (MidlandHR Delphi Millennium), from which we run weekly reports outputting large chunks of the DB tables. These come out as text files and are manually FTP'd across to be picked up by an Access download, which converts them into Access tables so we can run ad-hoc queries on the data for reporting purposes. The advantage is that requests for wildly different kinds of info come in frequently, and Access is a very flexible tool for the job.
The nuts and bolts, using macros and queries, are as follows:
1. A 'Raw data' table with no keys and all datatypes set to text is populated using TransferText (fixed width, with an Import Spec) from the FTP'd text file.
2. A table duplicating the datatypes and keys of the original HR system table is cleared using a Delete query.
3. That table is then repopulated using an Append query pulling data from the Raw table.
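For anyone who wants to see it concretely, the macro steps above could be written in VBA roughly like this (the table names, spec name, and file path here are made-up placeholders, not our real ones):

```vba
Sub RefreshFromRawFile()
    ' 1. Clear and reload the all-text staging table from the FTP'd file.
    CurrentDb.Execute "DELETE FROM RawData", dbFailOnError
    DoCmd.TransferText acImportFixed, "HRImportSpec", "RawData", _
        "C:\FTP\hr_export.txt"

    ' 2. Clear the typed copy of the HR system table.
    CurrentDb.Execute "DELETE FROM EmployeeData", dbFailOnError

    ' 3. Repopulate it, letting Jet coerce the text fields
    '    to the table's real datatypes on the way in.
    CurrentDb.Execute "INSERT INTO EmployeeData SELECT * FROM RawData", _
        dbFailOnError
End Sub
```

Using dbFailOnError at least surfaces a runtime error if a row fails type conversion, rather than Jet silently discarding it.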
However, I think Access is reaching its limit in terms of the amount of data it can convert. It frequently 'hangs' while repopulating the larger tables: nothing appears to happen and no visible progress is made for hours on end, yet according to Task Manager, Access is still 'running'. I get no error messages, but it ties up the CPU to the extent that I can't do anything else with the PC while it runs.

If I use Task Manager to kill Access (the only way out in this situation), then reopen the db and rerun the queries, they appear to run more efficiently for a time, but performance degrades again with the number of queries run within a macro. Is Jet holding onto something from each successive query that eats up memory? I'm really at the limit of my knowledge here, so any advice would be much appreciated.
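On the theory that repeated delete/append cycles are bloating the .mdb with dead pages (Jet never shrinks the file in place), one thing worth trying between runs is a compact step. A rough sketch, with made-up paths, which must be run from a database other than the one being compacted:

```vba
Sub CompactWorkFile()
    ' Compact the reporting db into a fresh file, then swap it back in.
    ' Jet reclaims the space left behind by deleted rows only when
    ' the database is compacted.
    Dim src As String, dst As String
    src = "C:\Reports\HRWork.mdb"
    dst = "C:\Reports\HRWork_compacted.mdb"

    DBEngine.CompactDatabase src, dst
    Kill src            ' remove the bloated original
    Name dst As src     ' rename the compacted copy into its place
End Sub
```

If the file is growing to hundreds of MB between compacts, that alone could explain the hours-long hangs.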
Would more processing power on the PC I'm using for the download help, at least in the short term?
Are we going to have to move away from Access as an ad-hoc reporting tool? We've tried Business Objects but haven't found it flexible enough.
Laters, Z
42??? We're going to get lynched!