
Database Design Question - Access


belairbilly (Programmer)
Sep 26, 2003
I've been reading many people's questions regarding design on this forum and would like to ask your opinions on a project/system that I have written. I wrote a General Ledger Reconciliation system that gets its data from an AS/400 mainframe system via a set of queries that extract data for multiple financial institutions. The purpose of the system is to split an entire General Ledger into departmental segments by area of responsibility. Once the data is grouped, an export module exports it to Excel by department, and the workbooks are emailed to the different departments so they can reconcile account balances, which in turn are returned to the Accounting Department for examination.

The structure of the mainframe data is: Company, Office (Branch), Account Number, Cost Center, Product Code, and end-of-month balance. The design I adopted was to concatenate those fields to build a primary key, so in theory the recon master table would contain the same number of records as the mainframe system. Following normalization, each of the fields has its own table and ID. There are also tables for Departments, Responsible Associate, Reviewed By Associate, and Prepared By Associate. Needless to say, the queries to join this stuff can get pretty hairy. We are talking about 15,000 records max for a typical month.

The import module reads in the mainframe data and performs a field-by-field validation to ensure the lengths are correct. It then has to look up the ID values from the respective tables for these fields and update the reconmaster. I have embedded queries that check for new additions and/or dropped (deleted) accounts to keep the reconmaster in sync with the mainframe. This validation process takes about 30 minutes. I think the time is related to the fact that it has to validate so many fields, because the data comes in with leading zeroes truncated and needs to be reformatted to the correct value; for example, on the mainframe Company would display as 0002, but the import file from the AS/400 has 2. That being said, is there a better way to handle this data to speed the process up? Currently I have class modules for each of the above fields that search the tables for additions, since it is possible each month that new values will be added to any one of those tables, such as a new branch opening.
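To illustrate the kind of per-field cleanup and lookup involved, it boils down to something like this (a simplified sketch; the table, field, and function names are placeholders, not my actual code, and each field really has its own class module):

Function CleanCompany(varRaw As Variant) As String
    ' AS/400 import file sends "2"; the mainframe format is "0002"
    CleanCompany = Format(Val(Nz(varRaw, "0")), "0000")
End Function

Function GetCompanyID(strCompany As String) As Long
    Dim varID As Variant
    varID = DLookup("CompanyID", "tblCompany", "CompanyCode='" & strCompany & "'")
    If IsNull(varID) Then
        ' new value this month (e.g. a new branch) - add it to the lookup table
        CurrentDb.Execute "INSERT INTO tblCompany (CompanyCode) VALUES ('" & strCompany & "')", dbFailOnError
        varID = DLookup("CompanyID", "tblCompany", "CompanyCode='" & strCompany & "'")
    End If
    GetCompanyID = varID
End Function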

The second part of this question is the export. Management wants a spreadsheet built for each department, with all of that department's accounts plugged into it. I have written all the code for the Excel creation, which builds the spreadsheets fine by creating an instance of Excel and building a new workbook whenever the department changes. This process takes 45 minutes to export 11,280 accounts into the various workbooks. Have I normalized this reconmaster too much by having individual tables for all these fields and carrying the accompanying IDs in the reconmaster table? Is verifying the fields and adding the zeroes back the proper way to fix them? Thirdly, is building this stuff dynamically within a module the best way to handle the spreadsheet creation? Unfortunately there is a lot of bolding and other formatting needed for presentation.

I originally tried to use Access's export to spreadsheet, but there is a bug where it tries to create a new worksheet instead of populating the data beginning at a certain cell. If that worked, I could have a template that already has the preset column formatting and just drop the data into the spreadsheet with a separate export for each department.

Sorry for the long explanation, but I wanted to provide as much information as I could to draw a picture. Your opinions would be appreciated.
 
belairbilly

Very tough issue, and I am not in the best position to "test". I do commend you on your accomplishments so far.

You said:
Company, Office(Branch), Account number, CostCenter, Product Code and end-of-month balance ...
concatenate the aforementioned fields and build a primary key

I assume you do not include the Balance when you concatenate Company + Office + ... to create the primary key.

I don't see you mentioning "transactions", so I assume you are only interested in the balances.

I see two things that can slow the import process. I agree that the formatting kills you, and I suspect a lack of indexes may also hinder the process. I am sure the IBM EBCDIC-to-ASCII conversion and other differences get in the way of a simple import.

Is there a way of creating a "translation" table within Access? Bring the data in raw, and then use the translation table, which is appropriately indexed, to process the raw data. The translation table would already contain the properly formatted fields - "0002" vs "2".

You can also create a simple routine that goes through the translation table and compares it to the AS/400 database for changed accounts. This process does not have to be run at month end - rather, you can run it during "quiet" times during the month. You may even be able to use an ODBC connection to read the data directly from the AS/400 instead of importing the data for this procedure.
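Something along these lines, perhaps (the table and field names are just placeholders for whatever you actually have):

Sub SyncTranslationTable()
    ' Rough sketch of the "catch the changes during quiet times" idea.
    ' Add any account keys that exist in the raw import but not yet in the
    ' translation table, padding the truncated fields on the way in.
    Dim strSQL As String
    strSQL = "INSERT INTO tblTranslation (Company, Branch, AcctNo) " & _
             "SELECT Format(r.Company, '0000'), Format(r.Branch, '000'), r.AcctNo " & _
             "FROM tblRawImport AS r LEFT JOIN tblTranslation AS t " & _
             "ON Format(r.Company, '0000') = t.Company " & _
             "AND Format(r.Branch, '000') = t.Branch " & _
             "AND r.AcctNo = t.AcctNo " & _
             "WHERE t.AcctNo IS NULL"
    CurrentDb.Execute strSQL, dbFailOnError
End Sub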

...Moving on.
The only thing I can think of on the export side is indexes. Since this is not a transactional database, you can afford to invest in indexing to improve performance. For example, you may have the system indexed by account number, but you can also use a composite index - index by Company + Branch + Account and/or Company + Branch + Account + CostCentre.

Access the data via the most appropriate index.
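For example, something like this (the table and field names are placeholders for your own):

' A minimal sketch of adding a composite index to the recon master table.
CurrentDb.Execute _
    "CREATE INDEX idxCoBrAcctCC ON reconmaster (Company, Branch, AcctNo, CostCentre)", _
    dbFailOnError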

On the hardware end...
- Do the data manipulation on your workstation, and not on a server. Access copies the data locally anyway during some of the processing.
- Make sure you have adequate bandwidth; for example, that your network card is set up for 100 Mbps full duplex.
- Reduce the number of hops from the server to your workstation. Try to get your PC's network connection onto the same switch used by the AS400.
- Hopefully you have tons of RAM on your workstation, say 512 MB with adequate space for the TEMP file.

Nifty project - good luck with it.
Richard
 
One of the reasons the Excel export is running so slowly is that the automation runs out of process (Excel is a separate process from Access), plus you are dealing with a lot of records and formatting.

There are ways to improve performance, which involve plugging recordsets into arrays and pumping the data into Excel in one action instead of row by row. Because Excel is heavily array oriented, it handles these recordset arrays from the database as-is. This doesn't work in every case.
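As a rough illustration of the one-shot approach (the query name, template path, and sheet layout here are assumptions - adjust to your own; it needs references to DAO and the Excel object library):

Sub ExportDeptOneShot(lngDeptID As Long, strSaveAs As String)
    Dim db As DAO.Database, qdf As DAO.QueryDef, rs As DAO.Recordset
    Dim xlApp As Excel.Application, wb As Excel.Workbook

    Set db = CurrentDb
    Set qdf = db.QueryDefs("qryDeptExport")      ' assumed saved parameter query
    qdf.Parameters("DeptID") = lngDeptID
    Set rs = qdf.OpenRecordset(dbOpenSnapshot)

    Set xlApp = New Excel.Application
    Set wb = xlApp.Workbooks.Open("C:\Recon\Template.xls")   ' pre-formatted template (assumed path)

    ' Dump the whole recordset below the header row in one call
    wb.Worksheets(1).Range("A2").CopyFromRecordset rs

    wb.SaveAs strSaveAs
    wb.Close False
    xlApp.Quit
    rs.Close
End Sub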

I also find another performance enhancement to be local table processing whenever I must export a lot of data to Excel: I create a denormalized local table that has all the data I need for the export.
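For instance, something like this ahead of the export (all table and field names here are placeholders for your own schema):

' A minimal sketch of staging a denormalized local table before exporting.
CurrentDb.Execute _
    "SELECT r.AcctKey, c.CompanyCode, b.BranchCode, d.DeptName, a.AssociateName, r.EOMBalance " & _
    "INTO tmpDeptExport " & _
    "FROM (((reconmaster AS r " & _
    "INNER JOIN tblCompany AS c ON r.CompanyID = c.CompanyID) " & _
    "INNER JOIN tblBranch AS b ON r.BranchID = b.BranchID) " & _
    "INNER JOIN tblDepartment AS d ON r.DeptID = d.DeptID) " & _
    "INNER JOIN tblAssociate AS a ON r.ResponsibleID = a.AssociateID", _
    dbFailOnError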

Have you considered NOT exporting to a spreadsheet? I have found that many users tend to live in a spreadsheet mentality. Unless the users must have the ability to edit the data they are getting, I would create reports instead.
 
I don't have a choice about exporting to Excel. These workbooks are sent out to different departments as a tool for them to reconcile the general ledger accounts that fall under their department's responsibility. I'm sure the formatting of the workbooks for each department is what causes the time issue.

I have found an example that I'm having trouble making work, whereby you plug the Dept ID into the export query as a parameter and alter it using the qdf.Parameters!DeptID value; however, I'm getting a data conversion error. Where I'm going with this is to have a blank template sheet that is already formatted, drop the query data into it, save it under a department.xls name, then get the next department and open the template again. That would eliminate the formatting in code entirely. I'm still working on the conversion problem. Any thoughts on why the parameter won't work?
 
You can post the code for the parameter if it isn't too long. It's hard to determine what's wrong otherwise.
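In the meantime, one common cause of that kind of data conversion error is a mismatch between the parameter's declared type and the field you are filtering on. A guess, with assumed query and parameter names:

' If qryDeptExport declares the parameter as Text (or not at all) while the
' criteria field is a numeric ID, qdf.Parameters can throw a conversion error.
' Declaring the type explicitly in the query's SQL usually avoids it:
'   PARAMETERS [DeptID] Long;
'   SELECT * FROM reconmaster WHERE reconmaster.DeptID = [DeptID];
' Then in code:
Dim qdf As DAO.QueryDef, lngDeptID As Long
lngDeptID = 3                                    ' example department ID
Set qdf = CurrentDb.QueryDefs("qryDeptExport")   ' assumed query name
qdf.Parameters("DeptID") = lngDeptID             ' pass a Long, not a string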
 
If we're going to get involved in code, I would suggest that you would be better off posting those types of specific questions in one of the Access forums.

Good Luck
--------------
To get the most from your Tek-Tips experience, please read FAQ181-2886
As a circle of light increases so does the circumference of darkness around it. - Albert Einstein
 
Re the Excel export: is this data exported in a table format, or is it in a proprietary, for-print-viewing format? You can use the Jet engine with Excel, treating the workbook as a database. Look into this if you are not already doing it (you could instead be looping through rows in Excel).
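Roughly along these lines (the query name and file path are made up):

' A minimal sketch of pushing data into a workbook through Jet instead of
' automating Excel row by row.
Dim strSQL As String
strSQL = "SELECT * INTO [Excel 8.0;HDR=YES;DATABASE=C:\Recon\Dept01.xls].[ReconData] " & _
         "FROM qryDeptExport"
CurrentDb.Execute strSQL, dbFailOnError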
 
Thank you all for the comments. I will start a new thread in the Access forum and post the code to see if you guys can help. I have crossed the bridge on the earlier error with the parameters; I believe it had to do with the criteria I used. It's an AutoNumber ID field from the Department table, and evidently it gives a data conversion error because the parameter is looking for a text type. I went back to my inline query, which I knew worked, and that solved the problem. I'll post the whole sub and where I'm getting the next error.
 
