belairbilly
Programmer
I've been reading many people's questions about design on the forum and would like your opinions on a project/system that I have written. I wrote a General Ledger Reconciliation system that gets its data from an AS/400 mainframe via a set of queries that extract data for multiple financial institutions. The purpose of the system is to split an entire General Ledger into departmental segments by area of responsibility. Once the data is grouped, an export module exports it to Excel by department; the workbooks are then emailed to the departments so they can reconcile their account balances, which are in turn returned to the Accounting Department for examination.
The structure of the mainframe data is: Company, Office (Branch), Account Number, Cost Center, Product Code, and end-of-month balance. The design I adopted concatenates those fields to build a primary key, so in theory the recon master table should hold the same number of records as the mainframe system. Following normalization, each of those fields has its own table and ID, and there are additional tables for Department, Responsible Associate, Reviewed By Associate, and Prepared By Associate. Needless to say, the queries to join all this can get pretty hairy. We are talking about 15,000 records max in a typical month.
The import module reads in the mainframe data and performs a field-by-field validation to ensure the lengths are correct. It then has to look up the ID values from the respective tables for these fields and update the recon master. I have embedded queries that check for new and/or dropped (deleted) accounts to keep the recon master in sync with the mainframe. This validation process takes about 30 minutes. I think the time is related to the fact that it has to validate so many fields, because the data comes in with leading zeroes truncated and needs to be reformatted to the correct value.
Example: on the mainframe, Company displays as 0002, but the import file from the AS/400 has 2. That being said, is there a better way to handle this data to speed the process up? Currently, I have a class module for each of the above fields that searches its table for additions; it is possible each month that new values are added to any one of those tables, such as when a new branch opens.
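To make the idea concrete, here is a small sketch (in Python rather than your VBA, and with hypothetical field widths) of the two things that usually speed this kind of import up: restoring the truncated leading zeroes with a single padding call per field, and resolving the dimension IDs through an in-memory dictionary loaded once per run, instead of a per-row table lookup:

```python
# Sketch only -- field names and widths are assumptions, not your schema.
FIELD_WIDTHS = {
    "company": 4, "office": 3, "account": 6, "cost_center": 4, "product": 2,
}

def pad(value, width):
    """Restore truncated leading zeroes: '2' -> '0002' for width 4."""
    return str(value).strip().zfill(width)

def composite_key(row):
    """Concatenate the padded fields into the recon master primary key."""
    return "".join(pad(row[f], w) for f, w in FIELD_WIDTHS.items())

def resolve_id(lookup, value):
    """Return the ID for a dimension value from an in-memory dictionary,
    assigning a new ID when the value (e.g. a newly opened branch) is
    seen for the first time -- one dictionary probe instead of a query."""
    if value not in lookup:
        lookup[value] = max(lookup.values(), default=0) + 1
    return lookup[value]
```

The VBA equivalent would be loading each lookup table into a Scripting.Dictionary (or Collection) at the start of the import and writing any new values back to the tables in one pass at the end, rather than issuing a DLookup or query per record.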
Second part of this question is the export. Management wants a spreadsheet built for each department, with all of that department's accounts plugged into it. I have written all the code for the Excel creation, and it builds the spreadsheets fine by creating an instance of Excel and starting a new workbook whenever the department changes. This process takes 45 minutes to export 11,280 accounts into the various workbooks. Have I over-normalized this recon master by having individual tables for all these fields and carrying the accompanying IDs in the recon master table? Is validating the fields and adding back the zeroes the proper way to fix them? Thirdly, is building everything dynamically within a module the best way to handle the spreadsheet creation? Unfortunately, there is a lot of bolding and other formatting needed for presentation.
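One structural change that usually helps here is to group the accounts by department up front, so each department becomes one block of rows written in a single operation, instead of watching for a department change row by row. A minimal sketch of the grouping step (Python, with an assumed "dept" field name):

```python
# Sketch: pre-group rows by department so each workbook gets its data
# as one block, rather than detecting department breaks per row.
from itertools import groupby
from operator import itemgetter

def rows_by_department(rows):
    """Yield (department, list_of_rows) pairs.
    groupby requires the input sorted by the grouping key first."""
    ordered = sorted(rows, key=itemgetter("dept"))
    for dept, grp in groupby(ordered, key=itemgetter("dept")):
        yield dept, list(grp)
```

In the VBA version, the corresponding win is to reuse one Excel.Application instance for the whole run and to write each department's block by assigning a 2D Variant array to a Range in a single call, applying the bolding to whole ranges afterwards; cell-by-cell writes and per-cell formatting are typically what make an 11,000-row export take this long.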
I originally tried to use Access's export to spreadsheet, but there is a bug where it creates a new worksheet instead of populating the data starting at a certain cell. If that worked, I could have a template with the column formatting preset and just drop the data into the spreadsheet, with a separate export for each department.
Sorry for the long explanation, but I wanted to provide as much information as I could to draw a picture. Your opinions would be appreciated.