Hi James,
If you review the previous posts that I placed on this subject, you will see that I suggested a phased approach to your reporting and covered all the components (apart from the final reporting part, which should be easy).
Obviously it's up to you how you code the solution, but what I demonstrated was:
1) how to tail -f the source XML and pipe it to the Unix command tee, producing a parallel file that retains the incoming requests so nothing gets lost or deleted.
2) how I parsed that resulting parallel file through awk/gawk to pull out the required data, and how to add timestamps to each record if required, either for tracking purposes or for keeping your place within the processing loop.
3) lastly, an example script that ran a timed loop through the data: it steps through a set number of records, which you can then parse through awk to produce a report, and then carries on from where it left off.
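The three phases above can be sketched roughly as below. This is only an outline, not the script from the earlier posts: the file names (requests.xml, parallel.log, stamped.log), the sample data, and the awk field extraction are all placeholders you would adapt to your real XML layout. It uses a small finite sample so it runs to completion; in the live pipeline step 1 would be `tail -f requests.xml | tee -a parallel.log &` and the loop would never end.

```shell
#!/bin/sh
# Sample data standing in for the live incoming XML feed.
printf '<request id="1"/>\n<request id="2"/>\n<request id="3"/>\n' > requests.xml

# 1) Capture: tail the source and tee a parallel copy, so the records
#    survive even if the original file is rotated or deleted.
#    (Live version: tail -f requests.xml | tee -a parallel.log &)
tail -n +1 requests.xml | tee parallel.log > /dev/null

# 2) Parse: pull the id attribute out of each record and stamp it with
#    the current time. (With gawk you could use strftime() to stamp
#    each record individually as it arrives.)
ts=$(date '+%Y-%m-%d %H:%M:%S')
awk -F'"' -v ts="$ts" '{ print ts, "id=" $2 }' parallel.log > stamped.log

# 3) Loop: step through a set number of records per pass, report on
#    that batch, pause, then resume from where the last pass stopped.
batch=2
start=1
total=$(wc -l < stamped.log)
while [ "$start" -le "$total" ]; do
    awk -v s="$start" -v n="$batch" 'NR >= s && NR < s + n' stamped.log
    start=$((start + batch))
    sleep 1
done
```

The batch size and sleep interval are the two knobs: they control how many records each report covers and how long the script pauses before stepping on.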
I really have too much work on to take you any further, but there are tons of clues out on Google on how to report with awk, now that you have a process to 1) save your data, 2) parse the XML, and 3) loop through your data, pausing for a period.
Again, I wish you luck.
Laurie.