
Reading and parsing application log

Status
Not open for further replies.

theniteowl

Programmer
May 24, 2005
1,975
US
Hi All,
We have an application that creates multiple log files.
The log gets split off if it goes over a certain size or beyond a certain date.
The effect is that we end up with system.log, system_001.log, system_002.log, system_003.log, and system_004.log. A single log file may contain multiple dates, or one date may span several files. The files are appended to, so the older entries are at the top and the newer ones at the bottom.

I want to read through these, breaking the entries out into individual files for each date, and it looks like it will get a little complicated.

Some of these files can get very large. I was thinking I could just append all the files together into one file; then all the entries would be sequential and it would greatly simplify my script. But this could mean an enormous file to process. How good would VBS be at parsing through the text of a very large file? We could be looking at as much as 60 MB of text.
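For what it's worth, VBScript copes with files that size as long as you stream them line by line instead of slurping the whole thing with ReadAll. A minimal sketch (the path is illustrative):

```
' Stream a large log one line at a time so the whole
' 60 MB never has to sit in memory at once.
Const ForReading = 1
Dim fso, ts, sLine
Set fso = CreateObject("Scripting.FileSystemObject")
Set ts = fso.OpenTextFile("C:\Logs\system.log", ForReading)
Do Until ts.AtEndOfStream
    sLine = ts.ReadLine
    ' ... examine the leading date field and route the entry ...
Loop
ts.Close
```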

Not only do I want to separate the logs into individual date logs, I also want to skip any dates that have already been processed so that my script only pulls out new data, and this is where it gets more complicated.

Thoughts?

At my age I still learn something new every day, but I forget two others.
 
Not a VBS answer but if a 'log' within the file is/ can be delimited predictably into 'columns', you may consider importing it/ appending to it using MS Access.
Our purpose is simply to make backups of the log data in text files and keep the last month's worth of data. The application does not keep more than six log files, which are split both by time and by size and generally only cover about a week's worth of data.
We cannot simply back up the files, because they are all renamed when a new log is created; the rest are moved down in number until they fall off the end. We could keep them in different folders, but then we would have overlapping data in different files, and it would be impossible to know which date is in which file, because the same day could appear in multiple logs if there were a lot of entries.

I came up with an approach. I found some code that does a VERY fast search through a file for a specific text string, and every record in the logs begins with a date field. I will write a separate file that stores the last date the logs were processed. Using that date plus one day, I will search the log files in reverse order for the first one that contains a record with that date, then write each matching entry into its own file for that date. Then I will increment the date by another day and repeat until I reach the current day. This way I only parse a file line by line if I already know it contains the data I am looking for, so no time is wasted on large, irrelevant files, and I never process any records that have previously been processed.
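A rough sketch of that approach in VBScript. The file name, date format, and starting date here are assumptions; a real script would loop over all six logs in reverse order, loop the date up to today, and persist the last-processed date back to its tracking file:

```
' Hypothetical sketch: a fast InStr check on the whole file
' decides whether it is worth parsing line by line at all.
Const ForReading = 1, ForAppending = 8
Dim fso, sDate, ts, out, sLine
Set fso = CreateObject("Scripting.FileSystemObject")
' Last processed date + 1 day (example value, format assumed)
sDate = FormatDateTime(DateAdd("d", 1, CDate("5/19/2005")), vbShortDate)

If InStr(fso.OpenTextFile("system_004.log", ForReading).ReadAll, sDate) > 0 Then
    Set ts = fso.OpenTextFile("system_004.log", ForReading)
    Set out = fso.OpenTextFile(Replace(sDate, "/", "-") & ".log", ForAppending, True)
    Do Until ts.AtEndOfStream
        sLine = ts.ReadLine
        ' Records begin with a date field, so a prefix match suffices
        If Left(sLine, Len(sDate)) = sDate Then out.WriteLine sLine
    Loop
    out.Close
    ts.Close
End If
```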

 
