Script help!

Status
Not open for further replies.

Larshg

Programmer
Mar 1, 2001
187
DK
I need to modify this script.

Right now it only checks ONE log file for how the job ended; I need it to check a range of log files (15 to 20), and I need it to take the log files that are created AFTER 10 AM today.

I know what the log files that I need to check look like:
"NAME"_"jobreg-not important"_"SND"_"DATE-YYYYMMDD"_"TIME-HHMMSS".log

I know the "NAME", "jobreg" and "SND" of all the log files (they are the same every day).
Here are some examples of the log files that I want to look at. Notice that the only thing that differs in the files is the date.

BLPREP_10N01022001_SND_”date after today at 10 am.”
ARPHPPRCVRY_BYREQ_SND_”date after today at 10 am.”
ARPHPRSPL_ARPAY_DDB_SND_”date after today at 10 am.”
ARPHPRCHK_ARPAY_DDB_SND_”date after today at 10 am.”
ARPB01DAYREP_ENDDAY_SND_”date after today at 10 am.”


more larstjeklog.sh
#!/bin/ksh
found=""
status=""
echo $status

# Check number of parameters
if [ $# != 1 ]
then
echo "SYNTAX  : Insert logfile as parameter"
echo "EXAMPLE : tjeklog.sh BLPREP_10N01022001_SND_20010204_070223.log"
exit 1  # 'return' is only valid inside a function; a script must use 'exit'
fi
print ""
while [ "$found" = "" ]
do
logfile=`ls /usr/users/operator/var/snd/log/$1`
status=`tail -10 ${logfile} | grep "Operational Job ended"`
#status=`tail -1 ${logfile} | grep "0"`
echo $status
if [ "$status" != "" ]
then
found=1
echo "$status" | mailx -s Status test@hotmail.com
else
echo "Jobs still running"
fi
sleep 100
done
exit
 
Are you looking to check all log files that have
the current day's date and a time stamp of
10 AM or later for "Operational Job ended"?

Is the time stamp in the file name or
are you expecting to get the time stamp
from the ls command?

Not sure if this is the best way, but I think it should work.

You can change the initial greps to your liking.

DATE_FILTER=`date | awk '{print $2,$3}'`

ls -l [your log directory here] | grep "$DATE_FILTER" | grep SND | awk '{print $8, $9}' | while read TIME NAME
do
HOUR=`echo $TIME | awk -F":" '{print $1}'`
if [ $HOUR -ge 10 ] && [ $HOUR -le 23 ]   # hours run 0 to 23
then
echo $NAME >> /tmp/tmp_file.$$
fi
done

/tmp/tmp_file.$$ should contain files with today's date
between 10 am and midnight
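The hour comparison can be tried in isolation; this is a minimal sketch, and "10:15" is a made-up sample time stamp:

```shell
# Pull the hour field out of an ls -l style time stamp.
HOUR=`echo "10:15" | awk -F":" '{print $1}'`
if [ $HOUR -ge 10 ] && [ $HOUR -le 23 ]
then
echo "after 10 AM"
fi
```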

Robert
 
Yes, I want to check all the log files.

Yes, the time stamp is in the file name.

I think I misled you a bit; the log files really only consist of 2 parts: "Name"_"date_time"
ex.
ARPHPPRCVRY_BYREQ_SND_20020219_071555.log
This is the name = ARPHPPRCVRY_BYREQ_SND_
This is the date_time = 20020219_071555

I'm not quite sure that I understand what your script does - I don't know very much about awk. Could you explain what you do?

Another thing is that this script has to check the log files from 10 AM today, and it should not stop at midnight - meaning if a log file is created after midnight it has to be checked as well.

The result of the script should be a log file that contains all the log files that have "Operational Job ended" written in them, from 10 AM and until another criterion is met (a different log file writes "Operational Job ended").

Hope you can help.
 
Let me summarize what I see in your existing script:

1. You enter larstjeklog.sh followed by the name of the logfile to be checked.
2. The last 10 lines of the log file are checked for "Operational Job ended".
3. One or more matching lines will be emailed to you.
4. No match will echo "Jobs still running" to the screen.
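The tail/grep check in step 2 can be tried on a throwaway file (a sketch; the file path comes from mktemp):

```shell
# Build a small log file and run the same check the script uses.
log=`mktemp`
echo "some earlier output" > $log
echo "Operational Job ended" >> $log
STATUS=`tail -10 $log | grep "Operational Job ended"`
echo "$STATUS"
rm -f $log
```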

awk was just being used to return one field
from the standard out of an ls -l command.

try this at the command line and you will see

ls -l
ls -l|awk '{print $2,$3}'

the second command will only result
in the second field $2 and the third field $3
being printed to screen.
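The same idea on a fixed sample line, so the result is predictable (the line itself is invented):

```shell
# In ls -l output, $2 is the link count and $3 the owner.
line="-rw-r--r-- 1 operator ops 1024 Feb 19 10:15 BLPREP.log"
fields=`echo "$line" | awk '{print $2,$3}'`
echo "$fields"
```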

I am confused by your 10AM time requirement.
What time of the day will you run the script,
or will you run it at all times of the day?
If I were to run it at 2:00 AM on the 19th,
do you want to find all files after 10:00 AM
on the 18th through 10 AM on the 19th? I understand your file naming conventions, but tell me how a person (rather than a program) would be able to distinguish which files to look
at, and I should be able to help you.

Robert
 
Ex. of how the script should run:
The script is started "Tue Feb 19 08:53:54 MET 2002" and it should look at log files from "Tue Feb 19 10:00:00 MET 2002" and forward in all eternity (until it is stopped).

The way I look at the log files is by making an ls -lrt Name* and then checking if the latest file has a time stamp later than 10 AM today.

Another thing I need to explain is that the script should not look at all the log files in the directory, only the ones I specify.

And the way the script should terminate is when a specific log file has the "Operational Job ended";
the name of this job is "EOD_SLUT_ENDDAY_SND_"
 
Sorry, still need more clarification.

1. It looks like you are trying to emulate a daemon process
by having a script which runs until you kill it.
I think you would be better off having the script run
via cron, say once a minute, once every 5 minutes or
once an hour depending on your needs.

2. It may just be easier to archive files you have
already checked or are done with rather than
having to depend on the time stamp.

3. In your last note you mention ls -ltr, which would imply
you're looking for files most recently written to.

4. Still not sure if you are going by the time in the file name or by the time from the file's time stamp.

I think I understand what your current script is doing;
maybe you could tell me ONLY how you want the new
script to be different from the old.

If this is all too confusing... you can reach me via
telephone @ 847-700-2335

Robert



 
I think I understand what your questions are, and I'll try to answer them.

1) I thought about a crontab job - and that is definitely a good possibility.

2) I can't archive the files that I have already checked, because they are not my files. The script's function is to check if there are any jobs that fail (the jobs that fail write "Operational Job ended" in their log file if they fail).

3) Yes, I am manually looking at the files most recently written to, but this is only because that's the manual way to do it - I have to know if the job that I am looking at has been activated.

4) I am only looking at the time stamp in the name of the log file. This is why I want to look at all the log files that have been created after 10 AM.


I like the idea of using the crontab to execute the script; it solves part of the problem in one go. And then the only things the script should do are:
1) Look at a number of log files which have the "names" that I specify (likely between 10 and 20 names)
2) Write "Operational Job ended" in a log file (possibly also the name of the job and a time stamp of when it failed)





 
1. Please send me a copy of your standard out from your ls -ltr command in your log directory.

2. Do you want to give me the 10 to 20 names that you want
to look for and I can put them in the script,
or I could have the script read a config file with names or partial names that you want to look for?

3. I'm curious. What's the significance of 10 AM?
It might help me better understand your ultimate goal,
which seems to be something like "I want to be emailed when
there is a critical failure in the log."

Sorry for all the questions, but that will keep me from
rewriting this script several times.

Robert

 
Sorry about the late reply.

1. I can't give you the output of the ls -lrt - because there are about 500,000 files in the directory, so it will be difficult to give you an output - but I can pipe it to a file and give you that, if you want.

2. I can give you the names, but I need to be able to add more names.

3. You're right about the ultimate goal. I have a program that runs a lot of jobs during the night - all these jobs generate a log file each. This is repeated every night (and is generally started at 10 AM). If a job fails during the night it writes "Operational Job ended" in the log file - and then I want an email. The important thing is that I only want an email if it is a NEW fault (not all the old log files that ended with error).

I had an idea on how to do it: if the script runs in the crontab, it checks all the log files that have been created in the last 2 days, then checks a file with the names of those it has already emailed me about, and sends me an email if there are new ones. When it has emailed me, it adds the name of the log file to the file. This would solve the problems with 10 AM and hopefully give me all the emails that I need.

Hope this helps.

/Lars
 
Yes, this gives me a much better idea of what you are
doing.

Your directory listing isn't necessary
for me to write the script.

I think we could possibly ignore the names and dates
of the files altogether.

the find command allows an option -mtime

so the command: find . -mtime -1
would find all files modified less than one day ago
or you could use: find . -mtime -2 for less than two days

This would bypass all logfiles that are no longer being
written to.
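A quick way to see -mtime -1 in action (a sketch using a scratch directory created with mktemp -d):

```shell
# A file created just now was modified less than a day ago,
# so find -mtime -1 reports it.
dir=`mktemp -d`
touch $dir/fresh.log
found=`find $dir -type f -mtime -1`
echo "$found"
rm -rf $dir
```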

It sounds like a new log file is created for each job run,
rather than many jobs writing to the same log file which is
good.

I will send you something soon.

Robert G. Jordan

Robert@JORDAN2000.com
Unix Admin, United Airlines
 
#!/bin/ksh
# check_jobs script

TMP_LIST=/tmp/TMP_LIST.$$
LOG_FILE=/tmp/check_jobs.log
LOG_DIR=/usr/users/operator/var/snd/log
TEXT="Operational Job ended"
FIND=/usr/bin/find
MAIL="/usr/bin/mailx -s "
DATE=`/usr/bin/date`


Page_Me ()
{
echo "$DATE $MESSAGE"|$MAIL Status test@hotmail.com
echo "Sending Page"
}

Find_New_Logs () #find recent logs
{
$FIND $LOG_DIR -type f -mtime -1 >> $TMP_LIST.New_Logs
# -type f restricts the results to plain files (skips sub directories)
}

Check_Logs () #check logs for errors
{
cat $TMP_LIST.New_Logs|while read LINE
do
LOG=$LINE
echo "reading $LOG...\c"
STATUS=`tail -10 $LOG|grep "$TEXT"`
if [ ${#STATUS} -gt 0 ]
then
MESSAGE="$TEXT FOUND in $LOG"
echo "$DATE $MESSAGE"
echo "$DATE $MESSAGE" >> $LOG_FILE
Page_Me
fi
echo
done
}

# Main
Find_New_Logs
Check_Logs
Robert G. Jordan

Robert@JORDAN200.com
Unix Admin, United Airlines
 
Hi

I still need to check only some specific job names, because the log directory contains the log files from the entire system, and I only want mails from certain jobs.

I can't see if your script only gives me a mail if a new job fails? - meaning that when a job fails I only want one mail - and not a mail every time the script is run from the crontab.

I have another question - I'm not sure I understand the last 3 lines in your script:
# Main
Find_New_Logs
Check_Logs
What do they do?


Regards
Lars Grynderup
LSG@sonofon.dk
Sonofon, DENMARK
 
This script should now check a job log file only once.
It stores a history of files previously checked in
the SCRIPT_LOG_FILE. I would recommend moving the file
to a directory other than /tmp as the log file may be
erased. To answer your question regarding the last lines
of the script: those are function calls. Functions can
be used over and over again in a script. It prevents
one from writing duplicate code. For example, say you
want to alter this script and send a message for successes
as well as failures. All you would have to do is set a
value for the $MESSAGE variable and call the paging function.

Functions have this format

Sample_Function ()
{
Place your reusable code here
}

Functions are called by placing their name
anywhere in a script:

Sample_Function # this would call the Sample_Function
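For example, this runs end to end (Greet and its argument are made-up names):

```shell
# Define a function once, then call it like any other command.
Greet ()
{
echo "hello $1"
}

msg=`Greet world`
echo "$msg"
```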

Let me know if this solves your problem
or if you still need to add naming filters,
let me know what those filters are.

Robert

#!/bin/ksh
# check_jobs script

TMP_LIST=/tmp/TMP_LIST.$$
SCRIPT_LOG_FILE=/tmp/check_jobs.log
LOG_DIR=/usr/users/operator/var/snd/log
TEXT="Operational Job ended"
FIND=/usr/bin/find
MAIL="/usr/bin/mailx -s "
DATE=`/usr/bin/date`


Page_Me ()
{
echo "$DATE $MESSAGE"|$MAIL Status test@hotmail.com
echo "Sending Page"
}

Find_New_Logs () #find recent logs
{
$FIND $LOG_DIR -type f -mtime -1 >> $TMP_LIST.New_Logs
# -type f restricts the results to plain files (skips sub directories)

# NEW SECTION TO MAKE SURE YOU ARE NOT PAGED TWICE FOR THE SAME LOG
cat $TMP_LIST.New_Logs|while read LINE
do
LOG=$LINE
STATUS=`grep $LOG $SCRIPT_LOG_FILE`
if [ ${#STATUS} -gt 0 ]
then
echo "Previously checked this file. Skipping!"
else
# this log has never been checked before
echo $LOG >> $TMP_LIST.New_Logs_Filtered
fi
done
}

Check_Logs () #check logs for errors
{
cat $TMP_LIST.New_Logs_Filtered|while read LINE
do
LOG=$LINE
echo "reading $LOG...\c"
STATUS=`tail -10 $LOG|grep "$TEXT"`
if [ ${#STATUS} -gt 0 ]
then
MESSAGE="$TEXT FOUND in $LOG"
echo "$DATE $MESSAGE"
echo "$DATE $MESSAGE" >> $SCRIPT_LOG_FILE
Page_Me
else
MESSAGE="$LOG OK"
echo "$DATE $MESSAGE"
echo "$DATE $MESSAGE" >> $SCRIPT_LOG_FILE
fi
echo
done
}

# Main
Find_New_Logs # Calls the function above
Check_Logs # Calls the function above
Robert G. Jordan

Robert@JORDAN2000.com
Unix Admin, United Airlines
 
I still need naming filters. I would like it to take name parts, meaning it should be able to take *ARPHPP*.
Here are most of the names - I also need to be able to add more names myself.
*ARPHPP*
SWITCH*
OPTUXBOOT*
*ENDDAY*
ARCL*
BLCYRRT_L*
*SAS*
*ENDMON*
*ENDWEEK*
*SUNDAY*


Regards
Lars Grynderup
LSG@sonofon.dk
Sonofon, DENMARK
 
The addition of the NAME_FILTER variable
and the egrep statement should work,
but you will have to test on your system.

Robert

#!/bin/ksh
# check_jobs script

TMP_LIST=/tmp/TMP_LIST.$$
SCRIPT_LOG_FILE=/tmp/check_jobs.log
LOG_DIR=/usr/users/operator/var/snd/log
TEXT="Operational Job ended"
FIND=/usr/bin/find
MAIL="/usr/bin/mailx -s "
DATE=`/usr/bin/date`
NAME_FILTER="ARPHPP|SWITCH|OPTUXBOOT|ENDDAY|ARCL|BLCYRRT_L|SAS|ENDMON|ENDWEEK|SUNDAY"
# only files listed in the name filter will be considered
# you can add or remove names at any time - you just need a | between entries

Page_Me ()
{
echo "$DATE $MESSAGE"|$MAIL Status test@hotmail.com
echo "Sending Page"
}

Find_New_Logs () #find recent logs
{
$FIND $LOG_DIR -type f -mtime -1 >> $TMP_LIST.New_Logs
# -type f restricts the results to plain files (skips sub directories)

# NEW SECTION TO MAKE SURE YOU ARE NOT PAGED TWICE FOR THE SAME LOG
cat $TMP_LIST.New_Logs|egrep "$NAME_FILTER"|while read LINE
do
LOG=$LINE
STATUS=`grep $LOG $SCRIPT_LOG_FILE`
if [ ${#STATUS} -gt 0 ]
then
echo "Previously checked this file. Skipping!"
else
# this log has never been checked before
echo $LOG >> $TMP_LIST.New_Logs_Filtered
fi
done
}

Check_Logs () #check logs for errors
{
cat $TMP_LIST.New_Logs_Filtered|while read LINE
do
LOG=$LINE
echo "reading $LOG...\c"
STATUS=`tail -10 $LOG|grep "$TEXT"`
if [ ${#STATUS} -gt 0 ]
then
MESSAGE="$TEXT FOUND in $LOG"
echo "$DATE $MESSAGE"
echo "$DATE $MESSAGE" >> $SCRIPT_LOG_FILE
Page_Me
else
MESSAGE="$LOG OK"
echo "$DATE $MESSAGE"
echo "$DATE $MESSAGE" >> $SCRIPT_LOG_FILE
fi
echo
done
}
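The name filter can be tested on its own with a couple of sample file names (both names here are invented; grep -E is equivalent to egrep):

```shell
NAME_FILTER="ARPHPP|SWITCH|ENDDAY"
# Only the first sample name contains a pattern from the filter.
matches=`printf 'ARPHPPRCVRY_BYREQ_SND_20020219.log\nOTHERJOB_XYZ_20020219.log\n' | grep -E "$NAME_FILTER"`
echo "$matches"
```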
Robert G. Jordan

Robert@JORDAN2000.com
Unix Admin, United Airlines
 
I forgot, you still need these three lines
(or at least the last two)
at the end of the script or it won't run.

# Main
Find_New_Logs # Calls the function above
Check_Logs # Calls the function above
Robert G. Jordan

Robert@JORDAN2000.com
Unix Admin, United Airlines
 
Hi

Great script - I tested it for a while and it works great. But now I want to make another function that sends an SMS, and that SMS should only hold the name of the log file that failed. The function that I've made works, but it of course gives me a lot more than just the name. Is there any way to take out the rest? I've tried using awk but with little success.


SMS_Me ()
{

echo "$DATE $MESSAGE"|$MAIL EOD_job_fail 72126924@note.sonofon.dk
echo "SMS the operator"
}


My log file looks like this:
Wed Mar 6 13:35:19 MET 2002 Operational Job ended with failure FOUND in /usr/users/operator/var/snd/log/DMDISP_ENDDAYERR_SND_20020304_154514.log

The only thing I want in the SMS is DMDISP_ENDDAYERR.

That means that everything before and after it should be removed.

Thanks

Lars
 
I have another couple of questions.
Would it be difficult to place the NAME_FILTER in a file instead of in the script? I think I found other places where this filter would be useful.

I would also like my mail account and the SMS telephone number to be in another file :)

regards
/Lars
 
Yes, you can

if your SMS message is always the same, use this...
SMS_MESSAGE="DMDISP_ENDDAYERR"

OR if it varies, use the awk statement in the function below...

SMS_Me ()
{
SMS_MESSAGE=`echo $MESSAGE|awk -F"/" '{print $NF}'|awk -F"_SND" '{print $1}'`
echo "$SMS_MESSAGE"|$MAIL EOD_job_fail 72126924@note.sonofon.dk
echo "SMS the operator"
}


Wed Mar 6 13:35:19 MET 2002 Operational Job ended with failure FOUND in /usr/users/operator/var/snd/log/DMDISP_ENDDAYERR_SND_20020304_154514.log

The first awk statement says that "/" is a field delimiter, and $NF means print the
last field, which is everything to the right of the last slash.
result of first awk = DMDISP_ENDDAYERR_SND_20020304_154514.log

The second awk statement says that "_SND" is a field delimiter and to print
everything to the left of it.
result of second awk = DMDISP_ENDDAYERR
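Putting both statements together on your sample message shows the whole pipeline:

```shell
MESSAGE="Operational Job ended with failure FOUND in /usr/users/operator/var/snd/log/DMDISP_ENDDAYERR_SND_20020304_154514.log"
# Keep everything after the last "/", then everything before "_SND".
SMS_MESSAGE=`echo "$MESSAGE" | awk -F"/" '{print $NF}' | awk -F"_SND" '{print $1}'`
echo "$SMS_MESSAGE"
```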

As for putting the name filter and mail account in files:

NAME_FILTER=`cat [some file here]|head -1`
the format of the file should be:
filter1|filter2|filter3 with "|" between each entry


MAIL_ACCOUNT=`cat [some file here]|head -1`
the format of the file should be:
emailaddress@domain

Make sure the data is on the first line of each file;
the head -1 will guard against extra blank lines
at the end of your files.
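A self-contained check of the head -1 approach (the file comes from mktemp and the filter values are placeholders):

```shell
cfg=`mktemp`
printf 'filter1|filter2|filter3\n\n' > $cfg   # trailing blank line on purpose
NAME_FILTER=`cat $cfg | head -1`
echo "$NAME_FILTER"
rm -f $cfg
```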

Robert G. Jordan

Robert@JORDAN2000.com
Unix Sys Admin
Chicago, Illinois U.S.A.
 
Here's a way to do it with one file:

Config file (filter on line one and email address on line two):
ARPHPP|SWITCH|OPTUXBOOT|ENDDAY|ARCL|BLCYRRT_L|SAS|ENDMON|ENDWEEK|SUNDAY
72126924@note.sonofon.dk

CONFIG_FILE=[your config file name here]
NAME_FILTER=`awk "NR==1{print;exit}" $CONFIG_FILE` #gets the first line from config file
MAIL_ACCOUNT=`awk "NR==2{print;exit}" $CONFIG_FILE` #gets the second line from config file

Robert G. Jordan

Robert@JORDAN2000.com
Unix Sys Admin
Chicago, Illinois U.S.A.
 