Tek-Tips is the largest IT community on the Internet today!


Better error reporting

Status
Not open for further replies.

njadmin

Technical User
Sep 5, 2006
101
US
I have a very simple backup script for a client. It works great and has been running fine.
I just want to get more information out of errors. I'm not a programmer, but I can understand some of it.
Here's the script:



#!/bin/bash
#
# Simple bash backup script to zip the Doc tree to an external USB HD
#

OUTPUT="/home/backup/backups/Backup-$(date +%Y-%m-%d).zip"
BUDIR="/home/tim/Documents/"
FAILED="The Backup script on the server failed. Please make sure you check the system."
GOOD="System data backup of $BUDIR successful."
EMAIL=$(cat /root/backupemail.txt)
SENDER="server.xxxxx.xx@xxxxxx.net"


#echo "Starting backup of $BUDIR to $OUTPUT"

/usr/bin/zip -rq "$OUTPUT" "$BUDIR"

if [ $? -eq 0 ]; then

    echo "$GOOD" | mailx -s "Server Backup status" -r "${SENDER}" ${EMAIL}

else

    echo "$FAILED" | mailx -s "Server Backup status" -r "${SENDER}" ${EMAIL}

fi


I haven't been able to find a good answer on how to report the error from the script after cron runs the job.

Thanks

Jason
 
Hi

One possibility:
Code:
/usr/bin/zip -rq "$OUTPUT" "$BUDIR" > /tmp/output.txt

if [ $? -eq 0 ]; then
  echo "$GOOD" | mailx -s "Server Backup status" -r "${SENDER}" ${EMAIL}
else
  echo "$FAILED" | mailx -s "Server Backup status" -r "${SENDER}" -a /tmp/output.txt ${EMAIL}
fi

Feherke.
 
You know, why didn't I think of that?
I'll give it a shot. I wasn't sure you could redirect like that inside a script.

I'll report back if it works well.
 
Great suggestion from feherke, but you might want to change one line...
Code:
/usr/bin/zip -rq "$OUTPUT" "$BUDIR" > /tmp/output.txt 2>&1
You need to capture anything going to stderr too.
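To see why the `2>&1` matters, here is a minimal sketch (the `demo` function, the log path, and the messages are made up for illustration) showing that `>` alone lets error output escape the log:

```shell
#!/bin/bash
# Hypothetical demo: ">" captures stdout only; "2>&1" captures stderr too.

log=/tmp/demo_output.txt

# Write one line to stdout and one to stderr.
demo() {
  echo "normal output"
  echo "error output" >&2
}

# Redirecting only stdout: the error line is NOT in the log.
demo > "$log"
echo "--- stdout only ---"
cat "$log"

# Redirecting both: the error line is captured as well.
demo > "$log" 2>&1
echo "--- stdout and stderr ---"
cat "$log"
```

With zip the effect is the same: warnings such as "name not matched" go to stderr, so without `2>&1` they never reach the attached log.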


 
It is always good practice to write error logs to a file. In our case (as far as I can remember), I usually send the logs to a file whose name matches the script name. Most of the time, depending on what the script does, I append to the log (>>) and echo the date and time from within the script, so when users ask whether something went wrong with a job I always have date/time references. That said, you may have to create a script to clean up your log files occasionally.
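A minimal sketch of that approach (the log directory and retention period are made-up examples): the script appends timestamped lines to a log named after itself, and old logs are pruned at the end:

```shell
#!/bin/bash
# Sketch: append-style logging with timestamps, log named after the script.

LOGDIR=/tmp/demo_logs
mkdir -p "$LOGDIR"

# Log file named after the script itself, as described above.
LOGFILE="$LOGDIR/$(basename "$0").log"

log() {
  # Prefix every message with date and time for later reference.
  echo "$(date '+%Y-%m-%d %H:%M:%S') $*" >> "$LOGFILE"
}

log "backup started"
# ... the actual backup commands would run here ...
log "backup finished"

# Occasional clean-up: delete logs older than 30 days.
find "$LOGDIR" -name '*.log' -mtime +30 -delete
```

Because the log is appended rather than overwritten, each cron run adds its own timestamped entries, which is what makes the date/time references useful.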
 
That worked perfectly!!!

Now I have to find a new way to back up these files, since zip says the file is too large.

I guess I can split the zips.

I'll open another thread if I need help with that though.

Thanks all
 
"a new way to back up these files, since zip says the file is too large"
You may try this:
tar -cf - $BUDIR | /usr/bin/zip -q9 $OUTPUT -
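One caveat with a pipeline like that: `$?` only reflects the last command, so a tar failure would go unnoticed. A sketch of checking both sides with bash's PIPESTATUS array (paths are examples, and gzip stands in for zip so the sketch is self-contained):

```shell
#!/bin/bash
# Sketch: checking both sides of a tar pipeline via PIPESTATUS.
# Paths are made up; gzip stands in here for the zip step.

BUDIR=/tmp/demo_budir
OUTPUT=/tmp/demo_backup.tar.gz
mkdir -p "$BUDIR"
echo "sample" > "$BUDIR/file.txt"

tar -cf - "$BUDIR" 2>/dev/null | gzip -9 > "$OUTPUT"

# $? only reflects gzip; PIPESTATUS holds every exit code.
# Copy it in one assignment, because running any command resets it.
rcs=("${PIPESTATUS[@]}")

if [ "${rcs[0]}" -ne 0 ] || [ "${rcs[1]}" -ne 0 ]; then
    echo "backup failed (tar=${rcs[0]} zip=${rcs[1]})"
else
    echo "backup ok"
fi
```

The same check applies verbatim if zip reads the tar stream from stdin as in PH's command.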

Hope This Helps, PH.
FAQ219-2884
FAQ181-2886
 
I just installed 7zip, so I'm splitting the files into 1 GB chunks.
So far the test is running fine. I'm just waiting for it to finish.

/usr/bin/7z a -tzip -v1g $OUTPUT $BUDIR > /tmp/backup_output.txt 2>&1
 
My preference:

Rather than 2>&1, put this at the beginning of the script:

exec 1>filename.out #Redirect all terminal messages to filename.out
exec 2>filename.err #Redirect all errors to filename.err
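As a sketch of how that looks in a full script (the file names and the failing command are examples only): after the two exec lines, every later command's output is redirected without touching the commands themselves:

```shell
#!/bin/bash
# Sketch: redirect the whole script's output once, at the top.

OUT=/tmp/demo_script.out
ERR=/tmp/demo_script.err

exec 1>"$OUT"    # everything written to stdout from here on goes to $OUT
exec 2>"$ERR"    # everything written to stderr from here on goes to $ERR

echo "starting backup"
ls /nonexistent-path-for-demo || true    # its error message lands in $ERR
echo "done"
```

The advantage is that you set the redirection once instead of appending `> file 2>&1` to every command in the script.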





Mike

"Whenever I dwell for any length of time on my own shortcomings, they gradually begin to seem mild, harmless, rather engaging little things, not at all like the staring defects in other people's characters."
 
Thanks everyone!!! The new script is working great
and so far no issues with the 7zip package and splitting the files.
 
Hi

If you control the error messages, separate log files can be really useful. But consider a situation like this:
Code:
master # some command > log 2>&1

master # cat log
loading foo
ERROR : file not found
loading bar
vs.
Code:
master # some command > log.out 2> log.err

master # cat log.out
loading foo
loading bar

master # cat log.err
ERROR : file not found
Here the unified log file seems better, because you can see which step the error belongs to. But of course, it depends on the tools used.

Feherke.
 
I think it depends on what stdout is producing.

I agree with feherke. If stdout is just producing status messages, or something like that, then it's usually best to combine it with stderr. It's easier to tell when an error happened, plus there are fewer log files to manage.

The only time I create separate output files is when stdout is producing data of some kind. If the data is used later in the script, you may not want error messages mixed into it.

Something like this...
Code:
#!/bin/ksh

find / -name '*.c' -print > cfiles.dat 2> finderrors.log

while read CFILE
do
    # Do something with the file
    print ${CFILE}
done < cfiles.dat
Just a made-up example, but it shows how sometimes you don't want error messages in your data.

 