...<node second node>
<hook>hooky AIX thing
</hook>
</node>
<node third node>
<hook>hooky solaris thing
</hook>
</node>';
$string2 =~ s/\<node.*?Solaris.*?\<\/node\>/replacement /msi;
print $string2;
Now, as I understand it, this should take out just the third (Solaris) node, but the result is...
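If the symptom is that everything from the first <node to the last </node> disappears, that's the leftmost-match rule at work: the match starts at the first <node, and with /s the .*? before Solaris happily crosses the earlier nodes to reach the solaris one. One way round it (just a sketch) is to forbid a closing tag in that stretch:
$string2 =~ s/<node(?:(?!<\/node>).)*?Solaris.*?<\/node>/replacement /si;
The (?:(?!<\/node>).)*? part means "any characters, as long as none of them starts a </node>", so the match can only begin at the node which actually contains Solaris.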
If there's a mapping between the file group name and the directory it ends up in, then use the setgid bit on the directory.
For example
mkdir -m2777 test_dir
chown user:test_grp test_dir
Any files created in test_dir will have test_grp group ownership.
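A quick way to see the effect (some_file is just an illustrative name):
touch test_dir/some_file
ls -l test_dir/some_file # group is test_grp, whatever the creator's primary group is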
On the internet no one knows you're a dog...
So, I'm working with data lines which end with a return code and an optional non-numeric message code. What I have so far is
my @bits = split /\s+/;
if ( $bits[-1] =~ /^\d+$/ )
{ $retcode = $bits[-1]; }
else
{ $retcode = $bits[-2]; }
No problem with that - it works - but why can't I use
my...
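For what it's worth, the same pick can be squeezed into one statement (just a sketch, same assumptions as the code above):
my $retcode = $bits[-1] =~ /^\d+$/ ? $bits[-1] : $bits[-2];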
...values in entry fields.
Press Enter AFTER making all desired changes.
[TOP] [Entry Fields]
* Installation Target b05301
* Installation TYPE mksysb
* SPOT...
LKBrwnDBA
I would love to go there directly but firewall rules, yada, yada... and the double jump is my only option.
Your code works perfectly
Thanks
On the internet no one knows you're a dog
Columb Healy
I'm writing a procedure in bash which copies files around a number of servers. As a basic check I'm running
ssh remote_host 'cd /remote/dir;ls | wc -l'
So far all well and good.
However some of the servers are only reachable via a double jump, i.e. to run a command you have to run
ssh remote1 ssh...
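The fiddly part is the quoting for the second hop - something along these lines usually does it (remote1 and remote2 are placeholders):
ssh remote1 "ssh remote2 'cd /remote/dir; ls | wc -l'"
The outer double quotes keep the whole inner command as one argument to the first ssh, so the cd and the pipe run on remote2 rather than on remote1.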
Gotcha -
It's sort of:
First line - store it in prev.
Subsequent lines - print prev, then store the new line in prev - i.e. always print the line before the current one.
When the input is finished, print the reformatted final line.
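In awk that comes out roughly as follows (the footer handling is a guess - adjust the sub() to however the count actually appears in the last line):
awk 'NR > 1 { print prev }                        # print the line before the current one
     { prev = $0 }                                # remember the current line
     END { sub(/[0-9]+/, NR - 1, prev); print prev }' infile > outfile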
Thanks Feherke - you're a star as ever.
On the internet no one knows you're a dog
Columb Healy
I have a perl script I would like to migrate to awk. It's quite simple - it copies a file and amends the footer to reflect the number of records (line count so far minus one for the header). In perl it looks like
#!/usr/bin/perl -w
use strict;
$#ARGV == 2 or die "Invalid argument count\n";
open...
We've been told to wipe disks to UK DOD standard 5220.22-M. Does anyone know of any products we might use on an SP system running AIX 4.3?
I keep saying that half an hour with my Black and Decker would be far more effective but you know what security types are like ;-)
Thanks
On the internet no...
Thanks ZaSter - have a star. There's a typo in your link - it's here, but I'd got close enough with this list of up2date <-> yum equivalences, which does not have the answer, so using your doc number enabled me to find it.
On the internet no one knows you're a dog
Columb Healy
On my old systems I used to run
up2date --update --download to retrieve the files and up2date --update to apply the patches.
Now I've inherited a RHEL 5 server with yum instead of up2date. I've googled all I can but I can't find the equivalent of up2date --update --download for yum. Any pointers...
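If it helps, the nearest thing on RHEL 5 seems to be the yum-downloadonly plugin (an assumption on my part - check it's available in your channel; the download directory is just an example):
yum install yum-downloadonly
yum update --downloadonly --downloaddir=/var/spool/patches # fetch only
yum update                                                 # apply later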
You can't do this using ssh in a raw state. There are two alternatives:
Use public/private key pairs - this negates the need for passwords, including the risk of having passwords in clear text in scripts! Google "ssh keys" for many sites which will help (see the sketch below).
Use 'expect' - this is a useful utility...
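A bare-bones sketch of the key-pair route (assuming OpenSSH at both ends and that ssh-copy-id is installed; failing that, append the .pub file to the remote ~/.ssh/authorized_keys by hand):
ssh-keygen -t rsa                 # take the default file; an empty passphrase suits unattended scripts
ssh-copy-id user@remote_host      # puts the public key into the remote authorized_keys
ssh user@remote_host 'ls | wc -l' # now runs without a password prompt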
...structure.
Note that the -exec parameter to find opens a new process for each file found - as does your for loop - so, for commands that take multiple files, the options are
find . -name "*.exe" | xargs do_something
and
do_something *.exe
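For contrast, the one-process-per-file forms mentioned above look like this (do_something is the same placeholder):
find . -name "*.exe" -exec do_something {} \;   # one do_something process per file found
for f in *.exe; do do_something "$f"; done      # same cost, via the shell loop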
On the internet no one knows you're a dog
Columb Healy
Why bother excluding - compress will spot the compressed files and skip them. You will get lots of warning messages but, if you're confident the rest works
find dir1 dir2 dir3 -type f -mtime +2 | xargs compress 2>/dev/null
On the internet no one knows you're a dog
Columb Healy
Maybe you should buy your sysadmin a beer and suggest he allows you to use sudo to grep out the entries from sulog.
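A one-line sudoers entry along these lines would do it (the username and the AIX path /var/adm/sulog are illustrative):
columb ALL = (root) /usr/bin/grep columb /var/adm/sulog
after which sudo grep columb /var/adm/sulog works without any further root involvement.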
On the internet no one knows you're a dog
Columb Healy
...Tuesday
    ;; # End your case with a double semicolon
Thursday)
    # Do Thursday specific stuff here
    process_data Thursday
    ;;
*)
    # Default case - non Tuesday and Thursday stuff if required
    ;;
esac # End of case statement.
On the internet no one knows you're a dog
Columb Healy
Sorry - addendum
The /var/adm/cron/log file exists on AIX but not on Red Hat Linux, so it looks like this method is dependent on your flavour of Unix/Linux.
On the internet no one knows you're a dog
Columb Healy
Try
grep -q Failed /var/adm/cron/log && echo "A cron job has failed" | mail -s "cron job failure" me@myserver
Of course you would have to reset the log file after every failure and the script could be nicer but you get the idea.
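Fleshed out a touch (still only a sketch - the 'Failed' pattern and the log path are whatever your cron actually writes), a wrapper run from root's crontab might be:
#!/bin/sh
LOG=/var/adm/cron/log
if grep -q Failed $LOG; then
    grep Failed $LOG | mail -s "cron job failure" me@myserver
    cp /dev/null $LOG   # reset the log so the same failure isn't reported again
fi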
On the internet no one knows you're a dog
Columb Healy