Question about starting scripts with ssh on remote hosts

Hi Folks,
I need to start a shell script on our jump server that runs, via ssh, on several remote hosts.
The script will only be stored on the central server.
The script also calls an SQL script, so both the shell and the SQL script will live only on the central server.
ssh is set up to work without a password, so that is not a factor.
Any help is welcome.
I checked earlier posts, but I only found examples of starting commands that already exist on the remote host after the ssh connect - no posts about running a script that is stored locally.
regards
appi
 
something like

ssh user@webserver sudo /etc/init.d/apache2 restart


Mike

"Whenever I dwell for any length of time on my own shortcomings, they gradually begin to seem mild, harmless, rather engaging little things, not at all like the staring defects in other people's characters."
 
Hi Mike,
sounds good, but it looks like your example assumes the script lives on the remote host. I have both (shell and SQL) on the host I need to run ssh from. And I don't think sudo is set up either.
Any other hints?
regards
Uwe
 
Not trivial unless you read the script into memory:
Code:
code=$(cat $scriptname)
..and then copy it to a file on the remote host first.
Code:
code=$(cat $scriptname) ; ssh $user@host "echo \"$code\" > /tmp/tmpscript ; sh /tmp/tmpscript"
 
You need to test and test again but
Code:
b01301$ cat test.ksh
#!/bin/ksh
uname -n
id

b01301$ ./test.ksh
b01301
uid=16180(b40217) gid=1200(tsg) groups=202(tsc2),203(coltest),206(tsgmail)

b01301$ cat test.ksh | ssh website@b05801
Pseudo-terminal will not be allocated because stdin is not a terminal.
[YOU HAVE NEW MAIL]
b05801
uid=206(website) gid=209(website) groups=1(staff)
OK, so what's going on?
In the first instance I created a very short script - test.ksh which runs uname -n to show the host and id to show the user.
I then ran this script to show that I'm b40217 on b01301.
Then I catted the script into 'ssh website@b05801' to show that it now runs as user 'website' on host 'b05801' - that is, it's being run remotely.

You'll need a lot of care over environment settings but
Code:
cat script | ssh remote_user@remote_host
might well work.
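A related idiom, just a sketch assuming a POSIX sh on the remote end, is to feed the script to an explicit remote shell, which also lets you pass arguments to it:
Code:
# the remote sh reads the script on stdin; arg1 and arg2 become $1 and $2 inside it
ssh remote_user@remote_host sh -s arg1 arg2 < /path/to/local_script.sh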

Ceci n'est pas une signature
Columb Healy
 
One option:

Use NFS: put the shared stuff in a shared directory, export it, and mount it (or use /net) on the remote hosts to access the shared script.
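For example, if the jump server exports the script directory and the automounter is running on the remote hosts, something along these lines might work (hostname and paths are only placeholders):
Code:
# /net/<host>/<exported-path> is resolved by the automounter on the remote box
ssh $host "ksh /net/jumpserver/export/scripts/myscript.sh"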

eugene
 
... or perhaps just scp it on to the host and then run it?

Annihilannic.
 
Hi,
all good thoughts, but in our environment/scripts it is not really possible.
scp is not useful, because we want to store the scripts only on the jump server.
Tunneling will not be set up because we are going out to 35 servers.
cat with a pipe might be a solution, but ... in the shell script I call an SQL script with sqlplus for Oracle. I guess that will make it too complicated to create one combined script. It would get too big, and it also has to be started from crontab on the jump server.
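To illustrate what combining them would look like, the sqlplus call could take the SQL as a here-document so shell and SQL stay in one file; a rough sketch (connect string and SQL are placeholders only):
Code:
#!/bin/ksh
# SQL embedded as a here-document; -s keeps sqlplus quiet
sqlplus -s scott/tiger@ORCL <<'EOF'
set pagesize 0
select sysdate from dual;
exit
EOF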

regs
Uwe
 
What about using rsync to move the files on and off the servers?
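Something like this, as a sketch (paths and hostnames are made up):
Code:
# push the scripts from the jump server and run them remotely
rsync -a /store/scripts/ user@remotehost:/tmp/scripts/
ssh user@remotehost "ksh /tmp/scripts/myscript.sh"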

Mike

"Whenever I dwell for any length of time on my own shortcomings, they gradually begin to seem mild, harmless, rather engaging little things, not at all like the staring defects in other people's characters."
 
I can't see the objection to copying the scripts across. Something along the lines of
Code:
#!/bin/ksh
for host in host1 host2 host3
do
  scp /store/${host}_script ${host}:/tmp || { echo unable to copy to $host; continue; }
  ssh $host "ksh /tmp/${host}_script"
  ssh $host "rm /tmp/${host}_script"
done > /path/to/log/file 2>&1
Whilst it's not exactly what you want, it's simple to implement and is a suitable kludge.

Alternatively I vote for appi's suggestion. I use a version of this for user creation across multiple hosts. The actual script is
Code:
#!/bin/ksh

[[ $# -ne 1 ]] && { echo invalid parameter count; exit 1; }

[[ -f $(dirname $0)/$1 ]] || { echo Unable to find $1; exit 1; }

chmod 700 $(dirname $0)/$1 || { echo Unable to change permissions on $(dirname $0)/$1; exit 1; }
[[ $(head -1 $(dirname $0)/$1) = "#!/bin/ksh" ]] || { echo invalid file format; exit 1; }
for box in host1 host2 host3 host4
do
  ssh $box "mount central:/software /mnt && /mnt/scriptdir/$1 && umount /mnt"
done
which is used on a near daily basis to run the same script, which exists on the 'central' host, on 'host1' through 'host4'.

Ceci n'est pas une signature
Columb Healy
 
Thanks guys,
I will try some of those suggestions and will see if the cascading use of scripts is possible or if I need to put the SQL code directly into the shell script.
 
Overall, you are always going to have problems doing this. One-statement remote execution is fine for simple things, but once your needs exceed a certain level of complexity, it breaks down. Keeping track of what needs to be escaped, how many times, and where substitutions and expansions get applied gets messy.

It may seem like a kludge, but generally speaking, copying the script over, remotely executing it, then deleting it afterwards tends to be the least complex way of doing it.

Remember that eventually someone is going to have to come behind you and figure out what you are doing. That might be you ;) Personally speaking, I'd rather see a 3-line "copy, remote execute, delete" than a single line of cat-tracks.
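In other words, something along these lines (names are purely illustrative):
Code:
scp /store/myscript.sh $user@$host:/tmp/myscript.sh
ssh $user@$host "ksh /tmp/myscript.sh"
ssh $user@$host "rm /tmp/myscript.sh"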
 