
porting sco scripts to red hat


crimzon

Programmer
May 15, 2002
2
US
I am working on translating scripts written to run on SCO
5.05 and 5.06 so that they function on Red Hat 7.3 boxes.

I am looking for resources that point out SCO-specific syntax/commands that hinder this translation.
I haven't had much luck with the Rosetta Stone or miscellaneous searches.

I am getting errors referring to programs that aren't found
(e.g. showcolor), and the terminal is spitting garbage
mixed with normal display text. There also seems to be an
issue with the handling of control characters between these
platforms.

Anyone know of a resource for this type of info?
 
This is a major pain in the A$$ for SCO folk. As you say, under Linux (all versions) commands are not found, syntax is different, and exit codes differ; I believe it is easier to rewrite your scripts. Linux tends to be verbose with argument specifications and supports only a few of the single-character arguments found in traditional unix. I am not yet convinced that this is for the better.
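For example, one way to keep a script from dying outright when a command is missing on the new platform (just a sketch; showcolor stands in for any SCO-only command):
Code:
  # Guard a call to a command that may not exist on this platform.
  # 'type' is built into both ksh and bash.
  if type showcolor >/dev/null 2>&1; then
      showcolor
  else
      echo "showcolor not available on this system" >&2
  fi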

Here is what I have tried: I keep a file named '/etc/GLOBALOS'. I have one of these files for SCO, RHlinux, and Caldera Linux. Within that file I define the invocation for each program that my scripts use. For example, a few lines from the SCO version would look like this:
Code:
  ECHOB="/bin/echo"; export ECHOB
  ECHOC="/bin/echo -c"; export ECHOC

In the RHlinux version these would look like this:
Code:
  ECHOB="/bin/echo -e"; export ECHOB
  ECHOC="/bin/echo -n"; export ECHOC

In my scripts I use the variable instead of calling the program directly, e.g.:
Code:
  $ECHOC "Enter Yes or No"
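Putting it together, a minimal sketch of a script that sources the per-OS file before using the variables (assuming /etc/GLOBALOS holds only plain shell assignments like the ones above):
Code:
  #!/bin/sh
  # Pull in the per-OS command definitions (SCO, RHlinux or Caldera version)
  . /etc/GLOBALOS

  # Use the variables instead of hard-coded flags
  $ECHOC "Enter Yes or No: "
  read answer
  $ECHOB "You answered: $answer"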

This is not the best solution, but it does provide a basis for script uniformity. I would also appreciate any additional insight into this situation.

Gafling
 
Gafling:

You are so right about moving scripts between Bash and Korn being a major pain. Your ECHOB & ECHOC approach is a good solution for the echo problem. A peer of mine suggested a function:

Code:
  unset echo_n
  if test "X`echo -n`" = "X-n"; then
      echo_n() { echo "$@\c"; }   # unix
  else
      echo_n() { echo -n "$@"; }  # linux
  fi

  string="a line of text"
  echo_n $string

If the test on 'echo -n' succeeds (echo prints a literal -n), we know our system is non-linux; if it fails, echo understands -n and we are on linux.

I guess you really need to test your scripts when moving between the two environments. I don't have time to elaborate, but the internal field separator, IFS, behaves differently under bash vs. korn.
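A quick way to see exactly what IFS holds under each shell (just a sketch; od makes the whitespace characters visible):
Code:
  # Run under both bash and ksh and compare the output
  echo "$IFS" | od -c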

Not only the shell, but some of the commands can differ as well. The tr command, unless it's GNU tr, appears to be non-POSIX compliant on unix systems like SCO Open Server V and Solaris 7; it doesn't handle character classes correctly.
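For example (a sketch; exact behavior of a native tr varies by platform), the old bracketed-range form tends to travel better than POSIX character classes when a script may land on a pre-GNU tr:
Code:
  # Uppercase a string two ways
  echo "$string" | tr '[a-z]' '[A-Z]'           # old-style ranges; widely portable
  echo "$string" | tr '[:lower:]' '[:upper:]'   # POSIX classes; GNU tr and newer systems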

Regards,


Ed
 
