I have:
5 columns in file 1
11 columns in file 2
file 1 has more lines than file 2
cols 1, 2, 4 match info between the two files
based on the criteria of matching cols I want to extract cols 3, 5 from file 1 and add them to file 2
I think this code is close to what I'm looking to do
awk...
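The usual awk pattern for this kind of keyed join reads file 1 first, stores cols 3 and 5 under a key built from cols 1, 2, 4, then appends them to the file 2 lines whose cols 1, 2, 4 match. A sketch with made-up sample data (it assumes the matching columns sit in the same positions in both files):

```shell
# Hypothetical sample data: file1 has 5 cols, file2 has 11.
cat > file1 <<'EOF'
a b X c Y
d e P f Q
EOF
cat > file2 <<'EOF'
a b z c 5 6 7 8 9 10 11
x y z w 5 6 7 8 9 10 11
EOF

# Pass 1 (NR==FNR is true only while reading file1): remember cols 3 and 5
# keyed on cols 1,2,4.  Pass 2: append them to matching file2 lines.
awk 'NR==FNR { v[$1,$2,$4] = $3 OFS $5; next }
     ($1,$2,$4) in v { print $0, v[$1,$2,$4] }' file1 file2
```

Lines of file 2 with no match in file 1 are silently dropped here; printing `$0` unconditionally instead would keep them without the extra columns.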
it is a Sybase db, and looking on the web (http://denis.sri.com/Oracle/Translator.html) the NVL equivalent is ISNULL(), which I tried to no avail
in my example, where I wrote blank there is no value, but I believe it is not NULL, as the above command doesn't work
I tried
select ISNULL(col1, col2) from table
I have 2 columns from a SQL extract, and in each row only one of the two columns has a value. I need to be able to combine the columns so there are no blank values.
example
column1 column2 column3 column4 column5
blank abcdef 123 john lola
ghijkl blank 456 don sue
mnop blank 789 cliff jane
blank...
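If the "blank" cells are empty strings rather than real NULLs, that would explain why ISNULL() never fires. A CASE expression covers both cases; this is a sketch against the example columns above, with a hypothetical table name:

```sql
-- Treat both NULL and '' as "no value"; fall back to the other column.
SELECT CASE WHEN column1 IS NULL OR column1 = '' THEN column2
            ELSE column1
       END AS combined,
       column3, column4, column5
FROM   mytable
```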
I'm trying to use one script on multiple boxes with the same username to back up crontabs.
Here is what I have; the problem is that the hostname variable won't output to the name of the file
#!/bin/ksh
CHOST=$HOSTNAME
CDATE=`date "+%Y%m%d"`
/usr/bin/crontab -l >...
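One likely cause, worth ruling out first: $HOSTNAME is a variable that bash sets, but ksh does not, so in a #!/bin/ksh script it expands to nothing. Calling the hostname command works in either shell. A sketch (the file-name scheme is hypothetical, and the crontab line is commented out so the sketch is safe to run):

```shell
#!/bin/ksh
CHOST=`hostname`                      # ksh does not set $HOSTNAME; ask the command
CDATE=`date "+%Y%m%d"`
OUTFILE="crontab.${CHOST}.${CDATE}"   # hypothetical naming scheme
# /usr/bin/crontab -l > "$OUTFILE"
echo "$OUTFILE"
```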
How can I combine query 1 and query 2 into one big query?
query 1:
SELECT name, crdate from sysobjects where type='U' and name like '%19303%' ORDER BY crdate desc
result 1:
name crdate
11608#M0DAA2D58_TMP 2010-01-06 18:15:23.606
11608#M27042D58_TMP 2010-01-06 18:15:23.606...
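Query 2 is cut off above, but if it is another SELECT returning the same column list, UNION (or UNION ALL, which keeps duplicates) is the usual way to combine them, and only the final query may carry the ORDER BY. A hypothetical sketch with an invented second filter standing in for query 2:

```sql
SELECT name, crdate FROM sysobjects
WHERE type = 'U' AND name LIKE '%19303%'
UNION ALL
SELECT name, crdate FROM sysobjects
WHERE type = 'U' AND name LIKE '%99999%'   -- placeholder for query 2's condition
ORDER BY crdate DESC
```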
I have 2 files and I want to compare them
I currently cat the files, awk print $1, $2, and do: if file1 = file2 then fail, else exit 0
what I want to do is compare values: with column 1 being a reference, I want to compare line by line and then do:
the values not matching is a success
the values...
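A sketch of that comparison with made-up sample data, assuming two whitespace-separated columns with column 1 as the key (per the description above, a mismatch is the success case):

```shell
# Hypothetical sample files keyed on column 1.
cat > f1 <<'EOF'
id1 100
id2 200
EOF
cat > f2 <<'EOF'
id1 100
id2 250
EOF

# Remember f1's column 2 per key, then report keys whose value differs in f2.
diffs=$(awk 'NR==FNR { v[$1] = $2; next }
             ($1 in v) && v[$1] != $2 { print $1 }' f1 f2)

if [ -n "$diffs" ]; then
    echo "changed keys: $diffs"   # values not matching => success per the post
else
    echo "all values match"       # everything matched => the fail case
fi
```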
exsnafu,
I solved part 2 of the script problem, where it was also trying to change the 0.txt files as well. I changed the search criteria:
for file in `ls *_*_!(0).txt`;
do
new=`echo $file|awk -F_ '{print $1}'`
echo $file $new_1.txt
done
now the 1st issue still...
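The remaining blank-name symptom is consistent with the unbraced variable in the loop above: the shell reads `$new_1.txt` as the (unset) variable `new_1` followed by `.txt`, not as `$new` plus `_1`. Braces delimit the name:

```shell
new=abc
echo "$new_1.txt"     # expands the unset variable new_1 -> prints .txt
echo "${new}_1.txt"   # braces delimit the name       -> prints abc_1.txt
```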
I'm running an sh script on SunOS
SunOS 5.10 Generic_118833-36 sun4u sparc SUNW,Sun-Fire-V490
here is the content of the script:
cd dir
ls *_*.txt | nawk '!/_[01]\.txt$/{f=$1;sub(/_.*\./,"_1.",f);printf "mv %s %s\n",$1,f}' | sh
exit 1
result set of script fix_awk.sh:
$ fix_awk.sh
awk: syntax...
exsnafu,
thanks for the input; however, there are 2 issues occurring when running this.
1) The echo shows the files would be replaced by (blank).txt
2) It is trying to replace files that are *_0.txt, which should not be happening. Per the original description those files are ok.
Result set...
I'm running /usr/bin/ksh
I ran both suggestions; both errored out...
for fehreke I got the following error after I removed the echo:
"${f/_*./_1.}": bad substitution
for PHV I got the following error:
awk: syntax error near line 1
awk: illegal statement near line 1
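The "bad substitution" points at `${f/_*./_1.}`: the `${var/pattern/replacement}` form is a ksh93/bash feature, and the ksh on many Solaris boxes is ksh88, which lacks it. A portable sketch does the same rewrite with sed, assuming the unwanted suffix is a run of digits:

```shell
f=abc_0123.txt
# ksh88 has no ${var/pat/rep}; pipe through sed instead.
new=`echo "$f" | sed 's/_[0-9]*\.txt$/_1.txt/'`
echo "$new"    # abc_1.txt
```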
Hi
I'm stuck on the following issue:
I have a directory where the correct file names are:
*_0.txt
*_1.txt
Then through a publisher some incorrect file names are produced:
*_12.txt
*_0123.txt
*_04321.txt
all of the incorrect files above need to be renamed to end in *_1.txt
therefore...
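Put together, a sketch of the whole rename (the sample file names are made up; *_0.txt and *_1.txt are skipped as already correct, and anything else with a digit suffix becomes *_1.txt):

```shell
# Work in a scratch directory with hypothetical sample files.
mkdir -p renamedemo && cd renamedemo
touch a_0.txt a_1.txt b_12.txt c_0123.txt d_04321.txt

for f in *_*.txt; do
    case $f in
        *_0.txt|*_1.txt) continue ;;   # correct names, leave alone
    esac
    new=`echo "$f" | sed 's/_[0-9]*\.txt$/_1.txt/'`
    mv "$f" "$new"
done
ls
```

One caveat: if two bad names share a stem (say b_12.txt and b_345.txt), both map to b_1.txt and the second mv clobbers the first, so a collision check may be needed in production.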
2 java jobs are called in a ksh script, and the error trapping won't work when one or both fails
A couple of additional points:
- this used to work on HP but doesn't after we migrated to Solaris
- this type of error trapping works on Solaris for other scripts calling one job
- we cannot make the jobs run...
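Without seeing the script, one common trap is worth ruling out: $? holds only the status of the immediately preceding command, so with two jobs each status has to be captured right away, before anything else (even an echo) overwrites it. A sketch with stand-in commands, since the real java invocations are not shown in the post:

```shell
#!/bin/ksh
sh -c 'exit 0'   # stand-in for java job 1 (succeeds)
rc1=$?
sh -c 'exit 3'   # stand-in for java job 2 (fails)
rc2=$?

if [ "$rc1" -ne 0 ] || [ "$rc2" -ne 0 ]; then
    echo "job failure: rc1=$rc1 rc2=$rc2" >&2
    status=1
else
    status=0
fi
echo "status=$status"
```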
I'm trying to execute a shell script in crontab; however, the script needs an option to be specified to work. I don't know how to get this to work.
- I have called the shell script with the option as an entry in the crontab
- I have nested the shell script with the option in another shell script and called...
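For reference, an option is passed in a crontab entry the same way as on the command line; what usually breaks instead is cron's minimal environment (short PATH, no profile), so absolute paths and redirected output help diagnose it. A sketch of an entry, with the path, schedule, and option name all hypothetical (note that a literal % would need to be escaped as \% in crontab):

```
# min hour dom mon dow  command
0 2 * * * /full/path/to/myscript.sh -b >> /tmp/myscript.log 2>&1
```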