Hi everybody:
I have files where some values are -9999, and I would like to average certain columns in blocks of five values, but skip any value that is -9999.
I tried this:
awk '{if ($5==-9999) skip; {s+=$5;++c}};{avg=s/5};{if(c==5){print $1, $2, $3, avg ; c=0; s=0}}' $fitxer.tmp2 > $fitxer.cl31...
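For reference, a minimal sketch of one way this could look (the file names are placeholders, and dividing by the number of valid values in each block, rather than by 5, is my assumption):

```shell
# Sketch, not the original script: average column 5 in blocks of 5
# input lines, ignoring -9999 entries; divides by the count of valid
# values in each block. File names are placeholders.
awk '
$5 != -9999 { s += $5; c++ }       # accumulate valid values only
NR % 5 == 0 {                      # every 5th line closes a block
    if (c > 0) print $1, $2, $3, s / c
    s = 0; c = 0
}' input.tmp2 > output.cl31
```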
...0.29531656E+05 0.29406262E+05 0.52036867E-01 -0.46166348E+02 0.78046381E+03
I would like to manipulate the data in the second column; for example, if I do:
$2*1000
Actually, I do not want the scientific format.
I get no output. Could somebody give me an idea?
Thanks in advance
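If it helps, awk's printf can scale a field and force fixed-point output at the same time; a sketch (the file name, column layout, and %.3f precision are my assumptions):

```shell
# Sketch: multiply column 2 by 1000 and print it in fixed-point
# notation instead of scientific format; %.3f precision is a guess.
awk '{ printf "%s %.3f %s %s %s\n", $1, $2 * 1000, $3, $4, $5 }' data.txt
```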
...f[$3]=sprintf("%s %s %s %3.3f %1.3f",$1,$2,$3,$4,$5)
if(NR==FNR)
{
if (f[$3]!="")
n[$1]=sprintf("%s %2.2f %3.2f %2.3f",f[$3],$7,$8,($8/100)*(6.1121*exp((17.502*$7)/($7+240.97))))
}
}
END{
for (i in n)
print n[i]
}'
but the output does not seem correct, because it is very strange that in two...
Hi everybody:
I have this new problem.
I have two files, each of them tab-separated. The first has 5 columns, where the first is the date, the second the hour and the third the minute. The other one has 13 columns, and its first three columns are the same as in the first file. My question is, how...
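One common pattern for this kind of merge is to index the first file by its key columns and look the key up while reading the second; a sketch, under my assumption that rows should be matched on the first three columns (file names are placeholders):

```shell
# Sketch: index file1 by date/hour/minute, then print each file2 row
# that shares the same key, together with the stored file1 row.
awk -F'\t' '
NR == FNR { row[$1 FS $2 FS $3] = $0; next }   # first file: remember rows
($1 FS $2 FS $3) in row {                      # second file: match the key
    print row[$1 FS $2 FS $3], $0
}' file1 file2
```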
Hi everybody:
Does somebody know of, or have, a simple script which can calculate the integral of a dataset using the trapezoidal rule?
I have a file like:
0.490 952.296 284.19 11.3169 0.002481 0.488 0.01
0.500 965.57 288.25 11.3395 2.48E-03 0.048 0.01
0.510 964.47 288.25 10.8759 0.002479 0.036...
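A minimal trapezoidal-rule sketch in awk, assuming column 1 is x, column 2 is y, and the rows are sorted by x (which columns to integrate is my guess):

```shell
# Sketch: trapezoidal rule over (x, y) = (column 1, column 2);
# assumes the rows are sorted by x.
awk '
NR > 1 { area += ($1 - px) * ($2 + py) / 2 }   # one trapezoid per pair of rows
{ px = $1; py = $2 }                           # remember the previous point
END { print area }' data.txt
```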
Thanks for your reply. I tried to do it as you said, but paste does not "paste" the files in order.
I would like paste to do it like this:
paste file1 file2 file3 file10 file14 .... so on.
:D
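In case it is useful: with GNU coreutils, sort -V (version sort) orders names like file10 after file3, so the files can be handed to paste in numeric order; a sketch (file names are placeholders, and this simple form breaks on names with spaces):

```shell
# Sketch: version sort (-V, GNU coreutils) puts file10 after file2,
# unlike the plain lexical order of a glob.
paste $(ls file*.dat | sort -V) > combined.dat
```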
...name10.dat name11.dat name30.dat
I would like to create a single file like:
name_total.dat
in order
I have tried these two options:
for i in dn_rad_flux*.dat
do
cat $i >> dn_rd.dat
done
and
for i in dn_rad_flux*.dat
do
paste $i >> dn_rd.dat
done
Because I would like each file to be...
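For what it's worth, the two tools behave differently here: cat stacks the files one after another, while paste only places them side by side when it receives all of them in a single call (running it once per file inside a loop just copies each file). A sketch:

```shell
# Sketch: cat appends the files vertically; paste, given all the
# files at once, combines them side by side with tabs.
cat   dn_rad_flux*.dat > dn_rd_stacked.dat
paste dn_rad_flux*.dat > dn_rd_columns.dat
```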
Thanks a lot for the replies. I have solved the problem.
At the beginning of the script, I changed:
#!/bin/sh
to
#!/bin/bash
And now it works correctly. Best regards to everybody.
Hi:
Thanks for your reply.
# You use bash, right ? ---- Yes I do.
# Will not work in ksh.
lines=( "1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "21" "31" "41" "51" "55" "57" "58" )
# Why this line ? Remove it.
# I did it now.
# ${lines[@]}
# Use distinct identifiers. This is not Perl.
# Of...
Hi everybody:
I am trying to print, into a new file, selected lines from another file, depending on the first column.
I have written a script like this:
lines=( "1" "2" "3" "4" "5" "6" "7" "8" "9" "10" "11" "21" "31" "41" "51" "55" "57" "58" )
${lines[@]}
for lines in ${lines[@]}
do
awk...
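One common alternative is a single awk pass over the file, with the wanted first-column values passed in as a variable, instead of a shell loop (the list and file names below are placeholders):

```shell
# Sketch: select rows whose first column is in a given list,
# in one awk pass instead of running awk once per value.
wanted="1 2 3 4 5 6 7 8 9 10 11 21 31 41 51 55 57 58"
awk -v list="$wanted" '
BEGIN { n = split(list, a, " "); for (i = 1; i <= n; i++) keep[a[i]] = 1 }
$1 in keep' file1 > file2
```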
Hi everybody:
I have an awk script which calculates the total sum of each column, like this:
awk 'BEGIN{OFS=" "}{s2+=$2;s3+=$3;s4+=$4;s5+=$5;s6+=$6;s7+=$7;s8+=$8;s9+=$9;
s10+=$10;s11+=$11;s12+=$12} END{print s2, s3, s4, s5, s6, s7, s8, s9, s10, s11, s12}' file1 > file2
so I would like to reduce the...
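One way to shorten it is to loop over the field numbers in an array; a sketch (it assumes every line has the same number of fields, since NF in the END block comes from the last line read):

```shell
# Sketch: sum fields 2..NF with a loop instead of one variable per
# column; assumes all lines have the same field count.
awk '
{ for (i = 2; i <= NF; i++) s[i] += $i }
END { for (i = 2; i <= NF; i++) printf "%s%s", s[i], (i < NF ? OFS : ORS) }
' file1 > file2
```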
Hi everybody:
Could anybody tell me how I can print selected rows from a file with awk? In my case I only want to print, into another file, the rows from NR=8 to NR=2459, with an increment of 8 each time.
I tried this:
awk '{for (i=8; i=2459; i+=8); NR==i}' file1 > file2
and
awk 'NR%8==0 &&...
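Both conditions can go into a single awk pattern; a sketch, assuming the wanted lines are 8, 16, 24, ..., up to 2459 (file names are placeholders):

```shell
# Sketch: print every 8th line, stopping after line 2459.
awk 'NR % 8 == 0 && NR <= 2459' file1 > file2
```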
Sorry:
But I think the question could be simpler.
In each block, I have seen that manually pressing Delete ("supr") once at the end of a line joins the next line to the previous one. So I would like to know how I can eliminate the final carriage return at the end of each line.
Thanks. :D
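Depending on what "carriage return" means here, two sketches (file names are placeholders): if the file has DOS-style \r\n line endings, strip the \r characters; if the goal is instead to join the lines themselves, replace the newlines.

```shell
# Sketch: two interpretations of "remove the final carriage return".
tr -d '\r' < infile > no_cr.out        # strip DOS \r characters
paste -s -d' ' infile > one_line.out   # join all lines with spaces
```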
Hi everybody:
I have a problem. I have output files which have this pattern:
number1
--space
block1a - 7rows/10columns/65elements
--space
block1b - 7rows/10columns/65elements
--space
block1c - 7rows/10columns/65elements
--space
number2
--space
block2a - 7rows/10columns/65elements
--space...
Hi everybody:
I have a new problem: I have a file where the data is stored like this:
1.000E+02 6.549E+01 4.252E+01 2.755E+01 1.786E+01 1.163E+01 7.660E+00 5.140E+00 3.550E+00 2.560E+00 1.950E+00 1.570E+00 1.330E+00 1.190E+00 1.100E+00 1.040E+00 1.010E+00 9.800E-01 9.600E-01...