I have a large data set and I need to replace values in it with a number, selected by a text search. However, I want to keep the same format as in the original file.
Here is a small example of the file:
15 1.000(10e12.4) -1 CONSTANT
0.0000e+00...
I did; however, there is only one space between them, not two.
What I mean is: 1.4500e+04 1.4500e+04 1.4500e+04
It should be: 1.4500e+04  1.4500e+04  1.4500e+04
There should be two spaces between them.
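A fixed printf format in awk can force exactly two spaces between the e-format fields, independent of the input spacing. This is only a minimal sketch, assuming a POSIX shell and awk (the same program should also work under awk95):

```shell
# Force exactly two spaces between e-format values via a literal
# separator string; values arrive whitespace-separated on stdin.
printf '1.45e4 1.45e4 1.45e4\n' |
awk '{ for (i = 1; i <= NF; i++) printf "%s%.4e", (i > 1 ? "  " : ""), $i; print "" }'
```

The literal two-space string between fields is what guarantees the spacing.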
Hi Feherke,
It works excellently! I have already tested it on my 1 GB dataset.
However, I have a question about a formatting issue.
Is it possible for you to manipulate the code a little so that the data is in Fortran format, as in the data set? I.e., it should look like this:
11 1.000(10e12.4)...
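For what it's worth, the Fortran edit descriptor `(10e12.4)` means ten values per line, each 12 characters wide with 4 digits after the decimal point, and awk's `printf "%12.4e"` reproduces that field width. A minimal sketch, assuming POSIX awk:

```shell
# Print values ten per line in Fortran (10e12.4) style: field width 12,
# four decimals, scientific notation.
printf '1 2 3 4 5 6 7 8 9 10 11 12\n' |
awk '{ for (i = 1; i <= NF; i++) { printf "%12.4e", $i; if (i % 10 == 0) print "" }
       if (NF % 10) print "" }'
```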
Hi feherke,
The code does not work on the large data set. I am attaching a small part of the dataset. If you search for "Hydraulic Conductivity Layer 1", the data sets below that line need to be changed until "Bottom Layer 1" is reached.
I have a total of 20 such sets in one large file...
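One common way to limit the change to each such set is awk's range pattern, `/start/,/end/`, using the two marker strings quoted above. A minimal sketch, assuming POSIX awk; the `IN:` tagging is just a stand-in for the real replacement logic:

```shell
# Restrict processing to the block between the two marker lines;
# the "IN:" prefix stands in for the actual edit.
printf 'before\nHydraulic Conductivity Layer 1\ndata\nBottom Layer 1\nafter\n' |
awk '/Hydraulic Conductivity Layer 1/,/Bottom Layer 1/ { print "IN:", $0; next }
     { print }'
```

The range pattern re-arms after each `Bottom Layer 1`, so all 20 sets in one file are handled.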
I have a large file with different information, but in a structured fashion. I just
want to change the values of one structured block. A small example will show exactly what I am talking about:
Example set:
0 0 -99 1
22465
1 2
5 6 7
11 1.000...
I am trying to subtract a value of 25 from the cells of a matrix.
Example:
2 3 6 9 12
-3 4 7 -2 -6
3 10 4 8 14
After subtracting 25 from the cells of the above matrix, it should be:
-23 -22 -19 -16 -13
-28 -21 -18 -27 -31
-22 -15 -21 -17 -11
How would I do this in awk95? Any help...
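A minimal sketch of the subtraction, assuming a POSIX shell and awk (awk95 should accept the same program):

```shell
# Subtract 25 from every cell of a whitespace-separated matrix.
printf '2 3 6 9 12\n-3 4 7 -2 -6\n3 10 4 8 14\n' |
awk '{ for (i = 1; i <= NF; i++) printf "%s%d", (i > 1 ? " " : ""), $i - 25; print "" }'
```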
I am sending this email, which I had sent earlier, but with a slightly different approach. I have two large matrices with the same dimensions. I want to compare the matrix 1 file with the matrix 2 file.
Now I just want to subtract the cells of matrix 1 from matrix 2.
Example
Matrix 1
2 3 6 9 12
-3 4 7 -2...
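One way to walk two equally-sized matrix files in lockstep is `getline` from the second file while awk reads the first. A minimal sketch; the file names `m1.txt` and `m2.txt` are invented stand-ins for the two matrix files:

```shell
# Cell-wise subtraction of matrix 1 from matrix 2, row by row.
printf '2 3 6 9 12\n' > m1.txt
printf '10 10 10 10 10\n' > m2.txt
awk '{ getline line < "m1.txt"; split(line, a)
       for (i = 1; i <= NF; i++) printf "%s%d", (i > 1 ? " " : ""), $i - a[i]
       print "" }' m2.txt
```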
I have two large matrices with the same dimensions. I want to compare the matrix 1 file with the matrix 2 file.
Where a cell in the matrix 2 file is greater than in the matrix 1 file, change that cell in matrix 2 by subtracting -5.
Example
Matrix 1
2 3 6 9 12
-3 4 7 -2 -6
3 10 4 8 14
Matrix 2
3...
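A minimal sketch of the conditional version, again pairing rows with `getline`. The file names `m1.txt`/`m2.txt` are invented, and the post's "subtracting by -5" is read here as subtracting 5, which is an assumption:

```shell
# Where a matrix 2 cell exceeds the matching matrix 1 cell, subtract 5
# from it (interpreting "subtracting by -5" as subtracting 5).
printf '2 3 6 9 12\n' > m1.txt
printf '3 1 10 9 20\n' > m2.txt
awk '{ getline line < "m1.txt"; split(line, a)
       for (i = 1; i <= NF; i++) { v = $i; if (v > a[i]) v -= 5
           printf "%s%d", (i > 1 ? " " : ""), v }
       print "" }' m2.txt
```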
I am trying to add commas between the columns of my data. I tried to bring it into Excel; however, the data size is too large.
My data is in this format:
1678.3 2.567 3.1267
4562.1 367.4 7.98
etc
I want the data to be in this format:
1678.3,2.567,3.1267
4562.1,367.4,7.98
Thanks.
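A minimal sketch, assuming POSIX awk: reassigning a field (`$1 = $1`) makes awk rebuild the record with the output field separator, so setting `OFS=','` does the job:

```shell
# Rebuild each record with a comma as the output field separator.
printf '1678.3 2.567 3.1267\n4562.1 367.4 7.98\n' |
awk '{ $1 = $1; print }' OFS=','
```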
Hi friends,
I need help determining the column averages of a large matrix, which has 328 rows and 368 columns.
I am including an example of what I am trying to achieve:
Example matrix (3 by 4):
3 4 8 9
1 6 9 6
2 3 4 8
The average of the first column (3, 1, and 2) is: 2
The average of the second column (4, 6, 3)...
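A minimal sketch of per-column averages, assuming every row has the same number of columns and a POSIX awk:

```shell
# Accumulate per-column sums, then divide by the row count at END.
printf '3 4 8 9\n1 6 9 6\n2 3 4 8\n' |
awk '{ for (i = 1; i <= NF; i++) sum[i] += $i }
     END { for (i = 1; i <= NF; i++) printf "%s%g", (i > 1 ? " " : ""), sum[i] / NR
           print "" }'
```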
I want to break a large file into multiple files based on the first two columns:
For example my file is as follows:
87.73,49.62,-45.000000,0,-261581.000000,0.000000,1,1
87.73,49.62,-45.000000,0,-261581.000000,0.000000,2,2
87.73,49.62,-45.000000,0,-261581.000000,0.000000,3,3...
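A minimal sketch that routes each record to a file named after its first two columns; the `col1_col2.txt` naming scheme is made up for the example:

```shell
# Route each record to a file named after its first two comma-separated
# columns; close each file after writing to keep descriptor use low.
printf '87.73,49.62,-45.000000,0,-261581.000000,0.000000,1,1\n88.00,50.00,-45.000000,0,-261581.000000,0.000000,2,2\n' |
awk -F',' '{ out = $1 "_" $2 ".txt"; print >> out; close(out) }'
```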
I have a large dataset, and I want to break the file into smaller files after a certain number of rows.
Example, my dataset looks like:
G-1222
1 OBSERVED
0.0000 0.0000
583 COMPUTED
1 3
7 9
15 12
G-1736
1 OBSERVED
0.0012 0.000
583 COMPUTED
5 7
3 6
6 8
The final output will be in the form...
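One reading of this layout is that each `G-` header starts a new record group, so a new output file can be opened at every such line. A minimal sketch; the `part_N.txt` names are an assumption, not from the post:

```shell
# Open a new output file at every "G-" header line.
printf 'G-1222\n1 OBSERVED\nG-1736\n1 OBSERVED\n' |
awk '/^G-/ { if (out) close(out); out = "part_" ++n ".txt" } { print > out }'
```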
Hi futurelet,
Thanks for the code. It works fine. If I want to use the same code to extract the first 500 lines instead, what command should I replace TAIL with?
Thanks.
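For the record, the first-N-lines case is simpler than the tail case: `head` does it directly, and an awk version can exit as soon as the count is reached. A minimal sketch, assuming a POSIX shell; `big.txt` is a stand-in for the large file:

```shell
# First 500 lines: head directly, or an awk version that exits early
# instead of reading the whole file.
seq 1000 > big.txt
head -n 500 big.txt > first500.txt
awk 'NR > 500 { exit } { print }' big.txt > first500_awk.txt
```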
I have a large dataset and I cannot open the file, so I just want to read the last 500 lines of it. I am sure there is a simple command in awk that will do this. Could anyone please help me with this?
Thanks.
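A minimal sketch of a tail-like awk approach: keep a rolling buffer of the last 500 lines keyed by `NR` modulo the window size, then print it back in order at `END`. The `big.txt` test file is made up; on most systems plain `tail -n 500` is the simpler tool:

```shell
# Last 500 lines via a rolling buffer; only 500 lines are held at once.
seq 1000 > big.txt
awk '{ buf[NR % 500] = $0 }
     END { for (i = NR - 499; i <= NR; i++) if (i > 0) print buf[i % 500] }' big.txt > last500_awk.txt
tail -n 500 big.txt > last500.txt      # the usual one-liner, for comparison
```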
Hi PHV,
I have written my input file as TEST.DAT
Here is the input file:
1 2 3 A B
5 6 1 2 3
0 0 CONTENTS 1 0 2
1 2 3 4 5
1 3 T B 6
0 0 CONTENTS 1 0 2
P Q W 4 7
0 0 CONTENTS 1 0 2
I have written your code in a TEST.AWK file:
{ !/^CONTENTS/ }
I then ran awk95 from the directory which...
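For reference, wrapping the pattern in braces turns it into a no-op expression statement, and `^` anchors the match to the start of the line even though `CONTENTS` appears mid-line in the sample data. A bare, unanchored negated pattern does the filtering; this is a sketch assuming awk95 follows standard awk behavior here:

```shell
# A bare negated pattern: the default action prints every line for
# which the pattern is true, i.e. lines not containing CONTENTS.
printf '1 2 3 A B\n0 0 CONTENTS 1 0 2\nP Q W 4 7\n' |
awk '!/CONTENTS/'
```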