Hi Feherke,
Here is the update.
First test:
The complete process finished in 2 hours and 40 minutes.
Second test (I re-arranged the columns going to my output file):
Completed in 3 hours and 7 minutes.
Appreciate your help on this, and a big thank you.
Rgs,
Sushil...
I've made the fix; please take a look, it might not be that efficient.
#!/usr/bin/awk -f
BEGIN {
    FS = "|"
    OFS = "|"
    # pre-compute the 15-minute time mask (1-96) for every hh:mm of the day
    for (hour = 0; hour < 24; hour++)
        for (minute = 0; minute < 60; minute++)
            timepart[sprintf("%02d%02d", hour, minute)] = int((hour * 60 + minute) / 15) + 1
...
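The rest of the script is cut off in the quote above. Purely as a hedged sketch of how the precomputed timepart lookup might be applied per record (the field position and date layout are assumptions taken from the sample row quoted further down the thread, not the original code):

# Sketch only, not the original main block.
# Assumes DATE is field 2 in "YYYY-MM-DD hh:mm:ss" form, e.g. "2015-01-01 00:01:15".
{
    hhmm = substr($2, 12, 2) substr($2, 15, 2)   # build the "hhmm" lookup key, e.g. "0001"
    $2 = $2 OFS timepart[hhmm]                   # append the mask as a new column right after DATE
    print
}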
Actually, the incoming file has 17 columns; I gave you just the first two columns.
Basically, the original file will be
ID |DATE |COL1|COL2|COL3....
1234|2015-01-01 00:01:15|ABC|DEF|123....
And we have inserted the new columns and the computation after DATE, which is OK. Those new...
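Just to illustrate, a hedged sketch of that insertion (placeholder values, not the actual computation): assigning to $2 with OFS embedded slots the new columns in right after DATE while COL1 through COL15 pass through unchanged.

# Sketch only: 17 pipe-delimited columns, ID in $1, DATE in $2.
# NEW1/NEW2 are placeholder computed values, not names from the thread.
BEGIN { FS = OFS = "|" }
{
    $2 = $2 OFS "NEW1" OFS "NEW2"   # inserted columns land between DATE and COL1
    print
}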
Thanks, and I appreciate your help. I did some testing, and the timing looks good on test data. I am going to test the
results on live data over the weekend and post the results. Sorry, but I have to make two changes: the date and
time is coming as 201706070050130 and not as 2017-06-07 05:01:30. I was able...
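For what it's worth, assuming the live data carries a compact 14-digit YYYYMMDDhhmmss stamp in the DATE field (the field position here is an assumption), the lookup key can be pulled out with substr instead of the dashed-format offsets:

# Sketch only: DATE assumed in $2 as YYYYMMDDhhmmss, e.g. 20170607050130.
# Characters 9-12 are "hhmm", which matches the keys built in the BEGIN block.
{ mask = timepart[substr($2, 9, 4)] }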
Thanks for your response; here are the answers to your questions.
1) The 3 steps can be one single awk script.
2) Timemask calculation (I used an Excel file to generate it); a quick check of the formula follows the table below.
hh:mm:ss TimeMask
00:00:00 -> 1
00:15:00 -> 2
00:30:00 -> 3
00:45:00 -> 4
01:00:00 -> 5
01:15:00 -> 6...
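The table appears to map each quarter-hour of the day to a slot number from 1 to 96, i.e. TimeMask = int((hour * 60 + minute) / 15) + 1; for example 01:15:00 gives int(75 / 15) + 1 = 6, matching the row above. A tiny standalone check, separate from the main script:

# Prints the mask for a few hh:mm values from the table above.
BEGIN {
    split("00:00 00:15 00:30 00:45 01:00 01:15", t, " ")
    for (i = 1; i in t; i++) {
        split(t[i], p, ":")
        print t[i], int((p[1] * 60 + p[2]) / 15) + 1
    }
}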
Friends,
New to the forum, posting for the first time. I'm looking for some help from the community and
would really appreciate it if you folks could provide some input and share some knowledge.
I am trying to achieve 3 things together; the source file is around 800 million records
(17 columns) and takes...