I have a text file with the following contents that will be loaded into a database table:
600193439^_600076830^_600193439^_2^^
600193430^_600076827^_600193430^_6^^
600192222^_600076830^_600191112^_2^^
600333333^_600076830^_600193111^_2^^
...
...
The table has columns 2 and 4 as the primary key, so I want to merge the duplicate entries (same values in columns 2 and 4) in the text file, keeping only the last entry.
In the above example, the output file should be:
600333333^_600076830^_600193111^_2^^
600193430^_600076827^_600193430^_6^^
so that loading the file into the database does not break.
I also want to write all duplicate entries into a separate file to research later.
In the above example, the research file should be:
600193439^_600076830^_600193439^_2^^
600192222^_600076830^_600191112^_2^^
600333333^_600076830^_600193111^_2^^
I sorted the file using the sort command, but I am stuck after that:
sort -t'^_' -k2,2 -k4,4 "$INPUTFILE"
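After the sort, the best I could put together is an awk pass along the lines below. It is untested, so treat it as a rough sketch: the file names merged.txt and research.txt are just placeholders, and I am assuming the ^_ shown above is the literal two-character caret + underscore sequence. If it is really a single control character (the unit separator displays that way in some editors), the FS line would need to change.

awk -v out="merged.txt" -v research="research.txt" '
BEGIN { FS = "\\^_" }                       # split on the literal two-character ^_ sequence
{
    key = $2 SUBSEP $4                      # composite key: columns 2 and 4
    if (!(key in count)) order[++n] = key   # remember the order in which keys first appear
    count[key]++
    lines[key, count[key]] = $0             # keep every line seen for this key
}
END {
    for (i = 1; i <= n; i++) {
        key = order[i]
        print lines[key, count[key]] > out      # the last entry for each key wins
        if (count[key] > 1)                     # key occurred more than once:
            for (j = 1; j <= count[key]; j++)
                print lines[key, j] > research  # write all of its entries to the research file
    }
}' "$INPUTFILE"

With this the input would not even need to be sorted first, since awk groups the lines by key in memory. The trailing ^^ ends up attached to column 4, but since it is identical on every line it should not change the grouping. Is this a sane approach, or is there a simpler way?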
Any help is highly appreciated.