sbroomfield
Programmer
Can anybody help me?
I have a large file that I am going to bcp into a database.
To ensure that the file loads correctly, each entry needs to be unique. I have run:
sort inputfile.txt | uniq
but the first two lines of the file are not collapsed into one, even though all the other duplicates have been eliminated.
The file looks like:
20001206,02:32:52,CLOSED,0,6958368,-1,-1,-1,-1,-1,-1,0,-1,4150573
20001206,02:32:52,CLOSED,0,6958368,-1,-1,-1,-1,-1,-1,0,-1,4150573
20001206,02:33:22,CLOSED,0,6961976,99,0,0,2393,1487,0,0,-1,4150566
20001206,02:33:53,CLOSED,0,6962168,100,0,0,1908,1466,0,0,-1,4150566
20001206,02:34:23,CLOSED,0,6962152,100,0,0,2277,1472,0,0,-1,4150565
20001206,02:34:53,CLOSED,0,6962152,100,0,0,1887,1410,0,0,-1,4150565
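One possible explanation is that the two lines are not byte-for-byte identical: a trailing space or a carriage return (from a file with DOS line endings) is invisible when printed but makes `uniq` treat the lines as different. A sketch of how this could be checked and fixed, using hypothetical sample data standing in for inputfile.txt:

```shell
# Build a tiny sample where the second line carries a trailing
# carriage return (\r); the lines print identically but differ.
printf '20001206,02:32:52,CLOSED,0\n'   >  sample.txt
printf '20001206,02:32:52,CLOSED,0\r\n' >> sample.txt

# uniq does not merge them: 2 lines remain.
sort sample.txt | uniq | wc -l

# cat -A makes the difference visible: each line ends with $,
# and a carriage return shows up as ^M before it.
cat -A sample.txt

# Strip carriage returns first, then dedupe (sort -u is
# equivalent to sort | uniq): now only 1 line remains.
tr -d '\r' < sample.txt | sort -u | wc -l
```

The same `tr -d '\r' < inputfile.txt | sort -u` pipeline would clean the real file if a stray carriage return turns out to be the cause; if `cat -A` shows trailing spaces instead, `sed 's/[[:space:]]*$//'` before sorting would handle those.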