
ignoring dups

Status
Not open for further replies.

jvmerc

Programmer
Jun 7, 2006
I have a large file with groups of looping records. I need to ignore duplicate loops based on the first record of each loop.

The duplicate records aren't sent in any particular order and usually aren't sequential, so I can't just sort.

I tried the 'unique' function, but the loops are all unique; it's their first record that isn't.

I think this should be simple but I can't figure it out.

Using 6.7.1 on Windows.

 
Can you give an example of the file structure?

It sounds like you need to build a list of unique loop headers and then use the lookup function to find the first matching loop. This would be pretty slow for a big file, though.

Cheers,
Olly.
 
You might need to pre-process the data: send it to a file, sort it, etc. If this were UNIX you could script it, or if you have Perl on Windows you might be able to do the same there.

Would need a small sample of the data to get a better idea of what is needed.



BocaBurger
<===========================||////////////////|0
The pen is mightier than the sword, but the sword hurts more!
 
The file looks like this:

initiator 1 + data
initiator 2 + data
initiator 1 + data
initiator 2 + data
initiator 2 + data
initiator 2 + data
initiator 1 + data
initiator 2 + data
initiator 1 + data
initiator 2 + data

I talked with a coworker and sort will work. He said to use search or lookup, but I hoped for something faster: the sort takes 12 minutes and lookup adds more time on top of that. For some reason search is dropping records.
 
I extracted all records with initiator '1' to an extract file, and now use the original file plus the extract file.

I have a map that compares each record loop to the extract file using searchdown. It works well with small files, but once the extract file grows past 25 MB, processing grinds to a halt.

It takes hours to process a file with a 36 MB extract file.
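For what it's worth, outside the map this can be done in a single pass with an in-memory set of loop headers instead of a searchdown against the extract file: each header check is constant time, so the runtime stays linear in the file size rather than rescanning the extract file for every loop. A minimal Python sketch, assuming one record per line and that a line starting with "initiator 1" opens a loop (as in the sample above); the function name and file paths are made up:

```python
def dedupe_loops(in_path, out_path):
    """Copy in_path to out_path, keeping only the first loop for each header."""
    seen = set()   # header records of loops already written
    keep = False   # are we inside a loop we want to keep?
    with open(in_path) as src, open(out_path, "w") as dst:
        for line in src:
            if line.startswith("initiator 1"):
                # a new loop starts here; keep it only if its header is new
                keep = line not in seen
                if keep:
                    seen.add(line)
            if keep:
                dst.write(line)
```

No sort is needed, and the 36 MB extract file disappears entirely; the only memory cost is one set entry per distinct header.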
 
That kind of processing is better done in a database.


BocaBurger
<===========================||////////////////|0
The pen is mightier than the sword, but the sword hurts more!
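To illustrate the database route: with Python's built-in sqlite3 module you can make the loop header a primary key and let the database discard duplicates on insert. A sketch only, under the same one-record-per-line assumption as the sample; the table and function names are invented:

```python
import sqlite3

def dedupe_loops_db(records):
    """records: iterable of lines; returns the surviving loops as one string."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE loops (header TEXT PRIMARY KEY, body TEXT)")
    header, body = None, []

    def flush():
        if header is not None:
            # INSERT OR IGNORE silently drops loops whose header is already stored
            con.execute("INSERT OR IGNORE INTO loops VALUES (?, ?)",
                        (header, "".join(body)))

    for line in records:
        if line.startswith("initiator 1"):
            flush()                       # finish the previous loop
            header, body = line, [line]   # start a new one
        elif header is not None:
            body.append(line)
    flush()
    return "".join(row[0] for row in
                   con.execute("SELECT body FROM loops ORDER BY rowid"))
```

`INSERT OR IGNORE` keeps the first occurrence of each header; switching `:memory:` to an on-disk database file would let this scale past available memory if the loops themselves are huge.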
 
