Fastest Way to Read a File


BAWC01

Technical User
Nov 23, 2005
Hi

I have to read a file containing the following:

Code:
16.11.05 00:06:02 SMP _ULD_PRIMOS 'PAG 58070BA' 0 'STORAGE' 'BA 164  15NOV05' 'BA 269  16NOV05'
16.11.05 00:06:02 SMP _ULD_PRIMOS 'PAG 58070BA' ''
16.11.05 00:06:02 SMP _ULD_ENTRY: 'PAG 58070BA' 'UAC77A07'
16.11.05 00:06:02 SMP _MSG_UNIT_UPD: 'PAG 58070BA'
16.11.05 00:06:02 SMP _UNIT_UPD: I 'PAG 58070BA' 'BA 164  15NOV05'
16.11.05 00:06:02 SMP _PARCEL_ENTRY: 'PAG 58070BA' 'BA 164  15NOV05' 0
16.11.05 00:06:02 SMP PUT_STS: allive
16.11.05 00:07:01 SMP _CYCL: 11 FN-Name: 'FLIGHT_BUILD_COMPLETE'
16.11.05 00:07:01 SMP _PRO_FLIGHT_BUILD: 0 flights
16.11.05 00:08:01 SMP _CYCL: 1 FN-Name: 'ULD_ANNOUNCED'
16.11.05 00:08:14 SMP _CASE: received fcode: 3424
16.11.05 00:08:14 SMP _ULD_ARRIVE: 'AKE 32178BA' 'UBB19B1'
16.11.05 00:08:14 SMP PUT_STS: allive
16.11.05 00:08:14 SMP PUT_STS: allive

What I'm doing at the moment is reading each part, e.g.

SMP _ULD_ARRIVE
SMP _PRO_FLIGHT_BUILD
...
...
...

using an If/Else statement, processing each line, and loading each piece of data into a separate table.

Is this the best and fastest method?

Any help would be great

Thanks

Phil

 
Do you mean this is a text file, and you're opening the text file, reading each line of data, and appending to the relevant table depending on the second field (the code beginning with "SMP")?

If I've understood correctly, then the fastest way would probably be to import the whole file into a temporary table with an index on the second field, then run a series of append/delete queries to move the data into the permanent tables.

Although this will probably be faster, it has disadvantages too: using temporary tables means the database will grow more quickly and require more regular compaction.
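
Something like this, for example (the table and field names below are made up for illustration; adjust them to your own schema):

Code:
Public Sub subMoveUldArriveRows()
    ' Assumes the raw file has already been imported into a temporary
    ' table named tblRawImport whose indexed second field is Identifier.
    ' tblULD_ARRIVE is a hypothetical permanent table.
    DoCmd.SetWarnings False
    ' Append the matching rows to the permanent table...
    DoCmd.RunSQL "INSERT INTO tblULD_ARRIVE SELECT * FROM tblRawImport " & _
                 "WHERE Identifier = '_ULD_ARRIVE:';"
    ' ...then delete them from the temporary table.
    DoCmd.RunSQL "DELETE FROM tblRawImport WHERE Identifier = '_ULD_ARRIVE:';"
    DoCmd.SetWarnings True
End Sub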

Of course I may have completely misunderstood the question. :)

Ed Metcalfe.


Please do not feed the trolls.....
 
Phil,
Ed is right; you cannot beat SQL for speed. Look at the TransferText method to import everything, then do what Ed said with append, delete, and/or make-table queries.
However, I was just doing the same thing, but my text files had about 3 million lines. It was more efficient for me to read the text files line by line and then do a series of If/Then checks on each line. I would then write the specific lines to new text files. Each separate text file could then be imported manually, or with the TransferText method, into its own table. The whole import process then took about 3 minutes for a million lines.
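
For illustration, the TransferText call looks something like this (the specification, table, and file names here are placeholders, not anything from your setup):

Code:
' Import a delimited text file into a table in the current database.
' "MyImportSpec" is an import specification saved from the import wizard;
' tblRawImport and the file path are example names only.
DoCmd.TransferText acImportDelim, "MyImportSpec", "tblRawImport", "C:\ImportData.txt", False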
 
Ed/MajP

You are both correct in assuming this is what I am doing. I evaluate each line of the text file and then, depending on what it is, read the line and pass the string to the appropriate function.

I will give Ed's suggestion a try, but one question: because the lines vary in length, how do I evaluate each line of text and then pass it into the appropriate table?

Thanks

Phil

 
If your lines have different lengths you are OK, as long as the longer lines have the same fields as the short lines plus some additional ones. In other words, the short lines have fields 1, 2, 3, 4 and the long lines have fields 1, 2, 3, 4, 5, 6, 7. But if your long lines have fields in a different order, or only some of the fields that the shorter lines have, you will not be able to import into one table and have the data make any sense. In that case you may want to consider doing what I said and building separate files first.
But assuming you can import the text file: go to the query builder and build queries from your imported table. Each query should return the records you need for one table. While in the query builder you can turn these select queries into make-table queries. For each select query, go to the menu bar and select "Query"; you will see a "Make Table" option. It will ask for a name for the new table. When you run the query it will make a new table from the fields and records you selected in that query (there is a sketch of the SQL equivalent below).
If you have problems importing into one table, I have code to make individual files prior to importing.
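
Under the hood a make-table query is just a SELECT ... INTO statement, so you can also run it from code. A rough sketch, with assumed table and field names:

Code:
' Build tblPUT_STS from the matching rows of the imported table.
' tblRawImport and Identifier are placeholder names for this example.
DoCmd.SetWarnings False
DoCmd.RunSQL "SELECT * INTO tblPUT_STS FROM tblRawImport " & _
             "WHERE Identifier = 'PUT_STS:';"
DoCmd.SetWarnings True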
 
Majp

Thanks for the info, much appreciated. Could you post your code for me to have a look? I am interested in seeing how you do it and whether I could pinch some tips.

The file contains lines of different lengths, e.g.
Code:
16.11.05 00:43:58 SMP _CAGE_ARRIVE: 722 'CBB13B2F4'
16.11.05 00:43:58 SMP PUT_STS: allive
16.11.05 00:43:58 SMP PUT_STS: allive
16.11.05 00:44:01 SMP _CASE: received fcode: 3170
16.11.05 00:44:01 SMP _ULD_PRIMOS 'PAG 52166BA' 0 'STORAGE' 'BA 9016 14NOV05' 'BA 035  16NOV05'
16.11.05 00:44:01 SMP _ULD_PRIMOS 'PAG 52166BA' ''
16.11.05 00:44:01 SMP _CASE: received fcode: 3424
16.11.05 00:44:01 SMP _ULD_ARRIVE: 'PAG 58782BA' 'UBB07B2A'
16.11.05 00:44:01 SMP PUT_STS: allive
16.11.05 00:44:01 SMP _CYCL: 3 FN-Name: 'TRUCK_ENTRY'

So I think importing it via the import wizard won't work.

Thanks

Majp
 
Let's say that certain lines are structurally the same and contain the same type of data.
I do not know how your data is organized, but let's say we want to create separate tables for these types of records:
_CASE: received fcode:
_ULD_PRIMOS
PUT_STS
Code:
Public Sub subMakeSeparateDataTables()
    Const ForReading = 1, ForWriting = 2, ForAppending = 8
    Dim fso As Object
    Dim fileImportData As Object
    Dim filePUT_STS As Object
    Dim fileCase As Object
    Dim file_ULD_PRIMOS As Object
    Dim strReadLine As String
    Dim strIdentifier As String

    Set fso = CreateObject("Scripting.FileSystemObject")
    ' Open the raw log for reading and create one output file per record
    ' type (True overwrites any existing file of the same name).
    Set fileImportData = fso.OpenTextFile("c:\ImportData.txt", ForReading)
    Set filePUT_STS = fso.CreateTextFile("c:\filePUT_STS.txt", True)
    Set fileCase = fso.CreateTextFile("c:\fileCase.txt", True)
    Set file_ULD_PRIMOS = fso.CreateTextFile("c:\file_ULD_PRIMOS.txt", True)

    Do While Not fileImportData.AtEndOfStream
        strReadLine = fileImportData.ReadLine
        ' The record type starts at column 23, after the date, time, and
        ' "SMP "; five characters are enough to tell these types apart.
        strIdentifier = Mid(strReadLine, 23, 5)
        Select Case strIdentifier
            Case "_CASE"
                fileCase.WriteLine strReadLine
            Case "PUT_S"
                filePUT_STS.WriteLine strReadLine
            Case "_ULD_"   ' note: matches every _ULD_* record, not just _ULD_PRIMOS
                file_ULD_PRIMOS.WriteLine strReadLine
        End Select
    Loop
    fileImportData.Close
    filePUT_STS.Close
    fileCase.Close
    file_ULD_PRIMOS.Close
    MsgBox "Completed: Separate input files created in C:\"
End Sub
Now you have three files that contain the same type of information and, more importantly, have the same data structures. Now you can import the three files as new tables.
Another important function to look at is the Split function. It takes a text string and breaks it into a zero-based array on a specified delimiter. For example, splitting
16.11.05 00:44:01 SMP _ULD_ARRIVE: 'PAG 58782BA' 'UBB07B2A'
on spaces gives
splitArray(0) = "16.11.05"
splitArray(1) = "00:44:01"
splitArray(2) = "SMP"
splitArray(3) = "_ULD_ARRIVE:"
...and so on for the remaining fields.
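
In code that looks something like this (note that runs of spaces, as in the flight fields, produce empty array elements you may need to skip):

Code:
Dim splitArray() As String
' Split the line on single spaces into a zero-based String array.
splitArray = Split(strReadLine, " ")
Debug.Print splitArray(0), splitArray(1), splitArray(3)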
 
Majp

Thanks for the example. I like the way you have done that and used the Split function. I will have a go at adapting it to my situation.

Many thanks for your help

Phil
 