
Reading from textFile


amerbaig

IS-IT--Management
Sep 4, 2001
I have a tab-delimited text file with 70,000,000 rows. I want to read each row, do some processing (such as searching for a record based on a specific column), and then insert a record into a separate table. I need to do this until the end of the text file, using PL/SQL. What is the best way to do it?

Regards
Amer
 

I suggest that you load this tab-delimited file into a staging table in your DB, then from that table, perform your required processing.

You can load the file using the sqlldr (SQL*Loader) utility. Create a control file that loads the tab-delimited file (X'09' below is the hex code for the tab character). For example:

LOAD DATA
INFILE "/source/file/tab_delim.dat"
INTO TABLE staging_table
FIELDS TERMINATED BY X'09' OPTIONALLY ENCLOSED BY '"'
(
COL1 CHAR NULLIF COL1=BLANKS,
COL2 CHAR NULLIF COL2=BLANKS,
COL3 INTEGER EXTERNAL NULLIF COL3=BLANKS,
COL4 INTEGER EXTERNAL NULLIF COL4=BLANKS
)
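
You would then run the load from the command line with something like this (the username/password and control file name are just placeholders):

sqlldr userid=scott/tiger control=staging_load.ctl log=staging_load.log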
 
Yes, this is one option. Is there any option that can be used to read data sequentially from the text file, do some processing on it, and then write it as a row to a table?

Amer
 

You can also use the UTL_FILE package to read your file directly, but it won't parse the delimiters for you: it fetches the entire line, so you'll have to do a fair amount of programming to pull your columns out of the text file.

 
Yes, with UTL_FILE you'll pull an entire row of text in with each GET_LINE. You then have to break the row up (typically placing the contents of each field into a variable), do your processing, and insert the pieces as a row in your table. If you don't have too many columns to deal with, it's not too bad. If you have many columns, then I would go with the staging table/SQL*Loader approach.
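
For illustration, here is a rough UTL_FILE sketch of that approach (the directory, file name, and table/column names are assumptions for the example; the directory must be listed in the UTL_FILE_DIR init.ora parameter):

DECLARE
  in_file  UTL_FILE.FILE_TYPE;
  line_buf VARCHAR2(4000);
  tab_pos  PLS_INTEGER;
  col1_val VARCHAR2(100);
  rest_val VARCHAR2(4000);
BEGIN
  in_file := UTL_FILE.FOPEN('/source/file', 'tab_delim.dat', 'R');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(in_file, line_buf);  -- pulls in one whole line
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;          -- end of file
    END;
    -- Break the line at the first tab; repeat for each additional column
    tab_pos  := INSTR(line_buf, CHR(9));
    col1_val := SUBSTR(line_buf, 1, tab_pos - 1);
    rest_val := SUBSTR(line_buf, tab_pos + 1);
    -- ... your processing goes here ...
    INSERT INTO target_table (col1, col2) VALUES (col1_val, rest_val);
  END LOOP;
  UTL_FILE.FCLOSE(in_file);
  COMMIT;
END;
/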
 
With that many records, I think you'd be better off creating the staging table and using SQL*Loader in direct path mode to get the data into the database. After that you can use SQL and PL/SQL to scrub the data and load it into its final resting place.
 

You may want to use the direct path option of SQL*Loader on this one because of the size of your source file.
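
Assuming the same placeholder names as the sqlldr example above, switching to direct path is just a matter of adding DIRECT=TRUE, which writes data blocks directly and bypasses most of the SQL engine overhead:

sqlldr userid=scott/tiger control=staging_load.ctl direct=true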

 
Do you have access to a C compiler (or COBOL or some other language with Pro* support)? If so, I'd say that unless this is a one-time project, you might want to use a 3GL with embedded SQL. This will almost certainly perform better than UTL_FILE.

I'd use SQL*Loader if you don't have a lot of processing to do after loading the records. But if you do have a lot of processing to do in PL/SQL, performance for this part of the process may be poor. You can improve this with the new PL/SQL BULK operations capability in 8.1.6 and later, but I don't think you can do better than Pro*C or Pro*COBOL.
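
To give an idea of those bulk operations, here is a rough sketch that processes a staging table in batches (the staging/final table and column names are hypothetical):

DECLARE
  TYPE key_tab  IS TABLE OF staging_table.col3%TYPE INDEX BY BINARY_INTEGER;
  TYPE name_tab IS TABLE OF staging_table.col1%TYPE INDEX BY BINARY_INTEGER;
  keys  key_tab;
  names name_tab;
  CURSOR c IS SELECT col3, col1 FROM staging_table;
BEGIN
  OPEN c;
  LOOP
    -- Pull rows in batches of 1000 instead of one at a time
    FETCH c BULK COLLECT INTO keys, names LIMIT 1000;
    EXIT WHEN keys.COUNT = 0;
    -- ... per-batch scrubbing/lookups would go here ...
    -- Insert the whole batch with a single context switch
    FORALL i IN 1 .. keys.COUNT
      INSERT INTO final_table (id_col, name_col) VALUES (keys(i), names(i));
    COMMIT;
  END LOOP;
  CLOSE c;
END;
/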
 
I can get access to Pro*C. Can you give me code to read from a text file using Pro*C? Let's assume there are 1000 rows in a text file with 5 tab-delimited columns. For column number 2, I have to write a SQL statement that reads some records from other tables using the value of column 2; then the remaining 4 columns of the text file and 3 columns of extracted values would be inserted into another table. Your help would be greatly appreciated.

Regards
Amer
 